USAF Hearing Conservation Program, DOEHRS-HC Data Repository Annual Report: CY15
2017-05-31
AFRL-SA-WP-SR-2017-0014. Williams, Daniel A. Annual report of USAF Hearing Conservation Program data drawn from the Defense Occupational and Environmental Health Readiness System-Hearing Conservation Data Repository (DOEHRS-HC DR). Major command- and installation-level reports are available quarterly.
Repository-Based Software Engineering Program: Working Program Management Plan
NASA Technical Reports Server (NTRS)
1993-01-01
Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL) in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.
The Nevada Initiative: A Risk Communication Fiasco
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flynn, J.; Slovic, P.; Mertz, C.K.
The U.S. Congress has designated Yucca Mountain, Nevada, as the only potential site to be studied for the nation's first high-level nuclear waste repository. People in Nevada strongly oppose the program, managed by the U.S. Department of Energy. Survey research shows that the public believes there are great risks from a repository program, in contrast to a majority of scientists who feel the risks are acceptably small. Delays in the repository program resulting in part from public opposition in Nevada have concerned the nuclear power industry, which collects the fees for the federal repository program and believes it needs the repository as a final disposal facility for its high-level nuclear wastes. To assist the repository program, the American Nuclear Energy Council (ANEC), an industry group, sponsored a massive advertising campaign in Nevada. The campaign attempted to assure people that the risks of a repository were small and that the repository studies should proceed. The campaign failed because its managers misunderstood the issues underlying the controversy, attempted a covert manipulation of public opinion that was revealed, and most importantly, lacked the public trust that was necessary to communicate credibly about the risks of a nuclear waste facility. This article describes the advertising campaign and its effects. The manner in which the ANEC campaign itself became a controversial public issue is reviewed. The advertising campaign is discussed as it relates to risk assessment and communication. 29 refs., 2 tabs.
Concept document of the repository-based software engineering program: A constructive appraisal
NASA Technical Reports Server (NTRS)
1992-01-01
A constructive appraisal of the Concept Document of the Repository-Based Software Engineering Program is provided. The Concept Document is designed to provide an overview of the Repository-Based Software Engineering (RBSE) Program. The Document should be brief and provide the context for reading subsequent requirements and product specifications. That is, all requirements to be developed should be traceable to the Concept Document. Applied Expertise's analysis of the Document was directed toward assuring that: (1) the Executive Summary provides a clear, concise, and comprehensive overview of the Concept (rewrite as necessary); (2) the sections of the Document make best use of the NASA 'Data Item Description' for concept documents; (3) the information contained in the Document provides a foundation for subsequent requirements; and (4) the document adequately: identifies the problem being addressed; articulates RBSE's specific role; specifies the unique aspects of the program; and identifies the nature and extent of the program's users.
NASA Technical Reports Server (NTRS)
Hanley, Lionel
1989-01-01
The Ada Software Repository is a public-domain collection of Ada software and information. The Ada Software Repository is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and it has been available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories. These directories are organized by topic; their names and brief overviews of their topics are given here. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.
US/German Collaboration in Salt Repository Research, Design and Operation - 13243
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steininger, Walter; Hansen, Frank; Biurrun, Enrique
2013-07-01
Recent developments in the US and Germany [1-3] have precipitated renewed efforts in salt repository investigations and related studies. Both the German rock salt repository activities and the US waste management programs currently face challenges that may adversely affect their respective current and future state-of-the-art core capabilities in rock salt repository science and technology. The research agenda being pursued by our respective countries leverages collective efforts for the benefit of both programs. The topics addressed by the US/German salt repository collaborations align well with the findings and recommendations summarized in the January 2012 US Blue Ribbon Commission on America's Nuclear Future (BRC) report [4] and are consistent with the aspirations of the key topics of the Strategic Research Agenda of the Implementing Geological Disposal of Radioactive Waste Technology Platform (IGD-TP) [5]. Against this background, a revival of joint efforts in salt repository investigations after some years of hibernation has been undertaken to leverage collective efforts in salt repository research, design, operations, and related issues for the benefit of respective programs and to form a basis for providing an attractive, cost-effective insurance against the premature loss of virtually irreplaceable scientific expertise and institutional memory. (authors)
The Listening and Spoken Language Data Repository: Design and Project Overview
ERIC Educational Resources Information Center
Bradham, Tamala S.; Fonnesbeck, Christopher; Toll, Alice; Hecht, Barbara F.
2018-01-01
Purpose: The purpose of the Listening and Spoken Language Data Repository (LSL-DR) was to address a critical need for a systemwide outcome data-monitoring program for the development of listening and spoken language skills in highly specialized educational programs for children with hearing loss highlighted in Goal 3b of the 2007 Joint Committee…
Repository-Based Software Engineering (RBSE) program
NASA Technical Reports Server (NTRS)
1992-01-01
Support of a software engineering program was provided in the following areas: client/customer liaison; research representation/outreach; and program support management. Additionally, a list of deliverables is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harmon, K.M.; Lakey, L.T.; Leigh, I.W.
Worldwide activities related to nuclear fuel cycle and radioactive waste management programs are summarized. Several trends have developed in waste management strategy: All countries having to dispose of reprocessing wastes plan on conversion of the high-level waste (HLW) stream to a borosilicate glass and eventual emplacement of the glass logs, suitably packaged, in a deep geologic repository. Countries that must deal with plutonium-contaminated waste emphasize plutonium recovery, volume reduction, and fixation in cement or bitumen in their treatment plans and expect to use deep geologic repositories for final disposal. Commercially available, classical engineering processes are being used worldwide to treat and immobilize low- and intermediate-level wastes (LLW, ILW); disposal in surface structures, shallow-land burial, and deep-underground repositories, such as played-out mines, is being done widely with no obvious technical problems. Many countries have established extensive programs to prepare for construction and operation of geologic repositories. Geologic media being studied fall into three main classes: argillites (clay or shale); crystalline rock (granite, basalt, gneiss, or gabbro); and evaporites (salt formations). Most nations plan to allow 30 years or longer between discharge of fuel from the reactor and emplacement of HLW or spent fuel in a repository to permit thermal and radioactive decay. Most repository designs are based on the mined-gallery concept, placing waste or spent fuel packages into shallow holes in the floor of the gallery. Many countries have established extensive and costly programs of site evaluation, repository development, and safety assessment. Two other waste management problems are the subject of major R and D programs in several countries: stabilization of uranium mill tailings piles; and immobilization or disposal of contaminated nuclear facilities, namely reactors, fuel cycle plants, and R and D laboratories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valvoda, Z.; Holub, J.; Kucerka, M.
1996-12-31
The Program of Development of the Spent Fuel and High Level Waste Repository in the Conditions of the Czech Republic began in 1993. During the first phase, the basic concept and structure of the Program were developed, and the basic design criteria and requirements were prepared. In the conditions of the Czech Republic, only an underground repository in a deep geological formation is acceptable; the expected depth is between 500 and 1,000 meters, and the host rock will be granite. A preliminary variant design study, realized in 1994, analyzed the radioactive waste and spent fuel flow from NPPs to the repository and various possibilities of transportation according to the various concepts of spent fuel conditioning and transportation to the underground structures. Conditioning and encapsulation of spent fuel and/or radioactive waste is proposed on the repository site. Underground disposal structures are proposed on one underground floor. The repository will have reserve capacity for radioactive waste from NPP decommissioning and for waste not acceptable at other repositories. Vertical disposal of unshielded canisters in boreholes and/or horizontal disposal of shielded canisters is being studied. The year 2035 has been established as the base term for the start-up of repository operation, and a preliminary time schedule of the Project has been developed from this date. A method of calculating levelized and discounted costs over the repository lifetime, for each of five selected variants, was used for economic calculations. Preliminary expected parametric costs of the repository are about 0.1 Kc ($0.004) per MWh produced in the Czech NPPs. In 1995, the design and feasibility study went into more detail on the technical concept of repository construction and proposed technologies, as well as on the operational phase of the repository. The paper describes the results of the 1995 design work and presents the program of repository development for the next period.
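The economic figure above comes from a standard levelized (discounted) cost calculation: discount each year's repository cost and each year's electricity production to a common base year, then take the ratio. A minimal sketch in Python, with invented cash flows and discount rate standing in for the study's actual inputs:

```python
# Levelized (discounted) cost per MWh, the kind of calculation used in
# the study's economic comparison of repository variants. All inputs
# are invented placeholders, not the Czech study's actual figures.

def levelized_cost_per_mwh(costs, energy_mwh, rate):
    """Discount yearly costs and yearly MWh to year 0 and divide."""
    pv_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    pv_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energy_mwh))
    return pv_cost / pv_energy

# 40 illustrative years: flat repository cost, flat NPP output
costs = [50e6] * 40    # Kc per year (assumed)
energy = [25e6] * 40   # MWh per year (assumed)
print(levelized_cost_per_mwh(costs, energy, rate=0.05))  # Kc/MWh
```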
A proposed application programming interface for a physical volume repository
NASA Technical Reports Server (NTRS)
Jones, Merritt; Williams, Joel; Wrenn, Richard
1996-01-01
The IEEE Storage System Standards Working Group (SSSWG) has developed the Reference Model for Open Storage Systems Interconnection, Mass Storage System Reference Model Version 5. This document provides the framework for a series of standards for application and user interfaces to open storage systems. More recently, the SSSWG has been developing Application Programming Interfaces (APIs) for the individual components defined by the model. The API for the Physical Volume Repository is the most fully developed, but work is also being done on APIs for the Physical Volume Library and for the Mover. The SSSWG meets every other month, and meetings are open to all interested parties. The Physical Volume Repository (PVR) is responsible for managing the storage of removable media cartridges and for mounting and dismounting these cartridges onto drives. This document describes a model which defines a Physical Volume Repository and gives a brief summary of the Application Programming Interface (API) that the IEEE Storage Systems Standards Working Group (SSSWG) is proposing as the standard interface for the PVR.
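As a rough illustration of the PVR's responsibilities (cartridge storage plus mounting and dismounting onto drives), the sketch below models a PVR-style interface in Python. The class and method names are illustrative assumptions; the SSSWG's actual proposed API is not reproduced here.

```python
# Illustrative PVR-style interface; names are hypothetical, not the
# IEEE SSSWG's actual proposed bindings.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Cartridge:
    volume_id: str   # identifier of the removable medium
    slot: int        # storage slot currently holding the cartridge

class PhysicalVolumeRepository(ABC):
    """Manages cartridge storage and mount/dismount onto drives."""

    @abstractmethod
    def inject(self, cartridge: Cartridge) -> None:
        """Accept a new cartridge into the repository."""

    @abstractmethod
    def eject(self, volume_id: str) -> Cartridge:
        """Remove a cartridge from the repository."""

    @abstractmethod
    def mount(self, volume_id: str, drive_id: str) -> None:
        """Move a cartridge from its storage slot onto the named drive."""

    @abstractmethod
    def dismount(self, drive_id: str) -> None:
        """Return the cartridge on the drive to a storage slot."""
```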
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacKinnon, Robert J.
2015-10-26
Under the auspices of the International Atomic Energy Agency (IAEA), nationally developed underground research laboratories (URLs) and associated research institutions are being offered for use by other nations. These facilities form an Underground Research Facilities (URF) Network for training in and demonstration of waste disposal technologies and the sharing of knowledge and experience related to geologic repository development, research, and engineering. In order to achieve its objectives, the URF Network regularly sponsors workshops and training events related to the knowledge base that is transferable between existing URL programs and to nations with an interest in developing a new URL. This report describes the role of URLs in the context of a general timeline for repository development. This description includes identification of key phases and activities that contribute to repository development as a repository program evolves from an early research and development phase to later phases such as construction, operations, and closure. This information is cast in the form of a matrix with the entries in this matrix forming the basis of the URF Network roadmap that will be used to identify and plan future workshops and training events.
Repository-based software engineering program
NASA Technical Reports Server (NTRS)
Wilson, James
1992-01-01
The activities performed during September 1992 in support of Tasks 01 and 02 of the Repository-Based Software Engineering Program are outlined. The recommendations and implementation strategy defined at the September 9-10 meeting of the Reuse Acquisition Action Team (RAAT) are attached along with the viewgraphs and reference information presented at the Institute for Defense Analyses brief on legal and patent issues related to software reuse.
Developing the Tools for Geologic Repository Monitoring - Andra's Monitoring R and D Program - 12045
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buschaert, S.; Lesoille, S.; Bertrand, J.
2012-07-01
The French Safety Guide recommends that Andra develop a monitoring program to be implemented during repository construction and conducted until (and possibly after) closure, in order to confirm expected behavior and enhance knowledge of relevant processes. To achieve this, Andra has developed an overall monitoring strategy and identified specific technical objectives to inform disposal process management on evolutions relevant to both the long-term safety and reversible, pre-closure management of the repository. Andra has launched an ambitious R and D program to ensure that reliable, durable, metrologically qualified and tested monitoring systems will be available at the time of repository construction in order to respond to monitoring objectives. After four years of a specific R and D program, first observations are described and recommendations are proposed. The results derived from 4 years of Andra's R and D program allow three main observations to be shared. First, while other industries also invest in monitoring equipment, their obvious emphasis will always be on their specific requirements and needs, thus often only providing a partial match with repository requirements. Examples can be found for all available sensors, which are generally not resistant to radiation. Second, the very close scrutiny anticipated for the geologic disposal process is likely to place an unprecedented emphasis on the quality of monitoring results. It therefore seems important to emphasize specific developments with an aim at providing metrologically qualified systems. Third, adapting existing technology to specific repository needs, and providing adequate proof of their worth, is a lengthy process. In conclusion, it therefore seems prudent to plan ahead and to invest wisely in the adequate development of those monitoring tools that will likely be needed in the repository to respond to the implementers' and regulators' requirements, including those agreed and developed to respond to potential stakeholder expectations. (authors)
76 FR 81950 - Privacy Act; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... ``Consolidated Data Repository'' (09-90-1000). This system of records is being amended to include records... Repository'' (SORN 09-90-1000). OIG is adding record sources to the system. This system fulfills our..., and investigations of the Medicare and Medicaid programs. SYSTEM NAME: Consolidated Data Repository...
Microsoft Repository Version 2 and the Open Information Model.
ERIC Educational Resources Information Center
Bernstein, Philip A.; Bergstraesser, Thomas; Carlson, Jason; Pal, Shankar; Sanders, Paul; Shutt, David
1999-01-01
Describes the programming interface and implementation of the repository engine and the Open Information Model for Microsoft Repository, an object-oriented meta-data management facility that ships in Microsoft Visual Studio and Microsoft SQL Server. Discusses Microsoft's component object model, object manipulation, queries, and information…
10 CFR 63.144 - Quality assurance program change.
Code of Federal Regulations, 2013 CFR
2013-01-01
... assurance program information that duplicates language in quality assurance regulatory guides and quality... 10 Energy 2 2013-01-01 2013-01-01 false Quality assurance program change. 63.144 Section 63.144... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Quality Assurance § 63.144 Quality assurance program change. Changes to...
10 CFR 63.144 - Quality assurance program change.
Code of Federal Regulations, 2014 CFR
2014-01-01
... assurance program information that duplicates language in quality assurance regulatory guides and quality... 10 Energy 2 2014-01-01 2014-01-01 false Quality assurance program change. 63.144 Section 63.144... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Quality Assurance § 63.144 Quality assurance program change. Changes to...
10 CFR 63.144 - Quality assurance program change.
Code of Federal Regulations, 2012 CFR
2012-01-01
... assurance program information that duplicates language in quality assurance regulatory guides and quality... 10 Energy 2 2012-01-01 2012-01-01 false Quality assurance program change. 63.144 Section 63.144... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Quality Assurance § 63.144 Quality assurance program change. Changes to...
Evolution of a Digital Repository: One Institution's Experience
ERIC Educational Resources Information Center
Owen, Terry M.
2011-01-01
In this article, the development of a digital repository is examined, specifically how the focus on acquiring content for the repository has transitioned from faculty-published research to include the gray literature produced by the research centers on campus, including unpublished technical reports and undergraduate research from honors programs.…
Repository-based software engineering program: Concept document
NASA Technical Reports Server (NTRS)
1992-01-01
This document provides the context for Repository-Based Software Engineering's (RBSE's) evolving functional and operational product requirements, and it is the parent document for development of detailed technical and management plans. When furnished, requirements documents will serve as the governing RBSE product specification. The RBSE Program Management Plan will define resources, schedules, and technical and organizational approaches to fulfilling the goals and objectives of this concept. The purpose of this document is to provide a concise overview of RBSE, describe the rationale for the RBSE Program, and define a clear, common vision for RBSE team members and customers. The document also provides the foundation for developing RBSE user and system requirements and a corresponding Program Management Plan. The concept is used to express the program mission to RBSE users and managers and to provide an exhibit for community review.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-27
... repository of datasets from completed studies, biospecimens, and ancillary data. The Division intends to make... Sharing Policy. The Division has established an internal committee, the Biospecimen Repository Access and Data Sharing Committee (BRADSC), to oversee the repository access and data sharing program. The purpose...
75 FR 73095 - Privacy Act of 1974; Report of New System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-29
... Repository'' System No. 09-70-0587. The final rule for the Medicare and Medicaid EHR Incentive Program... primary purpose of this system, called the National Level Repository or NLR, is to collect, maintain, and... Maintenance of Data in the System The National Level Repository (NLR) contains information on eligible...
Yucca Mountain Biological Resources Monitoring Program; Annual report, FY91
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1992-01-01
The US Department of Energy (DOE) is required by the Nuclear Waste Policy Act of 1982 (as amended in 1987) to study and characterize Yucca Mountain as a possible site for a geologic repository for high-level nuclear waste. During site characterization, the DOE will conduct a variety of geotechnical, geochemical, geological, and hydrological studies to determine the suitability of Yucca Mountain as a repository. To ensure that site characterization activities (SCA) do not adversely affect the Yucca Mountain area, an environmental program has been implemented to monitor and mitigate potential impacts and to ensure that activities comply with applicable environmental regulations. This report describes the activities and accomplishments during fiscal year 1991 (FY91) for six program areas within the Terrestrial Ecosystem component of the YMP environmental program. The six program areas are Site Characterization Activities Effects, Desert Tortoises, Habitat Reclamation, Monitoring and Mitigation, Radiological Monitoring, and Biological Support.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vugrin, K.W.; Twitchell, Ch.A.
2008-07-01
Korea Hydro and Nuclear Power Co., Ltd. (KHNP) is an electric company in the Republic of Korea with twenty operational nuclear power plants and eight additional units that are either planned or currently under construction. Regulations require that KHNP manage the radioactive waste generated by their nuclear power plants. In the course of planning low-, intermediate-, and high-level waste storage facilities, KHNP sought interaction with an acknowledged expert in the field of radioactive waste management and, consequently, contacted Sandia National Laboratories (SNL). KHNP has contracted with SNL to provide a year-long training program on repository science. This paper discusses the design of the curriculum, specific plans for execution of the training program, and recommendations for smooth implementation of international training programs. (authors)
Collaborative Learning Utilizing a Domain-Based Shared Data Repository to Enhance Learning Outcomes
ERIC Educational Resources Information Center
Lubliner, David; Widmeyer, George; Deek, Fadi P.
2009-01-01
The objective of this study was to determine whether there was a quantifiable improvement in learning outcomes by integrating course materials in a 4-year baccalaureate program, utilizing a knowledge repository with a conceptual map that spans a discipline. Two new models were developed to provide the framework for this knowledge repository. A…
10 CFR 60.161 - Training and certification program.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Training and certification program. 60.161 Section 60.161 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Training and Certification of Personnel § 60.161 Training and certification program. DOE shall...
Credentialing Data Scientists: A Domain Repository Perspective
NASA Astrophysics Data System (ADS)
Lehnert, K. A.; Furukawa, H.
2015-12-01
A career in data science can have many paths: data curation, data analysis, metadata modeling - all of these in different commercial or scientific applications. Can a certification as 'data scientist' provide the guarantee that an applicant or candidate for a data science position has just the right skills? How valuable is a 'generic' certification as data scientist for an employer looking to fill a data science position? Credentials that are more specific and discipline-oriented may be more valuable to both the employer and the job candidate. One employment sector for data scientists is the data repositories that provide discipline-specific data services for science communities. Data science positions within domain repositories include a wide range of responsibilities in support of the full data life cycle - from data preservation and curation, to development of data models, ontologies, and user interfaces, to development of data analysis and visualization tools, to community education and outreach - and require a substantial degree of discipline-specific knowledge of scientific data acquisition and analysis workflows, data quality measures, and data cultures. Can there be certification programs for domain-specific data scientists that help build the urgently needed workforce for the repositories? The American Geophysical Union has recently started an initiative to develop a program for data science continuing education and data science professional certification for the Earth and space sciences. An Editorial Board has been charged to identify and develop curricula and content for these programs and to provide input and feedback in the implementation of the program. This presentation will report on the progress of this initiative and evaluate its utility for the needs of domain repositories in the Earth and space sciences.
Development of the performance confirmation program at Yucca Mountain, Nevada
LeCain, G.D.; Barr, D.; Weaver, D.; Snell, R.; Goodin, S.W.; Hansen, F.D.
2006-01-01
The Yucca Mountain Performance Confirmation program consists of tests, monitoring activities, experiments, and analyses to evaluate the adequacy of assumptions, data, and analyses that form the basis of the conceptual and numerical models of flow and transport associated with a proposed radioactive waste repository at Yucca Mountain, Nevada. The Performance Confirmation program uses an eight-stage risk-informed, performance-based approach. Selection of the Performance Confirmation activities for inclusion in the Performance Confirmation program was done using a risk-informed, performance-based decision analysis. The result of this analysis was a Performance Confirmation base portfolio that consists of 20 activities. The 20 Performance Confirmation activities include geologic, hydrologic, and construction/engineering testing. Some of the activities began during site characterization, and others will begin during construction or post-emplacement and continue until repository closure.
Cancer Epidemiology Data Repository (CEDR)
In an effort to broaden access and facilitate efficient data sharing, the Epidemiology and Genomics Research Program (EGRP) has created the Cancer Epidemiology Data Repository (CEDR), a centralized, controlled-access database, where Investigators can deposit individual-level de-identified observational cancer datasets.
Sarzotti-Kelsoe, Marcella; Needham, Leila K.; Rountree, Wes; Bainbridge, John; Gray, Clive M.; Fiscus, Susan A.; Ferrari, Guido; Stevens, Wendy S.; Stager, Susan L.; Binz, Whitney; Louzao, Raul; Long, Kristy O.; Mokgotho, Pauline; Moodley, Niranjini; Mackay, Melanie; Kerkau, Melissa; McMillion, Takesha; Kirchherr, Jennifer; Soderberg, Kelly A.; Haynes, Barton F.; Denny, Thomas N.
2014-01-01
The Center for HIV/AIDS Vaccine Immunology (CHAVI) consortium was established to determine the host and virus factors associated with HIV transmission, infection and containment of virus replication, with the goal of advancing the development of an HIV protective vaccine. Studies to meet this goal required the use of cryopreserved Peripheral Blood Mononuclear Cell (PBMC) specimens, and therefore it was imperative that a quality assurance (QA) oversight program be developed to monitor PBMC samples obtained from study participants at multiple international sites. Nine site-affiliated laboratories in Africa and the USA collected and processed PBMCs, and cryopreserved PBMC were shipped to CHAVI repositories in Africa and the USA for long-term storage. A three-stage program was designed, based on Good Clinical Laboratory Practices (GCLP), to monitor PBMC integrity at each step of this process. The first stage evaluated the integrity of fresh PBMCs for initial viability, overall yield, and processing time at the site-affiliated laboratories (Stage 1); for the second stage, the repositories determined post-thaw viability and cell recovery of cryopreserved PBMC, received from the site-affiliated laboratories (Stage 2); the third stage assessed the long-term specimen storage at each repository (Stage 3). Overall, the CHAVI PBMC QA oversight program results highlight the relative importance of each of these stages to the ultimate goal of preserving specimen integrity from peripheral blood collection to long-term repository storage. PMID:24910414
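The three-stage structure described above maps naturally onto simple per-specimen acceptance checks. The sketch below illustrates that idea in Python; all thresholds and field names are hypothetical examples, not CHAVI's actual QA criteria.

```python
# Illustrative per-specimen QA flags loosely following the CHAVI
# staging described above. Thresholds are hypothetical examples only.
from dataclasses import dataclass

@dataclass
class PbmcRecord:
    fresh_viability_pct: float      # Stage 1: at site-affiliated lab
    processing_time_hr: float       # Stage 1: draw to cryopreservation
    post_thaw_viability_pct: float  # Stage 2: at repository
    post_thaw_recovery_pct: float   # Stage 2: cells recovered vs. frozen

def qa_flags(rec: PbmcRecord) -> list[str]:
    """Return QA flags for one specimen; empty list means no issues."""
    flags = []
    if rec.fresh_viability_pct < 90.0:        # hypothetical cutoff
        flags.append("stage1: low fresh viability")
    if rec.processing_time_hr > 8.0:          # hypothetical cutoff
        flags.append("stage1: processing time exceeded")
    if rec.post_thaw_viability_pct < 80.0:    # hypothetical cutoff
        flags.append("stage2: low post-thaw viability")
    if rec.post_thaw_recovery_pct < 50.0:     # hypothetical cutoff
        flags.append("stage2: low cell recovery")
    return flags

print(qa_flags(PbmcRecord(95.0, 6.5, 75.0, 60.0)))
```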
DOT National Transportation Integrated Search
2015-12-29
The LTPP program was initiated in 1987 to satisfy a wide range of pavement information needs. Over the years, the program has accumulated a vast repository of research quality data, extensive documentation, and related tools, which compose LTPPs c...
Working paper : the ITS cost data repository at Mitretek Systems
DOT National Transportation Integrated Search
1998-11-30
Mitretek Systems has been tasked by the Intelligent Transportation Systems (ITS) Joint Program Office (JPO) to collect available information on ITS costs and maintain the information in a cost database, which serves as the ITS Cost Data Repository. T...
The repository-based software engineering program: Redefining AdaNET as a mainstream NASA source
NASA Technical Reports Server (NTRS)
1993-01-01
The Repository-based Software Engineering Program (RBSE) is described to inform and update senior NASA managers about the program. Background and historical perspective on software reuse and RBSE for NASA managers who may not be familiar with these topics are provided. The paper draws upon and updates information from the RBSE Concept Document, baselined by NASA Headquarters, Johnson Space Center, and the University of Houston - Clear Lake in April 1992. Several of NASA's software problems and what RBSE is now doing to address those problems are described. Also, next steps to be taken to derive greater benefit from this Congressionally-mandated program are provided. The section on next steps describes the need to work closely with other NASA software quality, technology transfer, and reuse activities and focuses on goals and objectives relative to this need. RBSE's role within NASA is addressed; however, there is also the potential for systematic transfer of technology outside of NASA in later stages of the RBSE program. This technology transfer is discussed briefly.
Yucca Mountain biological resources monitoring program; Annual report FY92
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-02-01
The US Department of Energy (DOE) is required by the Nuclear Waste Policy Act of 1982 (as amended in 1987) to study and characterize Yucca Mountain as a potential site for a geologic repository for high-level nuclear waste. During site characterization, the DOE will conduct a variety of geotechnical, geochemical, geological, and hydrological studies to determine the suitability of Yucca Mountain as a potential repository. To ensure that site characterization activities (SCA) do not adversely affect the environment at Yucca Mountain, an environmental program has been implemented to monitor and mitigate potential impacts and ensure activities comply with applicable environmental regulations. This report describes the activities and accomplishments of EG&G Energy Measurements, Inc. (EG&G/EM) during fiscal year 1992 (FY92) for six program areas within the Terrestrial Ecosystem component of the YMP environmental program. The six program areas are Site Characterization Effects, Desert Tortoises, Habitat Reclamation, Monitoring and Mitigation, Radiological Monitoring, and Biological Support.
Partnerships against Violence: Promising Programs. Volume 1: Resource Guide.
ERIC Educational Resources Information Center
Department of Housing and Urban Development, Washington, DC.
This volume represents the first step in an effort to build a central repository of promising anti-violence programs. Part of a cooperative venture in the federal government, this resource guide draws on information stored in more than 30 Federal clearinghouses and resource centers. Included here are programs developed by government agencies,…
Organizing Diverse, Distributed Project Information
NASA Technical Reports Server (NTRS)
Keller, Richard M.
2003-01-01
SemanticOrganizer is a software application designed to organize and integrate information generated within a distributed organization or as part of a project that involves multiple, geographically dispersed collaborators. SemanticOrganizer incorporates the capabilities of database storage, document sharing, hypermedia navigation, and semantic interlinking into a system that can be customized to satisfy the specific information-management needs of different user communities. The program provides a centralized repository of information that is both secure and accessible to project collaborators via the World Wide Web. SemanticOrganizer's repository can be used to collect diverse information (including forms, documents, notes, data, spreadsheets, images, and sounds) from computers at collaborators' work sites. The program organizes the information using a unique network-structured conceptual framework, wherein each node represents a data record that contains not only the original information but also metadata (in effect, standardized data that characterize the information). Links among nodes express semantic relationships among the data records. The program features a Web interface through which users enter, interlink, and/or search for information in the repository. By use of this repository, the collaborators have immediate access to the most recent project information, as well as to archived information. A key advantage of SemanticOrganizer is its ability to interlink information together in a natural fashion using customized terminology and concepts that are familiar to a user community.
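A minimal sketch of the network-structured framework described above (nodes holding a record plus metadata, with typed links expressing semantic relationships) might look as follows in Python. Class, field, and relation names are illustrative assumptions, not the actual SemanticOrganizer implementation.

```python
# Toy node/link repository in the spirit of SemanticOrganizer; all
# names here are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: str
    payload: bytes                                   # original document, image, etc.
    metadata: dict[str, str] = field(default_factory=dict)

@dataclass
class Link:
    source: str     # node_id of the origin record
    target: str     # node_id of the related record
    relation: str   # e.g. "produced-by", "describes", "derived-from"

class Repository:
    def __init__(self) -> None:
        self.nodes: dict[str, Node] = {}
        self.links: list[Link] = []

    def add(self, node: Node) -> None:
        self.nodes[node.node_id] = node

    def relate(self, source: str, target: str, relation: str) -> None:
        self.links.append(Link(source, target, relation))

    def neighbors(self, node_id: str, relation: str) -> list[Node]:
        """Follow outgoing links of a given semantic type."""
        return [self.nodes[lk.target] for lk in self.links
                if lk.source == node_id and lk.relation == relation]
```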
Scientific information repository assisting reflectance spectrometry in legal medicine.
Belenki, Liudmila; Sterzik, Vera; Bohnert, Michael; Zimmermann, Klaus; Liehr, Andreas W
2012-06-01
Reflectance spectrometry is a fast and reliable method for the characterization of human skin if the spectra are analyzed with respect to a physical model describing the optical properties of human skin. For a field study performed at the Institute of Legal Medicine and the Freiburg Materials Research Center of the University of Freiburg, a scientific information repository has been developed, which is a variant of an electronic laboratory notebook and assists in the acquisition, management, and high-throughput analysis of reflectance spectra in heterogeneous research environments. At the core of the repository is a database management system hosting the master data. It is filled with primary data via a graphical user interface (GUI) programmed in Java, which also enables the user to browse the database and access the results of data analysis. The latter is carried out via Matlab, Python, and C programs, which retrieve the primary data from the scientific information repository, perform the analysis, and store the results in the database for further usage.
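The retrieve-analyze-store loop described above can be illustrated in a few lines of Python, with SQLite standing in for the repository's database management system. Table names, columns, and the placeholder "analysis" are assumptions, not the actual Freiburg system.

```python
# Hedged sketch of the retrieve-analyze-store workflow; schema and
# analysis are invented stand-ins for the real repository pipeline.
import sqlite3

def analyze_spectrum(reflectance: list[float]) -> float:
    # Placeholder analysis: mean reflectance. The real pipeline fits a
    # physical model of skin optics to each spectrum.
    return sum(reflectance) / len(reflectance)

con = sqlite3.connect(":memory:")   # stand-in for the master database
con.execute("CREATE TABLE spectra (id INTEGER PRIMARY KEY, data TEXT)")
con.execute("INSERT INTO spectra VALUES (1, '0.42,0.44,0.47')")  # sample row
con.execute("CREATE TABLE results (spectrum_id INTEGER, mean_reflectance REAL)")

# Retrieve primary data, analyze, and store results back for reuse.
for spectrum_id, csv in con.execute("SELECT id, data FROM spectra"):
    values = [float(v) for v in csv.split(",")]
    con.execute("INSERT INTO results VALUES (?, ?)",
                (spectrum_id, analyze_spectrum(values)))
con.commit()
```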
LTPP InfoPave Release 2017: What's New
DOT National Transportation Integrated Search
2017-01-01
The LTPP program was initiated in 1987 to satisfy a wide range of pavement information needs. Over the years, the program has accumulated a vast repository of research quality data, extensive documentation, and related tools, which compose LTPPs c...
10 CFR 60.51 - License amendment for permanent closure.
Code of Federal Regulations, 2014 CFR
2014-01-01
... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...
10 CFR 60.51 - License amendment for permanent closure.
Code of Federal Regulations, 2013 CFR
2013-01-01
... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...
10 CFR 60.51 - License amendment for permanent closure.
Code of Federal Regulations, 2010 CFR
2010-01-01
... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...
10 CFR 60.51 - License amendment for permanent closure.
Code of Federal Regulations, 2011 CFR
2011-01-01
... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...
10 CFR 60.51 - License amendment for permanent closure.
Code of Federal Regulations, 2012 CFR
2012-01-01
... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...
NASA Technical Reports Server (NTRS)
Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.
2002-01-01
Interoperability and data-exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public. More information is to be made available through web-enabled technologies. Programs such as NASA's Scientific and Technical Information (STI) Program Office are tasked to find more effective ways to disseminate information to the public. The NASA STI Program is an agency-wide program charged with gathering, organizing, storing, and disseminating NASA-produced information for research and public use. The program is investigating the use of a new protocol called the Open Archives Initiative (OAI) as a means to improve data interoperability and data collection. OAI promotes the use of the OAI harvesting protocol as a simple way for data sharing among repositories. In two separate initiatives, the STI Program is implementing OAI. In collaboration with the Air Force, Department of Energy, and Old Dominion University, the NASA STI Program has funded research on implementing the OAI to exchange data between the three organizations. The second initiative is the deployment of OAI for the NASA technical report server (TRS) environment. The NASA TRS environment is comprised of distributed technical report servers with a centralized search interface. This paper focuses on the implementation of OAI to promote interoperability among diverse data repositories.
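OAI-PMH itself is simple enough to sketch: a harvester issues a ListRecords request with a metadataPrefix such as oai_dc, then follows resumptionToken values to page through the repository. The verbs and parameters below belong to the OAI-PMH v2.0 protocol; the base URL is a placeholder, not an actual NASA endpoint.

```python
# Minimal OAI-PMH harvester using only the standard library. The
# endpoint URL is a placeholder assumption.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
BASE = "https://example.org/oai"   # placeholder repository endpoint

def harvest(base_url: str):
    """Yield every <record> element, following resumption tokens."""
    params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
    while True:
        url = base_url + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        for record in root.iter(OAI + "record"):
            yield record
        token = root.find(".//" + OAI + "resumptionToken")
        if token is None or not (token.text or "").strip():
            return                      # no more pages
        params = {"verb": "ListRecords", "resumptionToken": token.text}

# Usage: for rec in harvest(BASE): print(rec.findtext(OAI + "header/" + OAI + "identifier"))
```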
Army Hearing Program Talking Points Calendar Year 2015
2016-12-14
Talking points drawn from the Defense Occupational and Environmental Health Readiness System-Hearing Conservation (DOEHRS-HC) Data Repository, CY15 Army profile data (Soldiers who had a DD2215 on record): Soldiers with hearing outside the range of normal sensitivity (greater than 25 dB), and Soldiers with a hearing loss that required a fit-for-duty (readiness) evaluation, i.e., an H-3 hearing profile.
The National Geological and Geophysical Data Preservation Program
NASA Astrophysics Data System (ADS)
Dickinson, T. L.; Steinmetz, J. C.; Gundersen, L. C.; Pierce, B. S.
2006-12-01
The ability to preserve and maintain geoscience data and collections has not kept pace with the growing need for accessible digital information and the technology to make it so. The Nation has lost valuable and unique geologic records and is in danger of losing much more. Many federal and state geological repositories are currently at their capacity for maintaining and storing data or samples. Some repositories are gaining additional, but temporary and substandard space, using transport containers or offsite warehouses where access is limited and storage conditions are poor. Over the past several years, there has been an increasing focus on the state of scientific collections in the United States. For example, the National Geological and Geophysical Data Preservation Program (NGGDPP) Act was passed as part of the Energy Policy Act of 2005, authorizing $30 million in funding for each of five years. The Act directs the U.S. Geological Survey to administer this program that includes a National Digital Catalog and Federal assistance to support our nation's repositories. Implementation of the Program awaits federal appropriations. The NGGDPP is envisioned as a national network of cooperating geoscience materials and data repositories that are operated independently yet guided by unified standards, procedures, and protocols for metadata. The holdings will be widely accessible through a common and mirrored Internet-based catalog (National Digital Catalog). The National Digital Catalog will tie the observations and analyses to the physical materials they come from. Our Nation's geological and geophysical data are invaluable and in some instances irreplaceable due to the destruction of outcrops, urbanization and restricted access. These data will enable the next generation of scientific research and education, enable more effective and efficient research, and may have future economic benefits through the discovery of new oil and gas accumulations, and mineral deposits.
Yucca Mountain Biological Resources Monitoring Program. Progress report, January 1994--December 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-07-01
The US Department of Energy (DOE) is required by the Nuclear Waste Policy Act of 1982 (as amended in 1987) to study and characterize the suitability of Yucca Mountain as a potential geological repository for high-level nuclear waste. During site characterization, the DOE will conduct a variety of geotechnical, geochemical, geological, and hydrological studies to determine the suitability of Yucca Mountain as a potential repository. To ensure that site characterization activities do not adversely affect the environment at Yucca Mountain, a program has been implemented to monitor and mitigate potential impacts and ensure activities comply with applicable environmental regulations. This report describes the activities and accomplishments of EG and G Energy Measurements, Inc. (EG and G/EM) from January 1994 through December 1994 for six program areas within the Terrestrial Ecosystem component of the environmental program for the Yucca Mountain Site Characterization Project (YMP): Site Characterization Effects, Desert Tortoises (Gopherus agassizii), Habitat Reclamation, Monitoring and Mitigation, Radiological Monitoring, and Biological Support.
Simulator sickness research program at NASA-Ames Research Center
NASA Technical Reports Server (NTRS)
Mccauley, Michael E.; Cook, Anthony M.
1987-01-01
The simulator sickness syndrome is receiving increased attention in the simulation community. NASA-Ames Research Center has initiated a program to facilitate the exchange of information on this topic among the tri-services and other interested government organizations. The program objectives are to identify priority research issues, promote efficient research strategies, serve as a repository of information, and disseminate information to simulator users.
A "Simple Query Interface" Adapter for the Discovery and Exchange of Learning Resources
ERIC Educational Resources Information Center
Massart, David
2006-01-01
Developed as part of CEN/ISSS Workshop on Learning Technology efforts to improve interoperability between learning resource repositories, the Simple Query Interface (SQI) is an Application Program Interface (API) for querying heterogeneous repositories of learning resource metadata. In the context of the ProLearn Network of Excellence, SQI is used…
DEVELOPMENT OF THE U.S. EPA HEALTH EFFECTS RESEARCH LABORATORY FROZEN BLOOD CELL REPOSITORY PROGRAM
In previous efforts, we suggested that proper blood cell freezing and storage is necessary in longitudinal studies to reduce between-test error, for specimen sharing between laboratories, and for convenient scheduling of assays. We continue to develop and upgrade programs for o...
2017-07-01
AWARD NUMBER: W81XWH-16-0-DM167033. TITLE: Establishment of Peripheral Nerve Injury Data Repository to Monitor and Support Population Health Decisions. The report describes patient enrollment; collected data will be utilized to (1) describe the outcomes of various peripheral nerve injuries (PNI) and (2) suggest outcomes that support population health decisions.
NASA Astrophysics Data System (ADS)
Dunagan, S. C.; Herrick, C. G.; Lee, M. Y.
2008-12-01
The Waste Isolation Pilot Plant (WIPP) is located at a depth of 655 m in bedded salt in southeastern New Mexico and is operated by the U.S. Department of Energy as a deep underground disposal facility for transuranic (TRU) waste. The WIPP must comply with the EPA's environmental regulations that require a probabilistic risk analysis of releases of radionuclides due to inadvertent human intrusion into the repository at some time during the 10,000-year regulatory period. Sandia National Laboratories conducts performance assessments (PAs) of the WIPP using a system of computer codes representing the evolution of the underground repository and emplaced TRU waste in order to demonstrate compliance. One of the important features modeled in a PA is the disturbed rock zone (DRZ) surrounding the emplacement rooms in the repository. The extent and permeability of the DRZ play a significant role in the potential radionuclide release scenarios. We evaluated the phenomena occurring in the repository that affect the DRZ and their potential effects on the extent and permeability of the DRZ. Furthermore, we examined the DRZ's role in determining the performance of the repository. Pressure in the completely sealed repository will be increased by creep closure of the salt and degradation of TRU waste contents by microbial activity in the repository. An increased pressure in the repository will reduce the extent and permeability of the DRZ. The reduced DRZ extent and permeability will decrease the amount of brine that is available to interact with the waste. Furthermore, the potential for radionuclide release from the repository is dependent on the amount of brine that enters the repository. As a result of these coupled biological-geomechanical-geochemical phenomena, the extent and permeability of the DRZ have a significant impact on the potential radionuclide releases from the repository and, in turn, the repository performance. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy.
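The coupled uncertainties described above are the kind of thing a PA propagates by Monte Carlo sampling. The sketch below is a toy illustration only: it samples an uncertain DRZ permeability and pressure difference and pushes them through a screening-level Darcy-type inflow estimate. All distributions and parameters are invented, and this is not the WIPP PA code system.

```python
# Toy Monte Carlo propagation of DRZ uncertainty, illustrating the
# style of probabilistic analysis described above. Distributions and
# the Darcy-style expression are invented for illustration.
import random
import statistics

random.seed(1)

def brine_inflow(perm_m2: float, dP_pa: float) -> float:
    # Screening-level Darcy-type estimate: Q = k * A * dP / (mu * L)
    mu = 1.8e-3    # brine viscosity, Pa*s (assumed)
    area = 100.0   # effective flow area, m^2 (assumed)
    length = 10.0  # DRZ flow path length, m (assumed)
    return perm_m2 * area * dP_pa / (mu * length)   # m^3/s

samples = []
for _ in range(10_000):
    perm = 10 ** random.uniform(-19, -15)   # log-uniform DRZ permeability, m^2
    dP = random.uniform(1e6, 8e6)           # pressure difference, Pa
    samples.append(brine_inflow(perm, dP))

samples.sort()
print("median inflow [m^3/s]:", statistics.median(samples))
print("95th percentile      :", samples[int(0.95 * len(samples))])
```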
Evaluation of Five Sedimentary Rocks Other Than Salt for Geologic Repository Siting Purposes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croff, A.G.; Lomenick, T.F.; Lowrie, R.S.
The US Department of Energy (DOE), in order to increase the diversity of rock types under consideration by the geologic disposal program, initiated the Sedimentary Rock Program (SERP), whose immediate objective is to evaluate five types of sedimentary rock - sandstone, chalk, carbonate rocks (limestone and dolostone), anhydrock, and shale - to determine their potential for siting a geologic repository. The evaluation of these five rock types, together with the ongoing salt studies, effectively results in the consideration of all types of relatively impermeable sedimentary rock for repository purposes. The results of this evaluation are expressed in terms of a ranking of the five rock types with respect to their potential to serve as a geologic repository host rock. This comparative evaluation was conducted on a non-site-specific basis, by use of generic information together with rock evaluation criteria (RECs) derived from the DOE siting guidelines for geologic repositories (CFR 1984). An information base relevant to rock evaluation using these RECs was developed in hydrology, geochemistry, rock characteristics (rock occurrences, thermal response, rock mechanics), natural resources, and rock dissolution. Evaluation against postclosure and preclosure RECs yielded a ranking of the five subject rocks with respect to their potential as repository host rocks. Shale was determined to be the most preferred of the five rock types, with sandstone a distant second, the carbonate rocks and anhydrock a more distant third, and chalk a relatively close fourth.
Testing of candidate waste-package backfill and canister materials for basalt
NASA Astrophysics Data System (ADS)
Wood, M. I.; Anderson, W. J.; Aden, G. D.
1982-09-01
The Basalt Waste Isolation Project (BWIP) is developing a multiple-barrier waste package to contain high-level nuclear waste as part of an overall system (e.g., waste package, repository sealing system, and host rock) designed to isolate the waste in a repository located in basalt beneath the Hanford Site, Richland, Washington. The three basic components of the waste package are the waste form, the canister, and the backfill. An extensive testing program is under way to determine the chemical, physical, and mechanical properties of potential canister and backfill materials. The data derived from this testing program will be used to recommend those materials that most adequately perform the functions assigned to the canister and backfill.
Unified Database Development Program. Final Report.
ERIC Educational Resources Information Center
Thomas, Everett L., Jr.; Deem, Robert N.
The objective of the unified database (UDB) program was to develop an automated information system that would be useful in the design, development, testing, and support of new Air Force aircraft weapon systems. Primary emphasis was on the development of: (1) a historical logistics data repository system to provide convenient and timely access to…
The United States Antarctic Program Data Center (USAP-DC): Recent Developments
NASA Astrophysics Data System (ADS)
Nitsche, F. O.; Bauer, R.; Arko, R. A.; Shane, N.; Carbotte, S. M.; Scambos, T.
2017-12-01
Antarctic earth and environmental science data are highly valuable, often unique research assets. They are acquired with substantial and expensive logistical effort, frequently in areas that will not be re-visited for many years. The data acquired in support of Antarctic research span a wide range of disciplines. Historically, data management for the US Antarctic Program (USAP) has made use of existing disciplinary data centers, and the international Antarctic Master Directory (AMD) has served as a central metadata catalog linking to data files hosted in these external repositories. However, disciplinary repositories do not exist for all USAP-generated data types, and often it is unclear what repositories are appropriate, leading to many datasets being served locally from scientists' websites or not available at all. The USAP Data Center (USAP-DC; www.usap-dc.org), operated as part of the Interdisciplinary Earth Data Alliance (IEDA), contributes to the broader preservation of research data acquired with funding from NSF's Office of Polar Programs by providing a repository for diverse data from the Antarctic region. USAP-DC hosts data that span the range of Antarctic research from snow radar to volcano observatory imagery to penguin counts to meteorological model outputs. Data services include data documentation, long-term preservation, and web publication, as well as scientist support for registration of data descriptions into the AMD in fulfillment of US obligations under the International Antarctic Treaty. In Spring 2016, USAP-DC and the NSIDC began a new collaboration to consolidate data services for Antarctic investigators and to integrate the NSF-funded glaciology collection at NSIDC with the collection hosted by USAP-DC. Investigator submissions for NSF's Glaciology program now make use of USAP-DC's web submission tools, providing a uniform interface for Antarctic investigators. The tools have been redesigned to collect a broader range of metadata. Each data submission is reviewed and verified by a specialist from the USAP-DC/NSIDC team, depending on the disciplinary focus of the submission. A recently updated web search interface is available to search data by title, NSF program, award, dataset contributor, large-scale project (e.g. WAIS Divide Ice Core), or by specifying an area in map view.
Ocean Drilling Program: Completed Legs
Table of completed Ocean Drilling Program legs, listing for each leg its number, dates, ports of call, sites, co-chief scientists, staff scientist, and links to the leg summary and core repository (e.g., Leg 102, 14-Mar-85 to 25-Apr-85, Miami, Florida, Site 418 on the Bermuda Rise; and the New Jersey Sea-Level Transect legs at Sites 902-906 and 1071-1073).
High Integrity Can Design Interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaber, E.L.
1998-08-01
The National Spent Nuclear Fuel Program is chartered with facilitating the disposition of DOE-owned spent nuclear fuel to allow disposal at a geologic repository. This is done through coordination with the repository program and by assisting DOE Site owners of SNF with needed information, standardized requirements, packaging approaches, etc. The High Integrity Can (HIC) will be manufactured to provide a substitute or barrier enhancement for normal fuel geometry and cladding. The can would be nested inside the DOE standardized canister, which is designed to interface with the repository waste package. The HIC approach may provide the following benefits over typical canning approaches for DOE SNF. (a) It allows ready calculation and management of criticality issues for miscellaneous SNF. (b) It segments and further isolates damaged or otherwise problem materials from normal SNF in the repository package. (c) It provides a very long term corrosion barrier. (d) It provides an extra internal pressure barrier for particulates, gaseous fission products, hydrogen, and water vapor. (e) It delays any potential release of fission products to the repository environment. (f) It maintains an additional level of fuel geometry control during design basis accidents, rock-fall, and seismic events. (g) When seal welded, it could provide the additional containment required for shipments involving plutonium content in excess of 20 Ci (10 CFR 71.63(b)) if integrated with an appropriate cask design. Long term corrosion protection is central to the HIC concept. The material selected for the HIC (Hastelloy C-22) has undergone extensive testing for repository service. The most severe theoretical interactions between iron, repository water containing chlorides, and other repository construction materials have been tested. These expected chemical species have not been shown capable of corroding the selected HIC material. Therefore, the HIC should provide a significant barrier to DOE SNF dispersal long after most commercial SNF has degraded and begun moving into the repository environment.
NASA Astrophysics Data System (ADS)
Stall, S.
2016-02-01
Emerging data management mandates, in conjunction with cross-domain international interoperability, are posing new challenges for researchers and repositories. Domain repositories serve a critical, growing role in monitoring and leading data management standards and capabilities within their own holdings and in working on mappings between repositories internationally. Leading research institutions and companies will also be important as they develop and expand data curation efforts. This landscape poses a number of challenges for developing and ensuring the use of best practices in curating research data, enabling discovery, elevating quality across diverse repositories, and helping researchers collect and organize their data through the full data life cycle. This multidimensional challenge will continue to grow in complexity. The American Geophysical Union (AGU) is developing two programs to help researchers and data repositories develop and elevate best practices and address these challenges. The goal is to provide tools for researchers and repositories, whether domain, institutional, or other, that improve performance throughout the data life cycle across the Earth and space science community. For scientists and researchers, AGU is developing courses on handling data that can lead toward a certification in geoscience data management. Course materials will cover metadata management and collection, data analysis, integration of data, and data presentation. The course topics are being finalized by the advisory board, with the first course planned to be available later this year. AGU is also developing a program aimed at helping data repositories, large and small, domain-specific to general, assess and improve data management practices. AGU has partnered with the CMMI Institute to adapt its Data Management Maturity (DMM) framework to the Earth and space sciences. A data management assessment using the DMM involves identifying accomplishments and weaknesses compared to leading practices for data management. Recommendations can help improve quality and consistency across the community and facilitate reuse throughout the data life cycle. Through the governance, quality, and architecture process areas, the assessment can measure the ability for data to be discoverable and interoperable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faybishenko, Boris; Birkholzer, Jens; Sassani, David
The overall objective of the Fifth Worldwide Review (WWR-5) is to document the current state of the art of major developments in a number of nations throughout the world pursuing geological disposal programs, and to summarize challenging problems and experience gained in siting, preparing, and reviewing cases for the operational and long-term safety of proposed and operating nuclear waste repositories. The scope of the Review is to address current specific technical issues and challenges in safety case development along with the interplay of technical feasibility, siting, engineering design issues, and operational and post-closure safety. In particular, the chapters included in the report present the following types of information: the current status of the deep geological repository programs for high-level nuclear waste and low- and intermediate-level nuclear waste in each country; concepts of siting and radioactive waste and spent nuclear fuel management in different countries (with emphasis on nuclear waste disposal under different climatic conditions and in different geological formations); progress in repository site selection and site characterization; technology development; buffer/backfill materials studies and testing; support activities, programs, and projects; international cooperation and future plans; and regulatory issues and transboundary problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
St. John, C.M.
1977-04-01
An underground repository containing heat-generating, High Level Waste or Spent Unreprocessed Fuel may be approximated as a finite number of heat sources distributed across the plane of the repository. The resulting temperature, displacement, and stress changes may be calculated using analytical solutions, provided linear thermoelasticity is assumed. This report documents a computer program based on this approach and gives results that form the basis for a comparison between the effects of disposing of High Level Waste and Spent Unreprocessed Fuel.
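To make the superposition approach concrete, the following is a minimal sketch (not the report's program) that sums the classical instantaneous point-source conduction solution over a grid of canister locations; the thermal properties, source energy, and geometry are illustrative assumptions.

```python
import numpy as np

def delta_T_instant(q_joules, r, t, kappa, rho_c):
    """Temperature rise at distance r and time t from an instantaneous
    point heat release q_joules in an infinite medium (classical
    conduction solution)."""
    return (q_joules / (rho_c * (4.0 * np.pi * kappa * t) ** 1.5)
            * np.exp(-r**2 / (4.0 * kappa * t)))

# Illustrative rock properties (not from the report).
kappa = 1.0e-6   # thermal diffusivity, m^2/s
rho_c = 2.2e6    # volumetric heat capacity, J/(m^3 K)

# Canisters idealized as point sources on a 10 m grid in the plane z = 0.
sources = [(x, y, 0.0) for x in range(0, 50, 10) for y in range(0, 50, 10)]
q = 1.0e12       # heat released per canister, J (illustrative)

obs = np.array([25.0, 25.0, 5.0])     # observation point 5 m above the plane
t = 10.0 * 365.25 * 86400.0           # 10 years, in seconds

# Linear superposition, valid under the report's thermoelastic assumption.
dT = sum(delta_T_instant(q, np.linalg.norm(obs - np.array(s)), t, kappa, rho_c)
         for s in sources)
print(f"Temperature rise after 10 years: {dT:.2f} K")
```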
SeaView: bringing EarthCube to the Oceanographer
NASA Astrophysics Data System (ADS)
Stocks, K. I.; Diggs, S. C.; Arko, R. A.; Kinkade, D.; Shepherd, A.
2016-12-01
As new instrument types are developed and new observational programs start that support a growing community of "dry" oceanographers, the ability to find, access, and visualize existing data of interest becomes increasingly critical. Yet ocean data, when available, are held in multiple data facilities, in different formats, and accessible through different pathways. This creates practical problems with integrating and working across different data sets. The SeaView project is building connections between the rich data resources in five major oceanographic data facilities - BCO-DMO, CCHDO, OBIS, OOI, and R2R* - creating a federated set of thematic data collections that are organized around common characteristics (geographic location, time, expedition, program, data type, etc.) and published online in Web Accessible Folders using standard file formats such as ODV and NetCDF. The work involves not simply reformatting data but identifying and, where possible, addressing interoperability challenges: common identifiers for core concepts that can connect data across repositories; terms a scientist may want to search that, if added to the data repositories, will increase discoverability; the presence of duplicate data across repositories; etc. We will present the data collections available to date, including data from the OOI Pioneer Array region, and seek scientists' input on the data types and formats they prefer, the tools they use to analyze and visualize data, and their specific recommendations for future data collections to support oceanographic science. * Biological and Chemical Oceanography Data Management Office (BCO-DMO), CLIVAR and Carbon Hydrographic Data Office (CCHDO), International Ocean Biogeographic Information System (iOBIS), Ocean Observatories Initiative (OOI), and Rolling Deck to Repository (R2R) Program.
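As an illustration of the kind of cross-repository linkage the project describes, the sketch below joins two hypothetical repository extracts on a shared cruise identifier using pandas; the repository names are real, but the records, field names, and identifier value are invented for the example.

```python
import pandas as pd

# Hypothetical extracts from two repositories, keyed on a shared cruise ID.
cchdo = pd.DataFrame({"cruise_id": ["33RR20160208"], "ctd_profiles": [120]})
r2r   = pd.DataFrame({"cruise_id": ["33RR20160208"], "vessel": ["Roger Revelle"]})

# An outer join exposes which records match across repositories and which
# exist in only one; rows flagged 'both' share a common identifier.
merged = cchdo.merge(r2r, on="cruise_id", how="outer", indicator=True)
print(merged)
```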
Industrial Program of Waste Management - Cigeo Project - 13033
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butez, Marc; Bartagnon, Olivier; Gagner, Laurent
2013-07-01
The French Planning Act of 28 June 2006 prescribed that a reversible repository in a deep geological formation be chosen as the reference solution for the long-term management of high-level and intermediate-level long-lived radioactive waste. It also entrusted responsibility for further studies and design of the repository (named Cigeo) to the French Radioactive Waste Management Agency (Andra), in order for the review of the creation-license application to start in 2015 and, subject to its approval, the commissioning of the repository to take place in 2025. Andra is responsible for siting, designing, implementing, and operating the future geological repository, including operational and long-term safety and waste acceptance. Nuclear operators (Electricite de France (EDF), AREVA NC, and the French Commission in charge of Atomic Energy and Alternative Energies (CEA)) are technically and financially responsible for the waste they generate, with no limit in time. They provide Andra, on the one hand, with waste-package input data and, on the other hand, with their long-term industrial experience of high- and intermediate-level long-lived radwaste management and nuclear operation. Andra, EDF, AREVA, and CEA established a cooperation agreement to strengthen their collaboration in these fields. Within this agreement, Andra and the nuclear operators have defined an industrial program for waste management. This program includes the waste inventory to be taken into account for the design of the Cigeo project and the structural hypotheses underlying its phased development. It schedules the delivery of the different categories of waste and defines the associated flows. (authors)
Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.
2011-01-01
The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for download from a U.S. Geological Survey software repository, accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.
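MODPATH-style tracking is built on Pollock's semi-analytical method, which assumes linear velocity variation within each cell. The sketch below (not the USGS code) shows the one-dimensional core of that method and checks that the travel time through a parent cell equals the sum over two refined child cells, the kind of consistency a locally refined grid requires; values are illustrative.

```python
import math

def pollock_time_to_right_face(xp, x0, dx, v1, v2):
    """Travel time from xp to the right face of a cell [x0, x0+dx],
    with face velocities v1, v2 > 0 and linear interpolation between
    them (the core of Pollock's semi-analytical method)."""
    A = (v2 - v1) / dx               # velocity gradient in the cell
    vp = v1 + A * (xp - x0)          # velocity at the particle position
    if abs(A) < 1e-12:               # uniform velocity: straight kinematics
        return (x0 + dx - xp) / vp
    return math.log(v2 / vp) / A     # exponential growth of velocity in x

# A parent cell split into two child cells (LGR-style refinement):
# the travel times agree (both are ln(2)/0.1, about 6.93).
t_parent = pollock_time_to_right_face(0.0, 0.0, 10.0, 1.0, 2.0)
t_child1 = pollock_time_to_right_face(0.0, 0.0, 5.0, 1.0, 1.5)
t_child2 = pollock_time_to_right_face(5.0, 5.0, 5.0, 1.5, 2.0)
print(t_parent, t_child1 + t_child2)
```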
Multi-institutional tumor banking: lessons learned from a pancreatic cancer biospecimen repository.
Demeure, Michael J; Sielaff, Timothy; Koep, Larry; Prinz, Richard; Moser, A James; Zeh, Herb; Hostetter, Galen; Black, Jodi; Decker, Ardis; Rosewell, Sandra; Bussey, Kimberly J; Von Hoff, Daniel
2010-10-01
Clinically annotated pancreatic cancer samples are needed for progress to be made toward developing more effective treatments for this deadly cancer. As part of a National Cancer Institute-funded program project, we established a biospecimen core to support the research efforts. This article summarizes the key hurdles encountered and solutions we found in the process of developing a successful multi-institution biospecimen repository.
Evolving the Living With a Star Data System Definition
NASA Astrophysics Data System (ADS)
Otranto, J.; Dijoseph, M.; Worrall, W.
2003-04-01
NASA’s Living With a Star (LWS) Program is a space weather-focused and applications-driven research program. The LWS Program is soliciting input from the solar, space physics, space weather, and climate science communities to develop a system that enables access to science data associated with these disciplines, and advances the development of discipline and interdisciplinary findings. The LWS Program will implement a data system that builds upon the existing and planned data capture, processing, and storage components put in place by individual spacecraft missions and also inter-project data management systems, such as active archives, deep archives, and multi-mission repositories. It is technically feasible for the LWS Program to integrate data from a broad set of resources, assuming they are either publicly accessible or access is permitted by the system’s administrators. The LWS Program data system will work in coordination with spacecraft mission data systems and science data repositories, integrating them into a common data representation. This common representation relies on a robust metadata definition that provides journalistic and technical data descriptions, plus linkages to supporting data products and tools. The LWS Program intends to become an enabling resource to PIs, interdisciplinary scientists, researchers, and students, facilitating both access to a broad collection of science data and the necessary supporting components to understand and make productive use of the data. For the LWS Program to represent science data that are physically distributed across various ground system elements, information about the data products stored on each system is collected through a series of LWS-created active agents. These active agents are customized to interface or interact with each one of these data systems, collect information, and forward updates to a single LWS-developed metadata broker. This broker, in turn, updates a centralized repository of LWS-specific metadata. A populated LWS metadata database is a single point-of-contact that can serve all users (the science community) with a “one-stop-shop” for data access. While data may not be physically stored in an LWS-specific repository, the LWS system enables data access from wherever the data are stored. Moreover, LWS provides the user access to information for understanding the data source, format, and calibration, enables access to ancillary and correlative data products, and provides links to processing tools and models associated with the data and any corresponding findings. The LWS may also support an active archive for solar, space physics, space weather, and climate data when these data would otherwise be discarded or archived off-line. This archive could potentially serve as a backup facility for LWS missions. This plan is developed based upon input already received from the science community; the architecture is based on systems developed to date that have worked well on a smaller scale. The LWS Program continues to seek constructive input from the science community, examples of both successes and failures in dealing with science data systems, and insights regarding the obstacles between the current state-of-the-practice and this vision for the LWS Program data system.
NASA Astrophysics Data System (ADS)
McWhirter, J.; Boler, F. M.; Bock, Y.; Jamason, P.; Squibb, M. B.; Noll, C. E.; Blewitt, G.; Kreemer, C. W.
2010-12-01
Three geodesy Archive Centers, Scripps Orbit and Permanent Array Center (SOPAC), NASA's Crustal Dynamics Data Information System (CDDIS) and UNAVCO are engaged in a joint effort to define and develop a common Web Service Application Programming Interface (API) for accessing geodetic data holdings. This effort is funded by the NASA ROSES ACCESS Program to modernize the original GPS Seamless Archive Centers (GSAC) technology which was developed in the 1990s. A new web service interface, the GSAC-WS, is being developed to provide uniform and expanded mechanisms through which users can access our data repositories. In total, our respective archives hold tens of millions of files and contain a rich collection of site/station metadata. Though we serve similar user communities, we currently provide a range of different access methods, query services and metadata formats. This leads to a lack of consistency in the user's experience and a duplication of engineering efforts. The GSAC-WS API and its reference implementation in an underlying Java-based GSAC Service Layer (GSL) supports metadata and data queries into site/station oriented data archives. The general nature of this API makes it applicable to a broad range of data systems. The overall goals of this project include providing consistent and rich query interfaces for end users and client programs, developing enabling technology that helps third-party repositories offer these web service capabilities, and enabling data queries across a collection of federated GSAC-WS enabled repositories. A fundamental challenge faced in this project is to provide a common suite of query services across a heterogeneous collection of data while enabling each repository to expose its specific metadata holdings. To address this challenge we are developing a "capabilities" based service where a repository can describe its specific query and metadata capabilities. Furthermore, the architecture of the GSL is based on a model-view paradigm that decouples the underlying data model semantics from particular representations of the data model. This will allow GSAC-WS enabled repositories to evolve their service offerings to incorporate new metadata definition formats (e.g., ISO-19115, FGDC, JSON, etc.) and new techniques for accessing their holdings. Building on the core GSAC-WS implementations, the project is also developing a federated/distributed query service. This service will seamlessly integrate with the GSAC Service Layer and will support data and metadata queries across a collection of federated GSAC repositories.
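A hedged sketch of what a client of such a capabilities-based service might look like; the endpoint paths and JSON field names below are invented for illustration and are not the published GSAC-WS specification.

```python
import requests  # pip install requests

BASE = "https://repository.example.org/gsacws"  # hypothetical GSAC-WS host

def get_capabilities():
    """Ask the repository which query parameters it supports
    (endpoint path and response shape are assumptions)."""
    return requests.get(f"{BASE}/capabilities", timeout=30).json()

def find_sites(**query):
    """Issue a site/station query, keeping only the parameters the
    repository advertises in its capabilities document."""
    supported = {c["name"] for c in get_capabilities()["queryCapabilities"]}
    params = {k: v for k, v in query.items() if k in supported}
    return requests.get(f"{BASE}/sites/search", params=params, timeout=30).json()

# A federated query is then just the union of find_sites() results
# over several GSAC-enabled hosts.
```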
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-04-01
During the second half of fiscal year 1996, activities at the Yucca Mountain Site Characterization Project (Project) supported the objectives of the revised Program Plan released this period by the Office of Civilian Radioactive Waste Management of the US Department of Energy (Department). Outlined in the revised plan is a focused, integrated program of site characterization, design, engineering, environmental, and performance assessment activities that will achieve key Program and statutory objectives. The plan will result in the development of a license application for repository construction at Yucca Mountain, if the site is found suitable. Activities this period focused on two of the three near-term objectives of the revised plan: updating in 1997 the regulatory framework for determining the suitability of the site for the proposed repository concept and providing information for a 1998 viability assessment of continuing toward the licensing of a repository. The Project has also developed a new design approach that uses the advanced conceptual design published during the last reporting period as a base for developing a design that will support the viability assessment. The initial construction phase of the Thermal Testing Facility was completed and the first phase of the in situ heater tests began on schedule. In addition, phase-one construction was completed for the first of two alcoves that will provide access to the Ghost Dance fault.
NASA Astrophysics Data System (ADS)
Huang, Wei-Hsing
2017-04-01
The clay barrier plays a major role in the isolation of radioactive wastes in an underground repository. This paper investigates the resaturation behavior of the clay barrier, with emphasis on the coupled effects of heat and moisture in the buffer material in the near-field of a repository during groundwater intrusion. A locally available clay named "Zhisin clay" and a standard bentonite material were adopted in the laboratory program. Water uptake tests were conducted on clay specimens compacted at various densities to simulate the intrusion of groundwater into the buffer material. Soil suction of the clay specimens was measured by psychrometers embedded in the specimens and by the vapor equilibrium technique conducted at varying temperatures. Using the soil water characteristic curve, an integration scheme was introduced to estimate the hydraulic conductivity of the unsaturated clay. The finite element program ABAQUS was then employed to carry out a numerical simulation of the saturation process in the near field of a repository. Results of the numerical simulation were validated against the degree-of-saturation profiles obtained from the water uptake tests on Zhisin clay. The numerical scheme was then extended to establish a model simulating the resaturation process after the closure of a repository. It was found that, because the suction and thermal conductivity of the clay barrier material vary with temperature, the calculated temperature field is reduced when these hydro-properties are incorporated in the calculations.
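The abstract does not give the paper's integration scheme, but a common closed-form way to estimate unsaturated hydraulic conductivity from a soil-water characteristic curve is the van Genuchten-Mualem model, sketched below with illustrative parameters (not measured Zhisin clay values).

```python
def van_genuchten_mualem(suction_kpa, Ks, alpha, n):
    """Unsaturated hydraulic conductivity from the soil-water
    characteristic curve via the van Genuchten-Mualem closed form.
    Ks: saturated conductivity (m/s); alpha: 1/kPa; n > 1.
    Parameter values here are illustrative assumptions."""
    m = 1.0 - 1.0 / n
    Se = (1.0 + (alpha * suction_kpa) ** n) ** (-m)  # effective saturation
    return Ks * Se**0.5 * (1.0 - (1.0 - Se ** (1.0 / m)) ** m) ** 2

# Conductivity drops steeply as suction rises (drier clay).
for psi in (10.0, 100.0, 1000.0):   # suction, kPa
    print(psi, van_genuchten_mualem(psi, Ks=1e-12, alpha=0.01, n=1.3))
```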
Brownell, Elizabeth A; Lussier, Mary M; Herson, Victor C; Hagadorn, James I; Marinelli, Kathleen A
2014-02-01
The Human Milk Banking Association of North America (HMBANA) is a nonprofit association that standardizes and facilitates the establishment and operation of donor human milk (DHM) banks in North America. Each HMBANA milk bank in the network collects data on the DHM it receives and distributes, but a centralized data repository does not yet exist. In 2010, the Food and Drug Administration recognized the need to collect and disseminate systematic, standardized DHM bank data and suggested that HMBANA develop a DHM data repository. This study aimed to describe data currently collected by HMBANA DHM banks and evaluate feasibility and interest in participating in a centralized data repository. We conducted phone interviews with individuals in different HMBANA milk banks and summarized descriptive statistics. Eight of 13 (61.5%) sites consented to participate. All respondents collected donor demographics, and half (50%; n = 4) rescreened donors after 6 months of continued donation. The definition of preterm milk varied between DHM banks (≤ 32 to ≤ 40 weeks). The specific computer program used to house the data also differed. Half (50%; n = 4) indicated that they would consider participation in a centralized repository. Without standardized data across all HMBANA sites, the creation of a centralized data repository is not yet feasible. Lack of standardization and transparency may deter implementation of donor milk programs in the neonatal intensive care unit setting and hinder benchmarking, research, and quality improvement initiatives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
T. Burgess; M. Noakes; P. Spampinato
This paper presents an evaluation of robotics and remote handling technologies that have the potential to increase the efficiency of handling waste packages at the proposed Yucca Mountain High-Level Nuclear Waste Repository. It is expected that increased efficiency will reduce the cost of operations. The goal of this work was to identify technologies for consideration as potential projects that the U.S. Department of Energy Office of Civilian Radioactive Waste Management, Office of Science and Technology International Programs, could support in the near future, and to assess their "payback" value. The evaluation took into account the robotics and remote handling capabilities planned for incorporation into the current baseline design for the repository, for both surface and subsurface operations. The evaluation, completed at the end of fiscal year 2004, identified where significant advantages in operating efficiencies could accrue by implementing any given robotics technology or approach, and included a road map for a multiyear R&D program for improvements to remote handling technology that support operating enhancements.
A Safety Case Approach for Deep Geologic Disposal of DOE HLW and DOE SNF in Bedded Salt - 13350
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevougian, S. David; MacKinnon, Robert J.; Leigh, Christi D.
2013-07-01
The primary objective of this study is to investigate the feasibility and utility of developing a defensible safety case for disposal of United States Department of Energy (U.S. DOE) high-level waste (HLW) and DOE spent nuclear fuel (SNF) in a conceptual deep geologic repository that is assumed to be located in a bedded salt formation of the Delaware Basin [1]. A safety case is a formal compilation of evidence, analyses, and arguments that substantiate and demonstrate the safety of a proposed or conceptual repository. We conclude that a strong initial safety case for potential licensing can be readily compiled by capitalizing on the extensive technical basis that exists from prior work on the Waste Isolation Pilot Plant (WIPP), other U.S. repository development programs, and the work published through international efforts in salt repository programs such as in Germany. The potential benefits of developing a safety case include leveraging previous investments in WIPP to reduce future new repository costs, enhancing the ability to effectively plan for a repository and its licensing, and possibly expediting a schedule for a repository. A safety case will provide the necessary structure for organizing and synthesizing existing salt repository science and identifying any issues and gaps pertaining to safe disposal of DOE HLW and DOE SNF in bedded salt. The safety case synthesis will help DOE to plan its future R&D activities for investigating salt disposal using a risk-informed approach that prioritizes test activities that include laboratory, field, and underground investigations. It should be emphasized that the DOE has not made any decisions regarding the disposition of DOE HLW and DOE SNF. Furthermore, the safety case discussed herein is not intended either to site a repository in the Delaware Basin or to preclude siting in other media at other locations. Rather, this study simply presents an approach for accelerated development of a safety case for a potential DOE HLW and DOE SNF repository using the currently available technical basis for bedded salt. This approach includes a summary of the regulatory environment relevant to disposal of DOE HLW and DOE SNF in a deep geologic repository, the key elements of a safety case, the evolution of the safety case through the successive phases of repository development and licensing, and the existing technical basis that could be used to substantiate the safety of a geologic repository if it were to be sited in the Delaware Basin. We also discuss the potential role of an underground research laboratory (URL). (authors)
2011-01-01
The U.S. Congress authorized a library for the U.S. Geological Survey (USGS) in 1879. The library was formally established in 1882 with the naming of the first librarian and began with a staff of three and a collection of 1,400 books. Today, the USGS Libraries Program is one of the world's largest Earth and natural science repositories and a resource of national significance used by researchers and the public worldwide.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1985-12-31
In 1982, the Congress enacted the Nuclear Waste Policy Act (Public Law 97-425), which established a comprehensive national program directed toward siting, constructing, and operating geologic repositories for the permanent disposal of high-level radioactive waste. In February 1983, the United States Department of Energy (DOE) identified the nine referenced repository locations as potentially acceptable sites for a mined geologic repository. These sites have been evaluated in accordance with the DOE's General Guidelines for the Recommendation of Sites for Nuclear Waste Repositories. The DOE findings and determinations are based on the evaluations contained in the draft Environmental Assessments (EA). A final EA will be prepared after considering the comments received on the draft EA. The purpose of this document is to provide the public with specific site information on each potential repository location.
NASA Technical Reports Server (NTRS)
Kuhn, Allan D.
1991-01-01
The Defense Technical Information Center (DTIC), the central repository for DOD scientific and technical information concerning studies, research, and engineering efforts, is discussed. The present makeup of DTIC is described and its functions in producing technical reports and technical report bibliographies are examined. DTIC's outreach services are reviewed, as are its information and technology transfer programs. DTIC's plans for the year 2000 and its relation to the mission of the U.S. Air Force, including the Air Force's STINFO program, are addressed.
Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Rzepa, Henry S
2015-01-01
We describe three different procedures based on metadata standards for enabling automated retrieval of scientific data from digital repositories utilising the persistent identifier of the dataset with optional specification of the attributes of the data document such as filename or media type. The procedures are demonstrated using the JSmol molecular visualizer as a component of a web page and Avogadro as a stand-alone modelling program. We compare our methods for automated retrieval of data from a standards-compliant data repository with those currently in operation for a selection of existing molecular databases and repositories. Our methods illustrate the importance of adopting a standards-based approach of using metadata declarations to increase access to and discoverability of repository-based data.
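A minimal sketch of retrieval via a dataset's persistent identifier, using DOI content negotiation to obtain machine-readable metadata; the DOI below is a placeholder, and the final file-retrieval step depends on the hosting repository.

```python
import requests  # pip install requests

doi = "10.14469/hpc/XXXX"   # placeholder dataset DOI, not a real record

# Machine-readable metadata via DOI content negotiation; doi.org honors
# this Accept header for DataCite-registered DOIs.
meta = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/vnd.datacite.datacite+json"},
    timeout=30,
).json()

# Related identifiers can point to the data documents themselves; how a
# specific file or media type is requested is repository-dependent.
for related in meta.get("relatedIdentifiers", []):
    print(related)
```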
National Programs | Frederick National Laboratory for Cancer Research
The Frederick National Laboratory is a shared national resource that offers access to a suite of advanced biomedical technologies, provides selected science and technology services, and maintains vast repositories of research materials available
Integrating computer programs for engineering analysis and design
NASA Technical Reports Server (NTRS)
Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.
1983-01-01
The design of a third-generation system for integrating computer programs for engineering analysis and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used as a repository for design data that are communicated between analysis programs, as a dictionary that describes these design data, as a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks performed by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.
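A minimal relational sketch, assuming invented table and column names, of the ARIS pattern described above: design data, a dictionary describing the data, and a directory describing which programs read and write it.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE design_data (name TEXT PRIMARY KEY, value REAL, units TEXT);
CREATE TABLE dictionary  (name TEXT PRIMARY KEY, description TEXT,
                          FOREIGN KEY (name) REFERENCES design_data(name));
CREATE TABLE directory   (program TEXT, reads TEXT, writes TEXT);
""")

# Design data shared between analysis programs, plus its dictionary entry
# and a directory row describing one analysis program's inputs/outputs.
con.execute("INSERT INTO design_data VALUES ('wing_span', 18.3, 'm')")
con.execute("INSERT INTO dictionary VALUES ('wing_span', 'Tip-to-tip span')")
con.execute("INSERT INTO directory VALUES ('aero_solver', 'wing_span', 'lift_curve')")

# Which programs consume a given design variable?
print(con.execute(
    "SELECT program FROM directory WHERE reads = 'wing_span'").fetchall())
```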
Evolving the Living With a Star Data System Definition
NASA Astrophysics Data System (ADS)
Otranto, J. F.; Dijoseph, M.
2003-12-01
NASA's Living With a Star (LWS) Program is a space weather-focused and applications-driven research program. The LWS Program is soliciting input from the solar, space physics, space weather, and climate science communities to develop a system that enables access to science data associated with these disciplines, and advances the development of discipline and interdisciplinary findings. The LWS Program will implement a data system that builds upon the existing and planned data capture, processing, and storage components put in place by individual spacecraft missions and also inter-project data management systems, including active and deep archives, and multi-mission data repositories. It is technically feasible for the LWS Program to integrate data from a broad set of resources, assuming they are either publicly accessible or allow access by permission. The LWS Program data system will work in coordination with spacecraft mission data systems and science data repositories, integrating their holdings using a common metadata representation. This common representation relies on a robust metadata definition that provides journalistic and technical data descriptions, plus linkages to supporting data products and tools. The LWS Program intends to become an enabling resource to PIs, interdisciplinary scientists, researchers, and students, facilitating both access to a broad collection of science data and the necessary supporting components to understand and make productive use of these data. For the LWS Program to represent science data that are physically distributed across various ground system elements, information will be collected about these distributed data products through a series of LWS Program-created agents. These agents will be customized to interface or interact with each one of these data systems, collect information, and forward any new metadata records to an LWS Program-developed metadata library. A populated LWS metadata library will function as a single point-of-contact that serves the entire science community as a first stop for data availability, whether or not science data are physically stored in an LWS-operated repository. Further, this metadata library will provide the user access to information for understanding these data including descriptions of the associated spacecraft and instrument, data format, calibration and operations issues, links to ancillary and correlative data products, links to processing tools and models associated with these data, and any corresponding findings produced using these data. The LWS may also support an active archive for solar, space physics, space weather, and climate data when these data would otherwise be discarded or archived off-line. This archive could potentially also serve as a data storage backup facility for LWS missions. The plan for the LWS Program metadata library is developed based upon input received from the solar and geospace science communities; the library's architecture is based on existing systems developed for serving science metadata. The LWS Program continues to seek constructive input from the science community, examples of both successes and failures in dealing with science data systems, and insights regarding the obstacles between the current state-of-the-practice and this vision for the LWS Program metadata library.
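A minimal in-process sketch of the agent/metadata-library pattern described above; the class structure, record fields, and source names are assumptions for illustration, not the LWS design.

```python
from dataclasses import dataclass, field

@dataclass
class MetadataLibrary:
    """Central LWS-style metadata library: one record per product ID."""
    records: dict = field(default_factory=dict)

    def update(self, rec):
        self.records[rec["id"]] = rec

class Agent:
    """Customized per data system; harvests records and forwards
    only the ones it has not seen before."""
    def __init__(self, source_name, fetch):
        self.source, self.fetch, self.seen = source_name, fetch, set()

    def poll(self, library):
        for rec in self.fetch():
            if rec["id"] not in self.seen:
                self.seen.add(rec["id"])
                library.update({**rec, "source": self.source})

library = MetadataLibrary()
agent = Agent("mission_archive",
              lambda: [{"id": "soho_eit_195", "format": "FITS"}])
agent.poll(library)   # the library is now the single point-of-contact
print(library.records)
```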
Measurement and Analysis of P2P IPTV Program Resource
Chen, Xingshu; Wang, Haizhou; Zhang, Qi
2014-01-01
With the rapid development of P2P technology, P2P IPTV applications have received more and more attention, and program resource distribution is very important to these applications. In order to collect IPTV program resources, a distributed multi-protocol crawler is proposed; the crawler collected more than 13 million records of IPTV programs from 2009 to 2012. In addition, IPTV programs are distributed independently and without coordination, resulting in chaotic program names that obstruct searching and organizing programs. We therefore focus on characteristic analysis of program resources, including the distributions of program-name length, the entropy of the character types, and the hierarchy depth of programs. These analyses reveal the disorderly naming conventions of P2P IPTV programs. The results can help to purify and extract useful information from chaotic names for better retrieval and can accelerate automatic sorting of programs and the establishment of an IPTV repository. In order to represent the popularity of programs and to predict user behavior and the popularity of hot programs over a period, we also put forward an analytical model of hot programs. PMID:24772008
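A sketch of the character-type entropy measure the paper applies to program names; the type buckets and sample names here are our assumptions, not the paper's exact definitions.

```python
import math
from collections import Counter

def char_type(ch):
    """Bucket a character into a coarse type (buckets are assumptions)."""
    if ch.isdigit():
        return "digit"
    if ch.isalpha():
        return "letter"   # CJK and Latin letters both land here
    if ch.isspace():
        return "space"
    return "symbol"

def type_entropy(name):
    """Shannon entropy (bits) of the character-type distribution."""
    counts = Counter(char_type(c) for c in name)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Chaotic names mix types and score higher than plain ones.
for nm in ["CCTV-1 高清", "Movie_2012[HD].rmvb", "sports"]:
    print(len(nm), round(type_entropy(nm), 3), nm)
```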
NASA Technical Reports Server (NTRS)
Mathur, F. P.
1972-01-01
Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation), which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially, CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
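As an example of the kind of redundancy-scheme equation such a repository would hold, the sketch below evaluates the standard triple-modular-redundancy reliability formula under an assumed constant failure rate; it is an illustration, not CARE itself.

```python
import math

def r_simplex(lam, t):
    """Reliability of a single (simplex) module with constant failure
    rate lam, i.e. exponentially distributed failures."""
    return math.exp(-lam * t)

def r_tmr(lam, t):
    """Triple modular redundancy with a perfect voter: the system
    survives if at least 2 of 3 modules survive."""
    R = r_simplex(lam, t)
    return 3.0 * R**2 - 2.0 * R**3

lam = 1e-4   # failures per hour (a "ground instance", illustrative)
for t in (100.0, 1000.0, 10000.0):
    print(t, r_simplex(lam, t), r_tmr(lam, t))
```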
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
Status of the Basalt Waste Isolation Project is given. Three key concerns have been identified that need to be resolved to either confirm or eliminate the basalts as a potential nuclear waste repository host medium. They are: A thorough understanding of the groundwater hydrology beneath the Hanford Site is needed to assure that a repository in basalt will not contribute unacceptable amounts of contaminants to the accessible environment. Our ability to construct a repository shaft and a network of underground tunnels needs to be fully demonstrated through an exploratory shaft program. Our ability to ultimately seal a repository, such that its integrity and the isolation of the waste are guaranteed, needs to be demonstrated.
Basic repository source term and data sheet report: Lavender Canyon
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-01-01
This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Lavender Canyon, Utah. 3 refs; 6 tabs.
NASA Technical Reports Server (NTRS)
Mckay, Charles
1991-01-01
This is the Configuration Management Plan for the AdaNet Repository Based Software Engineering (RBSE) contract. This document establishes the requirements and activities needed to ensure that the products developed for the AdaNet RBSE contract are accurately identified, that proposed changes to the product are systematically evaluated and controlled, that the status of all change activity is known at all times, and that the product achieves its functional performance requirements and is accurately documented.
Education and Outreach Plans for the U.S. Drillship in IODP
NASA Astrophysics Data System (ADS)
White, K. S.; Reagan, M.; Klaus, A. D.
2003-12-01
The Integrated Ocean Drilling Program (IODP) began on October 1, 2003, following the end of operations of the 20-year Ocean Drilling Program (ODP). Education and outreach is a key component of IODP both nationally and internationally. The JOI Alliance (Joint Oceanographic Institutions, Inc., Texas A&M University, and Lamont Doherty Earth Observatory of Columbia University) will lead activities related to the U.S. drillship, coordinating these education and outreach efforts with those undertaken by the Central Management Organization, other IODP platform operators, and a U.S. Science Support Program successor. The Alliance will serve the national and assist the international scientific drilling communities by providing the results from the U.S. vessel to the public, government representatives, and scientists. The Alliance will expand upon media outreach strategies that were successful in ODP, such as issuing press releases at the conclusion of each leg and for major scientific breakthroughs; conducting tours, press conferences, and events during port calls; working with the press at major scientific meetings; and encouraging journalists to sail on expeditions. The Alliance will increase its education role by developing, coordinating, and disseminating educational materials and programs for teachers and students on the scientific themes and discoveries of IODP science. An important component of the outreach plan is using the vessel and associated laboratories and repositories as classrooms. IODP plans include multiple ship berths each year for teachers, based on the success of a pilot program conducted by ODP in 2001. This program, featuring a teacher onboard for a cruise, was accompanied by a distance-learning program and on-line curriculum models. Teachers can tour, both virtually and directly, laboratories and core repositories and participate in scheduled activities and courses. Using science conducted onboard the ship, the Alliance will develop online curriculum materials, as well as publications and fact sheets geared toward nonscientists. The Alliance will partner with existing scientific and education organizations, including programs at their universities, to widely disseminate IODP results and materials.
Building a genome database using an object-oriented approach.
Barbasiewicz, Anna; Liu, Lin; Lang, B Franz; Burger, Gertraud
2002-01-01
GOBASE is a relational database that integrates data associated with mitochondria and chloroplasts. The most important data in GOBASE, i.e., molecular sequences and taxonomic information, are obtained from the public sequence data repository at the National Center for Biotechnology Information (NCBI), and are validated by our experts. Maintaining a curated genomic database comes with a towering labor cost, due to the sheer volume of available genomic sequences and the plethora of annotation errors and omissions in records retrieved from public repositories. Here we describe our approach to increase automation of the database population process, thereby reducing manual intervention. As a first step, we used Unified Modeling Language (UML) to construct a list of potential errors. Each case was evaluated independently, and an expert solution was devised and represented as a diagram. Subsequently, the UML diagrams were used as templates for writing object-oriented automation programs in the Java programming language.
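A minimal sketch of the rule-based validation pattern described above, written in Python rather than the project's Java; the record fields and checks are illustrative, not GOBASE's actual rules.

```python
from dataclasses import dataclass

@dataclass
class SequenceRecord:
    accession: str
    organism: str
    sequence: str

def validate(rec):
    """Apply curation rules of the kind GOBASE encodes; the rules
    below are illustrative assumptions, not the project's checks."""
    errors = []
    if not rec.organism:
        errors.append("missing organism")
    if set(rec.sequence.upper()) - set("ACGTN"):
        errors.append("non-nucleotide characters in sequence")
    return errors

# A record with two annotation problems, both caught automatically.
print(validate(SequenceRecord("NC_001224", "", "ACGTX")))
```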
SINGLE HEATER TEST FINAL REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.B. Cho
The Single Heater Test is the first of the in-situ thermal tests conducted by the U.S. Department of Energy as part of its program of characterizing Yucca Mountain in Nevada as the potential site for a proposed deep geologic repository for the disposal of spent nuclear fuel and high-level nuclear waste. The Site Characterization Plan (DOE 1988) contained an extensive plan of in-situ thermal tests aimed at understanding specific aspects of the response of the local rock mass around the potential repository to the heat from the radioactive decay of the emplaced waste. With the refocusing of the Site Characterization Plan by the "Civilian Radioactive Waste Management Program Plan" (DOE 1994), a consolidated thermal testing program emerged by 1995, as documented in the reports "In-Situ Thermal Testing Program Strategy" (DOE 1995) and "Updated In-Situ Thermal Testing Program Strategy" (CRWMS M&O 1997a). The concept of the Single Heater Test took shape in the summer of 1995, and detailed planning and design of the test started at the beginning of fiscal year 1996. The overall objective of the Single Heater Test was to gain an understanding of the coupled thermal, mechanical, hydrological, and chemical processes that are anticipated to occur in the local rock mass in the potential repository as a result of heat from radioactive decay of the emplaced waste. This included making a priori predictions of the test results using existing models and subsequently refining or modifying the models on the basis of comparative and interpretive analyses of the measurements and predictions. A second, no less important, objective was to try out, in a full-scale field setting, the various instruments and equipment to be employed in the future on a much larger, more complex thermal test of longer duration, such as the Drift Scale Test. This "shake down" or trial aspect of the Single Heater Test applied not just to the hardware, but also to the teamwork and cooperation between the multiple organizations performing their parts in the test.
Current Status of The Romanian National Deep Geological Repository Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radu, M.; Nicolae, R.; Nicolae, D.
2008-07-01
Construction of a deep geological repository is a very demanding and costly task. To date, countries operating Candu reactors have not reprocessed their spent fuel, instead placing it in interim storage as a preliminary step toward final disposal at the back end of the nuclear fuel cycle. Romania, in comparison to other nations, has a rather small territory with a high population density, in which the geological formations with radioactive waste storage potential are limited and restricted not only by the selection criteria arising from the rocks' natural characteristics, but also by their involvement in social and economic activities. In the framework of the national R&D programs, a series of map investigations has been carried out regarding the selection and preliminary characterization of the host geological formation for the nation's spent fuel deep geological repository. The fact that Romania has many deposits of natural gas, oil, ore, and geothermal water, uses its soil intensively, and is heavily forested causes some apparently acceptable sites to be rejected in the subsequent analysis. Currently, according to the Law on spent fuel and radioactive waste management, including disposal, the National Agency of Radioactive Waste is responsible for and coordinates the national strategy in the field, and further actions will be decided accordingly. The Romanian National Strategy, approved in 2004, projects the operation of a deep geological repository to begin in 2055. (authors)
Ragoussi, Maria-Eleni; Costa, Davide
2017-03-14
For the last 30 years, the NEA Thermochemical Database (TDB) Project (www.oecd-nea.org/dbtdb/) has been developing a chemical thermodynamic database for elements relevant to the safety of radioactive waste repositories, providing data that are vital to support the geochemical modeling of such systems. The recommended data are selected on the basis of strict review procedures and are characterized by their consistency. The results of these efforts are freely available, and have become an international point of reference in the field. As a result, a number of important national initiatives with regard to waste management programs have used the NEA TDB as their basis, both in terms of recommended data and guidelines. In this article we describe the fundamentals and achievements of the project together with the characteristics of some databases developed in national nuclear waste disposal programs that have been influenced by the NEA TDB. We also give some insights on how this work could be seen as an approach to be used in broader areas of environmental interest.
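As a small example of how selected TDB-style constants feed geochemical modeling, the sketch below converts a log10 K equilibrium constant to a standard reaction Gibbs energy via the standard relation ΔrG° = -RT ln(10) log10 K; the value used is illustrative, not an NEA-selected datum.

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

def delta_G_from_logK(log10_K, T=298.15):
    """Standard reaction Gibbs energy (kJ/mol) from a selected
    log10 K value at temperature T (K)."""
    return -R * T * math.log(10.0) * log10_K / 1000.0

# Illustrative value: log10 K = 14 corresponds to about -79.9 kJ/mol.
print(delta_G_from_logK(14.0))
```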
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-06-01
The study subject of this meeting was the adsorption and desorption of radionuclides on geologic media under repository conditions. This volume contains eight papers. Separate abstracts were prepared for all eight papers. (DLC)
NCTN/NCORP Data Archive: Expanding Access to Clinical Trial Data
NCI is launching the NCTN/NCORP Data Archive, a centralized repository of patient-level data from phase III clinical trials conducted by NCI’s NCTN and NCORP trials programs and the National Cancer Institute of Canada-Clinical Trials Group.
Review of DOE Waste Package Program. Semiannual report, October 1984-March 1985. Volume 8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, M.S.
1985-12-01
A large number of technical reports on waste package component performance were reviewed over the last year in support of the NRC's review of the Department of Energy's (DOE's) Environmental Assessment reports. The intent was to assess in some detail the quantity and quality of the DOE data and their relevance to the high-level waste repository site selection process. A representative selection of the reviews is presented for the salt, basalt, and tuff repository projects. Areas for future research have been outlined. 141 refs.
2014-01-01
[Fragmentary search-result snippet: an abstract on U.S. Department of Defense (DoD) mental health programs and the three-category literature-search methodology used to review them, interleaved with stray reference-list entries; the full record is not recoverable.]
Rolling Deck to Repository (R2R): Standards and Semantics for Open Access to Research Data
NASA Astrophysics Data System (ADS)
Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen
2015-04-01
In recent years, a growing number of funding agencies and professional societies have issued policies calling for open access to research data. The Rolling Deck to Repository (R2R) program is working to ensure open access to the environmental sensor data routinely acquired by the U.S. academic research fleet. Currently 25 vessels deliver 7 terabytes of data to R2R each year, acquired from a suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. R2R is working to ensure these data are preserved in trusted repositories, discoverable via standard protocols, and adequately documented for reuse. R2R maintains a master catalog of cruises for the U.S. academic research fleet, currently holding essential documentation for over 3,800 expeditions including vessel and cruise identifiers, start/end dates and ports, project titles and funding awards, science parties, dataset inventories with instrument types and file formats, data quality assessments, and links to related content at other repositories. A Digital Object Identifier (DOI) is published for 1) each cruise, 2) each original field sensor dataset, 3) each post-field data product such as quality-controlled shiptrack navigation produced by the R2R program, and 4) each document such as a cruise report submitted by the science party. Scientists are linked to personal identifiers, such as the Open Researcher and Contributor ID (ORCID), where known. Using standard global identifiers such as DOIs and ORCIDs facilitates linking with journal publications and generation of citation metrics. Since its inception, the R2R program has worked in close collaboration with other data repositories in the development of shared semantics for oceanographic research. The R2R cruise catalog uses community-standard terms and definitions hosted by the NERC Vocabulary Server, and publishes ISO metadata records for each cruise that use community-standard profiles developed with the NOAA Data Centers and the EU SeaDataNet project. R2R is a partner in the Ocean Data Interoperability Platform (ODIP), working to strengthen links among regional and national data systems, as well as a lead partner in the EarthCube "GeoLink" project, developing a standard set of ontology design patterns for publishing research data using Semantic Web protocols.
Bellman's GAP--a language and compiler for dynamic programming in sequence analysis.
Sauthoff, Georg; Möhl, Mathias; Janssen, Stefan; Giegerich, Robert
2013-03-01
Dynamic programming is ubiquitous in bioinformatics. Developing and implementing non-trivial dynamic programming algorithms is often error prone and tedious. Bellman's GAP is a new programming system, designed to ease the development of bioinformatics tools based on the dynamic programming technique. In Bellman's GAP, dynamic programming algorithms are described in a declarative style by tree grammars, evaluation algebras and products formed thereof. This bypasses the design of explicit dynamic programming recurrences and yields programs that are free of subscript errors, modular and easy to modify. The declarative modules are compiled into C++ code that is competitive to carefully hand-crafted implementations. This article introduces the Bellman's GAP system and its language, GAP-L. It then demonstrates the ease of development and the degree of re-use by creating variants of two common bioinformatics algorithms. Finally, it evaluates Bellman's GAP as an implementation platform of 'real-world' bioinformatics tools. Bellman's GAP is available under GPL license from http://bibiserv.cebitec.uni-bielefeld.de/bellmansgap. This Web site includes a repository of re-usable modules for RNA folding based on thermodynamics.
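As a plain-Python illustration of the dynamic programming technique that Bellman's GAP abstracts over (not GAP-L itself), here is the classic Needleman-Wunsch global alignment score recurrence.

```python
def nw_score(a, b, match=1, mismatch=-1, gap=-2):
    """Needleman-Wunsch global alignment score: the classic DP
    recurrence over prefixes of sequences a and b."""
    n, m = len(a), len(b)
    D = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * gap                 # align a's prefix against gaps
    for j in range(1, m + 1):
        D[0][j] = j * gap                 # align b's prefix against gaps
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i-1] == b[j-1] else mismatch
            D[i][j] = max(D[i-1][j-1] + s,    # substitute/match
                          D[i-1][j] + gap,    # gap in b
                          D[i][j-1] + gap)    # gap in a
    return D[n][m]

print(nw_score("GATTACA", "GCATGCU"))
```

Systems like Bellman's GAP generate recurrences of exactly this shape from a grammar and an evaluation algebra, which is what eliminates the subscript errors hand-written versions are prone to.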
78 FR 63455 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-24
..., Building 23, Columbus, OH 43213-1152. Defense Manpower Data Center, 400 Gigling Road, Seaside CA 93955... web-based system providing a repository of military, Government civilian and contractor personnel and..., tracking, reporting, evaluating program effectiveness and conducting research. The Total Operational...
10 CFR 60.140 - General requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.140 - General requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.140 - General requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.140 - General requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.140 - General requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
A Predictive Approach to Eliminating Errors in Software Code
NASA Technical Reports Server (NTRS)
2006-01-01
NASA's Metrics Data Program Data Repository is a database that stores problem, product, and metrics data. The primary goal of this data repository is to provide project data to the software community. In doing so, the Metrics Data Program collects artifacts from a large NASA dataset, generates metrics on the artifacts, and then generates reports that are made available to the public at no cost. The data that are made available to general users have been sanitized and authorized for publication through the Metrics Data Program Web site by officials representing the projects from which the data originated. The data repository is operated by NASA's Independent Verification and Validation (IV&V) Facility, which is located in Fairmont, West Virginia, a high-tech hub for emerging innovation in the Mountain State. The IV&V Facility was founded in 1993, under the NASA Office of Safety and Mission Assurance, as a direct result of recommendations made by the National Research Council and the Report of the Presidential Commission on the Space Shuttle Challenger Accident. Today, under the direction of Goddard Space Flight Center, the IV&V Facility continues its mission to provide the highest achievable levels of safety and cost-effectiveness for mission-critical software. By extending its data to public users, the facility has helped improve the safety, reliability, and quality of complex software systems throughout private industry and other government agencies. Integrated Software Metrics, Inc., is one of the organizations that has benefited from studying the metrics data. As a result, the company has evolved into a leading developer of innovative software-error prediction tools that help organizations deliver better software, on time and on budget.
Enhancing Ocean Research Data Access
NASA Astrophysics Data System (ADS)
Chandler, Cynthia; Groman, Robert; Shepherd, Adam; Allison, Molly; Arko, Robert; Chen, Yu; Fox, Peter; Glover, David; Hitzler, Pascal; Leadbetter, Adam; Narock, Thomas; West, Patrick; Wiebe, Peter
2014-05-01
The Biological and Chemical Oceanography Data Management Office (BCO-DMO) works in partnership with ocean science investigators to publish data from research projects funded by the Biological and Chemical Oceanography Sections and the Office of Polar Programs Antarctic Organisms & Ecosystems Program at the U.S. National Science Foundation. Since 2006, researchers have been contributing data to the BCO-DMO data system, and it has developed into a rich repository of data from ocean, coastal and Great Lakes research programs. While the ultimate goal of the BCO-DMO is to ensure preservation of NSF funded project data and to provide open access to those data, achievement of those goals is attained through a series of related phases that benefits from active collaboration and cooperation with a large community of research scientists as well as curators of data and information at complementary data repositories. The BCO-DMO is just one of many intermediate data management centers created to facilitate long-term preservation of data and improve access to ocean research data. Through partnerships with other data management professionals and active involvement in local and global initiatives, BCO-DMO staff members are working to enhance access to ocean research data available from the online BCO-DMO data system. Continuing efforts in use of controlled vocabulary terms, development of ontology design patterns and publication of content as Linked Open Data are contributing to improved discovery and availability of BCO-DMO curated data and increased interoperability of related content available from distributed repositories. We will demonstrate how Semantic Web technologies (e.g. RDF/XML, SKOS, OWL and SPARQL) have been integrated into BCO-DMO data access and delivery systems to better serve the ocean research community and to contribute to an expanding global knowledge network.
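As a small illustration of the Semantic Web stack named above, the sketch below builds a three-triple RDF graph with rdflib and queries it with SPARQL. The namespace, predicate, and parameter label are invented for illustration and are not BCO-DMO's actual vocabulary:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

EX = Namespace("http://example.org/bcodmo/")   # hypothetical namespace
g = Graph()

dataset = EX["dataset/1234"]
param = EX["parameter/chlorophyll_a"]

g.add((dataset, RDF.type, EX.Dataset))
g.add((dataset, EX.measuresParameter, param))
g.add((param, SKOS.prefLabel, Literal("chlorophyll a", lang="en")))

# Find every dataset that measures a parameter with a given SKOS label.
q = """
PREFIX ex: <http://example.org/bcodmo/>
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?ds WHERE {
  ?ds ex:measuresParameter ?p .
  ?p skos:prefLabel ?label .
  FILTER (str(?label) = "chlorophyll a")
}"""
for row in g.query(q):
    print(row.ds)
```

Controlled vocabulary terms enter exactly where the skos:prefLabel sits: pointing that triple at a shared concept URI instead of a local string is what makes content from distributed repositories line up.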
Semantic framework for mapping object-oriented model to semantic web languages
Ježek, Petr; Mouček, Roman
2015-01-01
The article discusses two main approaches to building semantic structures for electrophysiological metadata: the use of conventional data structures, repositories, and programming languages on the one hand, and the use of formal representations of ontologies known from knowledge representation, such as description logics or semantic web languages, on the other hand. Although knowledge engineering offers languages supporting richer semantic means of expression and technologically advanced approaches, conventional data structures and repositories are still popular among developers, administrators, and users because of their simplicity, overall intelligibility, and lower demands on technical equipment. The choice of conventional data resources and repositories, however, raises the question of how and where to add semantics that cannot be naturally expressed using them. As one possible solution, this semantics can be added into the structures of the programming language that accesses and processes the underlying data. To support this idea we introduced a software prototype that enables its users to add semantically richer expressions into Java object-oriented code. This approach does not burden users with additional demands on the programming environment, since reflective Java annotations were used as an entry point for these expressions. Moreover, the additional semantics need not be written by the programmer directly in the code; it can be collected from non-programmers using a graphical user interface. The mapping that allows the transformation of the semantically enriched Java code into the Semantic Web language OWL was proposed and implemented in a library named the Semantic Framework. This approach was validated by the integration of the Semantic Framework into the EEG/ERP Portal and by the subsequent registration of the EEG/ERP Portal in the Neuroscience Information Framework. PMID:25762923
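The annotation-harvesting idea translates naturally to other reflective languages. Here is a loose Python analogue (not the Java Semantic Framework itself; the decorator, namespace, and class are invented for illustration): a lightweight annotation on an object model is read back by reflection and emitted as OWL declarations with rdflib:

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, RDFS, OWL

EX = Namespace("http://example.org/eegbase#")   # hypothetical namespace

def semantic(owl_class):
    """Decorator playing the role of a reflective annotation."""
    def wrap(cls):
        cls.__owl_class__ = owl_class
        return cls
    return wrap

@semantic("Experiment")
class Experiment:
    scenario: str
    sampling_rate_hz: float

def to_owl(cls):
    """Harvest the annotation and field names via reflection."""
    g = Graph()
    subject = EX[cls.__owl_class__]
    g.add((subject, RDF.type, OWL.Class))
    for attr in cls.__annotations__:          # the class's declared fields
        prop = EX[attr]
        g.add((prop, RDF.type, OWL.DatatypeProperty))
        g.add((prop, RDFS.domain, subject))
    return g

print(to_owl(Experiment).serialize(format="turtle"))
```

As in the paper's approach, the object-oriented code stays the single source of truth; the ontology is derived from it rather than maintained in parallel.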
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, Michael J; Bredehoeft, John D., Dr.
2010-09-03
Inyo County completed the first year of the U.S. Department of Energy Grant Agreement No. DE-RW0000233. This report presents the results of research conducted within this Grant agreement in the context of Inyo County's Yucca Mountain oversight program goals and objectives. The Hydrodynamics Group, LLC prepared this report for the Inyo County Yucca Mountain Repository Assessment Office. The overall goal of Inyo County's Yucca Mountain research program is the evaluation of far-field issues related to potential transport, by ground water, of radionuclides into Inyo County, including Death Valley, and the evaluation of a connection between the Lower Carbonate Aquifer (LCA) and the biosphere. Data collected within the Grant are included in interpretive illustrations and discussions of the results of our analysis. The central elements of this Grant program were the drilling of exploratory wells, geophysical surveys, and geological mapping of the Southern Funeral Mountain Range. The culmination of this research was 1) a numerical ground water model of the Southern Funeral Mountain Range demonstrating the potential of a hydraulic connection between the LCA and the major springs in the Furnace Creek area of Death Valley, and 2) a numerical ground water model of the Amargosa Valley to evaluate the potential for radionuclide transport from Yucca Mountain to Inyo County, California. The report provides a description of research and activities performed by The Hydrodynamics Group, LLC on behalf of Inyo County, and copies of key work products in attachments to this report.
International Collaboration Activities on Engineered Barrier Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jove-Colon, Carlos F.
The Used Fuel Disposition Campaign (UFDC) within the DOE Fuel Cycle Technologies (FCT) program has been engaging in international collaborations between repository R&D programs for high-level waste (HLW) disposal to leverage gathered knowledge and laboratory/field data on near- and far-field processes from experiments at underground research laboratories (URLs). Heater test experiments at URLs provide a unique opportunity to mimetically study the thermal effects of heat-generating nuclear waste in subsurface repository environments. Various configurations of these experiments have been carried out at various URLs according to the disposal design concepts of the host country's repository program. The FEBEX (Full-scale Engineered Barrier Experiment in Crystalline Host Rock) project is a large-scale heater test experiment originated by the Spanish radioactive waste management agency (Empresa Nacional de Residuos Radiactivos S.A. – ENRESA) at the Grimsel Test Site (GTS) URL in Switzerland. The project was subsequently managed by CIEMAT. FEBEX-DP is a concerted effort of various international partners working on the evaluation of sensor data and the characterization of samples obtained during the course of this field test and its subsequent dismantling. The main purpose of these field-scale experiments is to evaluate the feasibility of creating an engineered barrier system (EBS) with a horizontal configuration according to the Spanish concept of deep geological disposal of high-level radioactive waste in crystalline rock. Another key aspect of this project is to improve the knowledge of coupled processes, such as thermal-hydro-mechanical (THM) and thermal-hydro-chemical (THC) processes, operating in the near-field environment. The focus of these efforts is on model development and validation of predictions through model implementation in computational tools to simulate coupled THM and THC processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voizard, Patrice; Mayer, Stefan; Ouzounian, Gerald
Over the past 15 years, the French program on deep geologic disposal of high-level and long-lived radioactive waste has benefited from a clear legal framework as the result of the December 30, 1991 French Waste Act. To fulfil its obligations stipulated in this law, ANDRA has submitted the 'Dossier 2005 Argile' (clay) and 'Dossier 2005 Granite' to the French Government. The first of those reports presents a concept for the underground disposal of nuclear waste at a specific clay site and focuses on a feasibility study. Knowledge of the host rock characteristics is based on the investigations carried out at the Meuse/Haute Marne Underground Research Laboratory. The repository concept addresses various issues, the most important of which relate to the large amount of waste, the clay host rock, and the reversibility requirement. This phase ended upon review and evaluation of the 'Dossier 2005' by different organisations, including the National Review Board, the National Safety Authority and the NEA International Review Team. By passing the new Planning Act of June 28, 2006 on the sustainable management of radioactive materials and waste, the French parliament further defined a clear legal framework for future work. This Planning Act sets a schedule and defines the objectives for the next phase of repository design in requesting the submission of a construction authorization application by 2015. The law calls for the repository program to be in a position to commission disposal installations by 2025. (authors)
10 CFR 63.143 - Implementation.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Implementation. 63.143 Section 63.143 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Quality Assurance § 63.143 Implementation. DOE shall implement a quality assurance program...
ScrubChem: Building Bioactivity Datasets from PubChem Bioassay Data (SOT)
The PubChem Bioassay database is a non-curated public repository with data from 64 sources, including: ChEMBL, BindingDb, DrugBank, EPA Tox21, NIH Molecular Libraries Screening Program, and various other academic, government, and industrial contributors. Methods for extracting th...
10 CFR 63.132 - Confirmation of geotechnical and design parameters.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Commission. (e) In situ monitoring of the thermomechanical response of the underground facility must be... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Performance Confirmation Program § 63.132... engineered systems and components, must be identified in the performance confirmation plan. (d) These...
10 CFR 63.132 - Confirmation of geotechnical and design parameters.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Commission. (e) In situ monitoring of the thermomechanical response of the underground facility must be... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Performance Confirmation Program § 63.132... engineered systems and components, must be identified in the performance confirmation plan. (d) These...
10 CFR 63.132 - Confirmation of geotechnical and design parameters.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Commission. (e) In situ monitoring of the thermomechanical response of the underground facility must be... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Performance Confirmation Program § 63.132... engineered systems and components, must be identified in the performance confirmation plan. (d) These...
10 CFR 60.141 - Confirmation of geotechnical and design parameters.
Code of Federal Regulations, 2012 CFR
2012-01-01
... reported to the Commission. (e) In situ monitoring of the thermomechanical response of the underground... IN GEOLOGIC REPOSITORIES Performance Confirmation Program § 60.141 Confirmation of geotechnical and... needed in design to accommodate actual field conditions encountered. (b) Subsurface conditions shall be...
10 CFR 60.141 - Confirmation of geotechnical and design parameters.
Code of Federal Regulations, 2014 CFR
2014-01-01
... reported to the Commission. (e) In situ monitoring of the thermomechanical response of the underground... IN GEOLOGIC REPOSITORIES Performance Confirmation Program § 60.141 Confirmation of geotechnical and... needed in design to accommodate actual field conditions encountered. (b) Subsurface conditions shall be...
10 CFR 60.141 - Confirmation of geotechnical and design parameters.
Code of Federal Regulations, 2013 CFR
2013-01-01
... reported to the Commission. (e) In situ monitoring of the thermomechanical response of the underground... IN GEOLOGIC REPOSITORIES Performance Confirmation Program § 60.141 Confirmation of geotechnical and... needed in design to accommodate actual field conditions encountered. (b) Subsurface conditions shall be...
10 CFR 63.132 - Confirmation of geotechnical and design parameters.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Commission. (e) In situ monitoring of the thermomechanical response of the underground facility must be... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Performance Confirmation Program § 63.132... engineered systems and components, must be identified in the performance confirmation plan. (d) These...
10 CFR 63.132 - Confirmation of geotechnical and design parameters.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Commission. (e) In situ monitoring of the thermomechanical response of the underground facility must be... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Performance Confirmation Program § 63.132... engineered systems and components, must be identified in the performance confirmation plan. (d) These...
An overview of platforms for cloud based development.
Fylaktopoulos, G; Goumas, G; Skolarikis, M; Sotiropoulos, A; Maglogiannis, I
2016-01-01
This paper provides an overview of state-of-the-art technologies for software development in cloud environments. The surveyed systems cover the whole spectrum of cloud-based development, including integrated programming environments, code repositories, software modeling, composition and documentation tools, and application management and orchestration. In this work we evaluate the existing cloud development ecosystem based on a wide range of characteristics, such as applicability (e.g., programming and database technologies supported), productivity enhancement (e.g., editor capabilities, debugging tools), support for collaboration (e.g., repository functionality, version control), and post-development application hosting, and we compare the surveyed systems. The survey shows that software engineering in the cloud era has taken its initial steps, showing potential to provide concrete implementation and execution environments for cloud-based applications. However, a number of important challenges need to be addressed for this approach to be viable. These challenges are discussed in the article, and we conclude that although several steps have been made, a compact and reliable solution does not yet exist.
U.S. Virgin Islands Petroleum Price-Spike Preparation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, C.
2012-06-01
This NREL technical report details a plan for the U.S. Virgin Islands (USVI) to minimize the economic damage caused by major petroleum price increases. The assumptions for this plan are that the USVI will have very little time and money to implement it and that the population will be highly motivated to follow it because of high fuel prices. The plan's success, therefore, is highly dependent on behavior change. This plan was derived largely from a review of the actions taken and behavior changes made by companies and commuters throughout the United States in response to the oil price spike of 2008. Many of these solutions were coordinated by or reported through the 88 local representatives of the U.S. Department of Energy's Clean Cities program. The National Renewable Energy Laboratory provides technical and communications support for the Clean Cities program and therefore serves as a de facto repository of these solutions. This plan is the first publication that has tapped this repository.
Office of Science and Technology&International Year EndReport - 2005
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bodvarsson, G.S.
2005-10-27
Source Term, Materials Performance, Radionuclide Getters, Natural Barriers, and Advanced Technologies: a brief introduction in each section describes the overall organization and goals of each program area. All of these areas have great potential for improving our understanding of the safety performance of the proposed Yucca Mountain repository, as processes within these areas are generally very conservatively represented in the Total System Performance Assessment. In addition, some of the technology thrust areas in particular may enhance system efficiency and reduce risk to workers. Thus, a rather modest effort in the S&T Program could lead to large savings in the lifetime repository total cost and significantly enhanced understanding of the behavior of the proposed Yucca Mountain repository, without safety being compromised, and in some instances being enhanced. An overall strength of the S&T Program is the significant amount of integration that has already been achieved after two years of research. As an example (illustrated in Figure 1), our understanding of the behavior of the total waste isolation system has been enhanced through integration of the Source Term, Materials Performance, and Natural Barriers Thrust areas. All three thrust areas contribute to the integration of different processes in the in-drift environment. These processes include seepage into the drift, dust accumulation on the waste package, brine formation and precipitation on the waste package, mass transfer through the fuel cladding, changes in the seepage-water chemical composition, and transport of released radionuclides through the invert and natural barriers. During FY2005, each of our program areas assembled a team of external experts to conduct an independent review of their respective projects, research directions, and emphasis. In addition, the S&T Program as a whole was independently reviewed by the S&T Programmatic Evaluation Panel. As a result of these reviews, adjustments to the S&T Program will be implemented in FY2006 to ensure that the Program is properly aligned with OCRWM's priorities. Also during FY2005, several programmatic documents were published, including the Science and Technology Program Strategic Plan, the Science and Technology Program Management Plan, and the Science and Technology Program Plan. These and other communication products are available on the OCRWM web site under the Science and Technology section (http://www.ocrwm.doe.gov/osti/index.shtml).
Program for computer aided reliability estimation
NASA Technical Reports Server (NTRS)
Mathur, F. P. (Inventor)
1972-01-01
A computer program for estimating the reliability of self-repair and fault-tolerant systems with respect to selected system and mission parameters is presented. The computer program is capable of operation in an interactive conversational mode as well as in a batch mode and is characterized by maintenance of several general equations representative of basic redundancy schemes in an equation repository. Selected reliability functions applicable to any mathematical model formulated with the general equations, used singly or in combination with each other, are separately stored. One or more system and/or mission parameters may be designated as a variable. Data in the form of values for selected reliability functions is generated in a tabular or graphic format for each formulated model.
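The equation-repository idea is straightforward to sketch. The model below keeps two textbook redundancy formulas (for identical units with constant failure rate) in a small dictionary and tabulates the chosen reliability function while mission time varies; the function names and equation set here are assumptions for illustration, not the original program's:

```python
import math

def series(n, lam, t):
    """n identical units, all required: R(t) = exp(-n*lam*t)."""
    return math.exp(-n * lam * t)

def parallel(n, lam, t):
    """n identical units, at least one required:
    R(t) = 1 - (1 - exp(-lam*t))**n."""
    return 1.0 - (1.0 - math.exp(-lam * t)) ** n

EQUATION_REPOSITORY = {"series": series, "parallel": parallel}

def tabulate(model, n, lam, times):
    """Tabular output with mission time designated as the variable."""
    eq = EQUATION_REPOSITORY[model]
    print(f"{'t (hr)':>8}  R(t)")
    for t in times:
        print(f"{t:8.0f}  {eq(n, lam, t):.6f}")

# e.g., 3 redundant units, failure rate 1e-4 per hour, 10,000-hour mission
tabulate("parallel", 3, 1e-4, [0, 2500, 5000, 7500, 10000])
```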
Variable thickness transient ground-water flow model. Volume 3. Program listings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reisenauer, A.E.
1979-12-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This is the third of three volumes of the description of the VTT (Variable Thickness Transient) Groundwater Hydrologic Model - second level (intermediate complexity) two-dimensional saturated groundwater flow.
Oceanotron, Scalable Server for Marine Observations
NASA Astrophysics Data System (ADS)
Loubrieu, T.; Bregent, S.; Blower, J. D.; Griffiths, G.
2013-12-01
Ifremer, the French marine institute, is deeply involved in data management for different ocean in-situ observation programs (ARGO, OceanSites, GOSUD, ...) and other European programs aiming at networking ocean in-situ observation data repositories (myOcean, seaDataNet, Emodnet). To capitalize on the effort of implementing advanced data dissemination services (visualization, download with subsetting) for these programs and, more generally, for water-column observation repositories, Ifremer decided in 2010 to develop the oceanotron server. Given the diversity of data repository formats (RDBMS, netCDF, ODV, ...) and the temperamental nature of the standard interoperability interface profiles (OGC/WMS, OGC/WFS, OGC/SOS, OpeNDAP, ...), the server is designed to manage plugins: StorageUnits, which read specific data repository formats (netCDF/OceanSites, RDBMS schema, ODV binary format), and FrontDesks, which receive external requests and send results for interoperable protocols (OGC/WMS, OGC/SOS, OpenDAP). In between, a third type of plugin may be inserted: TransformationUnits, which perform ocean-business-related transformations of the features (for example, conversion of vertical coordinates from pressure in decibars to meters below the sea surface). The server is released under an open-source license so that partners can develop their own plugins. Within the myOcean project, the University of Reading has plugged in a WMS implementation as an oceanotron FrontDesk. The modules are connected together by sharing the same information model for marine observations (or sampling features: vertical profiles, point series, and trajectories), dataset metadata, and queries. The shared information model is based on the OGC Observations & Measurements and Unidata Common Data Model initiatives. The model is implemented in Java (http://www.ifremer.fr/isi/oceanotron/javadoc/). This inner interoperability level makes it possible to capitalize on ocean business expertise in software development without being tied to specific data formats or protocols. Oceanotron is deployed at seven European data centres for marine in-situ observations within myOcean. While additional extensions are still being developed, to promote new collaborative initiatives, work is now underway on continuous and distributed integration (Jenkins, Maven), shared reference documentation (on Alfresco), and code and release dissemination (SourceForge, GitHub).
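The plugin layering is the architecturally interesting part, and it can be shown in miniature. The class and method names below follow the abstract's terminology (StorageUnit, TransformationUnit, FrontDesk), but the interfaces are invented for illustration; the server itself is implemented in Java, not Python:

```python
class StorageUnit:
    """Reads features from one repository format."""
    def read_profiles(self):
        raise NotImplementedError

class NetCDFStorage(StorageUnit):
    def read_profiles(self):
        # stand-in for parsing netCDF/OceanSites files
        return [{"pressure_dbar": 1500.0, "temp_C": 3.2}]

class TransformationUnit:
    """Optional, ocean-business-related feature transformation."""
    def apply(self, feature):
        return feature

class PressureToDepth(TransformationUnit):
    def apply(self, feature):
        # to first order, pressure in decibars ~ depth in metres;
        # a real unit would use a seawater equation of state
        out = dict(feature)
        out["depth_m"] = out.pop("pressure_dbar")
        return out

class FrontDesk:
    """Accepts external requests and serves results over some protocol."""
    def __init__(self, storage, transforms):
        self.storage, self.transforms = storage, transforms

    def handle_request(self):
        feats = self.storage.read_profiles()
        for t in self.transforms:
            feats = [t.apply(f) for f in feats]
        return feats   # a real FrontDesk would encode this as WMS/SOS/OPeNDAP

server = FrontDesk(NetCDFStorage(), [PressureToDepth()])
print(server.handle_request())
```

Because all three plugin types exchange the same feature model, supporting a new repository format or dissemination protocol means touching exactly one plugin.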
Statistical sensitivity analysis of a simple nuclear waste repository model
NASA Astrophysics Data System (ADS)
Ronen, Y.; Lucius, J. L.; Blow, E. M.
1980-06-01
This work is a preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria, and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening design methodology is developed for later use in the overall program. The theory was applied to the test case of determining the relative importance of the sensitivity of the near-field temperature distribution in a single-level salt repository to modeling parameters. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct methods of recalculation. The sensitivity coefficients found to be important for the sample problem were the thermal loading, the distance between the spent fuel canisters, and the canister radius. Other important parameters were those related to salt properties at a point of interest in the repository.
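The "direct recalculation" of a sensitivity coefficient is just a normalized finite difference: perturb one parameter, rerun the model, and compare. The sketch below uses an invented stand-in for the near-field temperature model (the study's actual model and parameter values are not reproduced here):

```python
def peak_temperature(thermal_load, canister_spacing, canister_radius):
    # illustrative monotonic response, for demonstration only
    return 300.0 + 2.0 * thermal_load / (canister_spacing * canister_radius)

def sensitivity(f, params, name, rel_step=1e-3):
    """Relative sensitivity S = (dT/T) / (dp/p) for parameter `name`,
    obtained by rerunning the model with that parameter bumped."""
    base = f(**params)
    bumped = dict(params)
    bumped[name] *= (1.0 + rel_step)
    return ((f(**bumped) - base) / base) / rel_step

params = {"thermal_load": 50.0, "canister_spacing": 3.0, "canister_radius": 0.2}
for name in params:
    print(name, round(sensitivity(peak_temperature, params, name), 3))
```

A screening design does the same thing more economically, varying several parameters per run and ranking them before any expensive exact recalculation.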
The Revised WIPP Passive Institutional Controls Program - A Conceptual Plan - 13145
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patterson, Russ; Klein, Thomas; Van Luik, Abraham
2013-07-01
The Department of Energy/Carlsbad Field Office (DOE/CBFO) is responsible for managing all activities related to the disposal of TRU and TRU-mixed waste in the geologic repository, 650 m below the land surface, at WIPP, near Carlsbad, New Mexico. The main function of the Passive Institutional Controls (PICs) program is to inform future generations of the long-lived radioactive wastes buried beneath their feet in the desert. For the first 100 years after cessation of disposal operations, once the rooms are closed and the shafts leading underground are sealed, WIPP is mandated by law to institute Active Institutional Controls (AICs) with fences, gates, and armed guards on patrol. At this same time a plan must be in place for how to warn and inform the future, after the AICs are gone, of the consequences of intrusion into the geologic repository disposal area. A plan was put into place during the 1990s covering records management and storage, awareness triggers, permanent marker design concepts, and testing schedules. This work included the thoughts of expert panels and individuals. The plan held up under peer review and met the requirements of the U.S. Environmental Protection Agency (EPA). Today the NEA is coordinating a study called the 'Preservation of Records, Knowledge and Memory (RK&M) Across Generations' to provide the international nuclear waste repository community with a guide on how nuclear record archive programs should be approached and developed. CBFO is cooperating and participating in this project and will apply the knowledge gained to the WIPP program. At the same time, CBFO is well aware that the EPA and others expect DOE to move forward with planning for the future WIPP PICs program, so a plan will be in place in time for WIPP's closure, slated for the early 2030s. The DOE/CBFO WIPP PICs program in place today meets the regulatory criteria, but complete feasibility of implementation is questionable, and it may not be in conformance with the international guidance being developed. International guidance currently under development suggests that the inter-generational equity principle strives to warn the future without unduly burdening present generations. Building markers and monuments that are out of proportion to the risk being presented to the future is not in keeping with generational equity. With this in mind, the DOE/CBFO is developing conceptual plans for re-evaluating and revising the current WIPP PICs program. These conceptual plans will identify scientific and technical work that must be completed to develop a 'new' PICs program that takes the best ideas of the present plan, blends them with new ideas from the RK&M project, and takes proposed alternative permanent marker designs and materials into consideration. (authors)
75 FR 71133 - National Institute of Mental Health; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-22
... Emphasis Panel; Competitive Revision for Stem Cell Repository Relevant to Mental Disorders. Date: December... Domestic Assistance Program Nos. 93.242, Mental Health Research Grants; 93.281, Scientist Development Award, Scientist Development Award for Clinicians, and Research Scientist Award; 93.282, Mental Health National...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-05
... (DHS), Science and Technology, Protected Repository for the Defense of Infrastructure Against Cyber Threats (PREDICT) Program AGENCY: Science and Technology Directorate, DHS. ACTION: 30-Day notice and request for comment. SUMMARY: The Department of Homeland Security (DHS), Science & Technology (S&T...
10 CFR 2.1003 - Availability of material.
Code of Federal Regulations, 2011 CFR
2011-01-01
... months in advance of submitting its license application for a geologic repository, the NRC shall make... of privilege in § 2.1006, graphic-oriented documentary material that includes raw data, computer runs, computer programs and codes, field notes, laboratory notes, maps, diagrams and photographs, which have been...
10 CFR 2.1003 - Availability of material.
Code of Federal Regulations, 2012 CFR
2012-01-01
... months in advance of submitting its license application for a geologic repository, the NRC shall make... of privilege in § 2.1006, graphic-oriented documentary material that includes raw data, computer runs, computer programs and codes, field notes, laboratory notes, maps, diagrams and photographs, which have been...
10 CFR 60.142 - Design testing.
Code of Federal Regulations, 2010 CFR
2010-01-01
... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2010-01-01 2010-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.142 - Design testing.
Code of Federal Regulations, 2013 CFR
2013-01-01
... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2013-01-01 2013-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.142 - Design testing.
Code of Federal Regulations, 2012 CFR
2012-01-01
... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2012-01-01 2012-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.142 - Design testing.
Code of Federal Regulations, 2014 CFR
2014-01-01
... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2014-01-01 2014-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.142 - Design testing.
Code of Federal Regulations, 2011 CFR
2011-01-01
... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2011-01-01 2011-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faybishenko, Boris; Birkholzer, Jens; Persoff, Peter
2016-09-01
The goal of the Fifth Worldwide Review is to document evolution in the state of the art of approaches for nuclear waste disposal in geological formations since the Fourth Worldwide Review, which was released in 2006. The ten years since the previous Worldwide Review have seen major developments in a number of nations throughout the world pursuing geological disposal programs, both in preparing and in reviewing safety cases for the operational and long-term safety of proposed and operating repositories. The countries that are approaching implementation of geological disposal will increasingly focus on the feasibility of safely constructing and operating their repositories in the short and long terms on the basis of existing regulations. The WWR-5 will also address a number of specific technical issues in safety case development, along with the interplay among stakeholder concerns, technical feasibility, engineering design issues, and operational and post-closure safety. Preparation and publication of the Fifth Worldwide Review on nuclear waste disposal facilitates assessing the lessons learned and developing future cooperation between the countries. The Report provides scientific and technical experience on preparing for and developing scientific and technical bases for nuclear waste disposal in deep geologic repositories in terms of requirements, societal expectations, and the adequacy of cases for long-term repository safety. The chapters include potential issues that may arise as repository programs mature, and identify techniques that demonstrate the safety cases and aid in promoting and gaining societal confidence. The report will also be used to exchange experience with other fields of industry and technology in which concepts similar to the design and safety cases are applied, as well as to facilitate public perception and understanding of the safety of the disposal approaches relative to risks that may increase over long time frames in the absence of a successful implementation of final dispositioning.
New Features of the re3data Registry of Research Data Repositories
NASA Astrophysics Data System (ADS)
Elger, K.; Pampel, H.; Vierkant, P.; Witt, M.
2016-12-01
re3data is a registry of research data repositories that lists over 1,600 repositories from around the world, making it the largest and most comprehensive online catalog of data repositories on the web. The registry offers researchers, funding agencies, libraries, and publishers a comprehensive overview of the heterogeneous landscape of data repositories. The repositories are described following the 'Metadata Schema for the Description of Research Data Repositories'. re3data summarises the properties of a repository in a user-friendly icon system, helping users to easily identify an adequate repository for the storage of their datasets. The re3data entries are curated by an international, multi-disciplinary editorial board. An application programming interface (API) enables other information systems to list and fetch metadata for integration and interoperability. Funders like the European Commission (2015) and publishers like Springer Nature (2016) recommend the use of re3data.org in their policies. The original re3data project partners are the GFZ German Research Centre for Geosciences, the Humboldt-Universität zu Berlin, the Purdue University Libraries, and the Karlsruhe Institute of Technology (KIT). Since 2015, re3data has been operated as a service of DataCite, a global non-profit organisation that provides persistent identifiers (DOIs) for research data. At the 2016 AGU Fall Meeting we will describe the current status of re3data. An overview of the major developments and new features will be given. Furthermore, we will present our plans to increase the quality of the re3data entries.
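Fetching registry entries through the API looks roughly like the sketch below. The endpoint path and XML element names follow the public re3data v1 API as we understand it; treat them as assumptions and consult the current API documentation before relying on them:

```python
import requests
import xml.etree.ElementTree as ET

# List endpoint of the re3data API (assumed v1 path; verify against the docs)
resp = requests.get("https://www.re3data.org/api/v1/repositories", timeout=30)
resp.raise_for_status()

root = ET.fromstring(resp.content)
for repo in root.findall("repository")[:5]:       # first five entries
    print(repo.findtext("id"), repo.findtext("name"))
```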
Increasing access to program information: a strategy for improving adolescent health.
Brindis, Claire D; Hair, Elizabeth C; Cochran, Stephanie; Cleveland, Kevin; Valderrama, L Teresa; Park, M Jane
2007-01-01
To identify existing programs serving 11- to 15-year-olds that aim to improve adolescent health in the areas of Health & Well-being, Fitness, Family & Peer Relationships, School Environment, Smoking, Alcohol Use, and Violence, and to assess the utility of readily available resources in providing detailed program information. In Phase 1, publicly available program databases were searched to identify potential programs serving the target population. In Phase 2, an in-depth search of a limited sample of programs meeting the content and age criteria was performed to identify program descriptors. Over 1,000 program names were identified in Phase 1. Information regarding programs is becoming more readily available through the internet; however, the program information that was publicly available only begins to paint the picture. Phase 2 revealed that a broad array of efforts is underway in all seven content areas, but found information on the program descriptors to be limited. Investment in programming is not enough; an upfront investment in communication and information sharing is critical in order to maximize the resources dedicated to the improvement of adolescent health. A well-publicized, centralized program repository offered in conjunction with technical assistance would provide an efficient mechanism for this information sharing. We further suggest that the inherent gap between research and practice can be lessened by building a new body of practice knowledge. This would require improved data collection by programs, the incorporation of program participation information in national surveys, and enhanced evaluation efforts.
A perspective on the proliferation risks of plutonium mines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyman, E.S.
1996-05-01
The program of geologic disposal of spent fuel and other plutonium-containing materials is increasingly becoming the target of criticism by individuals who argue that, in the future, repositories may become low-cost sources of fissile material for nuclear weapons. This paper attempts to outline a consistent framework for analyzing the proliferation risks of these so-called "plutonium mines" and putting them into perspective. First, it is emphasized that the attractiveness of plutonium in a repository as a source of weapons material depends on its accessibility relative to other sources of fissile material. Then, the notion of a "material production standard" (MPS) is proposed: namely, that the proliferation risks posed by geologic disposal will be acceptable if one can demonstrate, under a number of reasonable scenarios, that the recovery of plutonium from a repository is likely to be as difficult as new production of fissile material. A preliminary analysis suggests that the range of circumstances under which current mined repository concepts would fail to meet this standard is fairly narrow. Nevertheless, a broad application of the MPS may impose severe restrictions on repository design. In this context, the relationship of repository design parameters to ease of recovery is discussed.
Simms, Andrew M; Toofanny, Rudesh D; Kehl, Catherine; Benson, Noah C; Daggett, Valerie
2008-06-01
Dynameomics is a project to investigate and catalog the native-state dynamics and thermal unfolding pathways of representatives of all protein folds using solvated molecular dynamics simulations, as described in the preceding paper. Here we introduce the design of the molecular dynamics data warehouse, a scalable, reliable repository for simulation data that vastly simplifies management and access. In the succeeding paper, we describe the development of a complementary multidimensional database. A single protein unfolding or native-state simulation can take weeks to months to complete and produces gigabytes of coordinate and analysis data. Mining information from over 3000 completed simulations is complicated and time-consuming. Even the simplest queries involve writing intricate programs that must be built from low-level file system access primitives and include significant logic to correctly locate and parse data of interest. As a result, programs to answer questions that require data from hundreds of simulations are very difficult to write. Thus, organization of and access to simulation data have been major obstacles to the discovery of new knowledge in the Dynameomics project. This repository is used internally and is the foundation of the Dynameomics portal site http://www.dynameomics.org. By organizing simulation data into a scalable, manageable and accessible form, we can begin to address substantial questions that move us closer to solving biomedical and bioengineering problems.
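The payoff of the warehouse is that a cross-simulation question becomes one declarative query instead of a custom file-crawling program. A toy illustration follows (the schema and rows are invented, and the real warehouse runs on a full RDBMS rather than SQLite):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE simulation (
    sim_id INTEGER PRIMARY KEY,
    protein TEXT,
    kind TEXT,              -- 'native' or 'unfolding'
    temperature_K REAL,
    length_ns REAL)""")
con.executemany(
    "INSERT INTO simulation VALUES (?, ?, ?, ?, ?)",
    [(1, "engrailed homeodomain", "unfolding", 498.0, 31.0),
     (2, "engrailed homeodomain", "native", 298.0, 51.0),
     (3, "ubiquitin", "unfolding", 498.0, 31.0)])

# "Which proteins have both a native run and an unfolding run?"
rows = con.execute("""
    SELECT protein FROM simulation
    GROUP BY protein
    HAVING SUM(kind = 'native') > 0 AND SUM(kind = 'unfolding') > 0
""").fetchall()
print(rows)   # [('engrailed homeodomain',)]
```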
USDA-ARS?s Scientific Manuscript database
The USDA-ARS Tropical Agriculture Research Station is the only research entity within the National Plant Germplasm system in the insular Caribbean region. It houses germplasm collections of cultivated tropical/subtropical germplasm of bananas/plantains, cacao, mamey sapote, sapodilla, Spanish lime,...
USAF Hearing Conservation Program, DOEHRS Data Repository Annual Report: CY2014
2016-02-01
tinnitus. The goal was to align the DOEHRS-HC DR data with DoD Hearing Conservation and Readiness Working Group initiatives and Government Accountability Office recommendations [3]. The data collected from the standardized tinnitus questions are projected to be mined by the DoD in future studies
At the Creation: Chaos, Control, and Automation--Commercial Software Development for Archives.
ERIC Educational Resources Information Center
Drr, W. Theodore
1988-01-01
An approach to the design of flexible text-based management systems for archives includes tiers for repository, software, and user management systems. Each tier has four layers--objective, program, result, and interface. Traps awaiting software development companies involve the market, competition, operations, and finance. (10 references) (MES)
Jean C. Zenklusen, M.S., Ph.D., Discusses the NCI Genomics Data Commons at AACR 2014 - TCGA
At the AACR 2014 meeting, Dr. Jean C. Zenklusen, Director of The Cancer Genome Atlas Program Office, highlights the Genomics Data Commons, a harmonized data repository that will allow simultaneous access and analysis of NCI genomics data, including The Cancer Genome Atlas.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-21
... in 1985, ViCAP serves as the national repository for violent crimes; specifically: Homicides and attempted homicides, especially those that (a) involve an abduction, (b) are apparently random, motiveless... homicide. Comprehensive case information submitted to ViCAP is maintained in the ViCAP Web National Crime...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-24
... 1985, ViCAP serves as the national repository for violent crimes; specifically: Homicides and attempted homicides, especially those that (a) involve an abduction, (b) are apparently random, motiveless, or... missing. Unidentified human remains, where the manner of death is known or suspected to be homicide...
System Description and Status Report: California Education Information System.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento.
The California Education Information System (CEIS) consists of two subsystems of computer programs designed to process business and pupil data for local school districts. Creating and maintaining records concerning the students in the schools, the pupil subsystem provides for a central repository of school district identification information and a…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-23
... (DHS), Science and Technology, Protected Repository for the Defense of Infrastructure Against Cyber... the Defense of Infrastructure against Cyber Threats (PREDICT) program, and is a revision of a... operational data for use in cyber security research and development through the establishment of distributed...
Levich, R.A.; Linden, R.M.; Patterson, R.L.; Stuckless, J.S.
2000-01-01
Yucca Mountain, located ~100 mi northwest of Las Vegas, Nevada, has been designated by Congress as a site to be characterized for a potential mined geologic repository for high-level radioactive waste. This field trip will examine the regional geologic and hydrologic setting for Yucca Mountain, as well as specific results of the site characterization program. The first day focuses on the regional setting, with emphasis on current and paleohydrology, which are both of critical concern for predicting future performance of a potential repository. Morning stops will be in southern Nevada and afternoon stops will be in Death Valley. The second day will be spent at Yucca Mountain. The field trip will visit the underground testing sites in the "Exploratory Studies Facility" and the "Busted Butte Unsaturated Zone Transport Field Test" plus several surface-based testing sites. Much of the work at the site has concentrated on studies of the unsaturated zone, an element of the hydrologic system that historically has received little attention. Discussions during the second day will comprise selected topics of Yucca Mountain geology, hydrology, and geochemistry and will include the probabilistic volcanic hazard analysis and the seismicity and seismic hazard in the Yucca Mountain area. Evening discussions will address modeling of regional groundwater flow, the results of recent hydrologic studies by the Nye County Nuclear Waste Program Office, and the relationship of the geology and hydrology of Yucca Mountain to the performance of a potential repository. Day 3 will examine the geologic framework and hydrology of the Pahute Mesa-Oasis Valley Groundwater Basin and then will continue to Reno via Hawthorne, Nevada and the Walker Lake area.
Developments of AMS at the TANDAR accelerator
NASA Astrophysics Data System (ADS)
Fernández Niello, J. O.; Abriola, D.; Alvarez, D. E.; Capurro, O. A.; di Tada, M.; Etchegoyen, A.; Ferrero, A. M. J.; Martí, G. V.; Pacheco, A. J.; Testoni, J. E.; Korschinek, G.
1996-08-01
Man-made long-lived radioisotopes have been produced as a result of different nuclear technologies. The study of accidental spillages and the determination of radioisotope concentrations in nuclear waste prior to final storage in a repository are subjects of great interest in connection with this activity. The accelerator mass spectrometry (AMS) technique is a powerful tool to measure long-lived isotopes at abundance ratios as low as 10^-12 to 10^-15 in small samples. Applications to the Argentine nuclear program like those mentioned above, as well as applications to archaeology, hydrology and biomedical research, are considered in an AMS program using the TANDAR 20 UD electrostatic accelerator at Buenos Aires. In this work we present the status of the program and a description of the facility.
Ryals, G.N.
1980-01-01
The National Waste Terminal Storage Program is an effort by the U.S. Department of Energy to locate and develop sites for disposal or storage of commercially produced radioactive wastes. As part of this program, salt domes in the northern Louisiana salt-dome basin are being studied to determine their suitability as repositories. Part of the U.S. Geological Survey 's participation in the program has been to describe the regional geohydrology of the northern Louisiana salt-dome basin. A map based on a compilation of published data and the interpretation of electrical logs shows the altitude of the base of freshwater in aquifers in the northern Louisiana salt-dome basin. (USGS)
National Aeronautics and Space Administration Biological Specimen Repository
NASA Technical Reports Server (NTRS)
McMonigal, Kathleen A.; Pietrzyk, Robert a.; Johnson, Mary Anne
2008-01-01
The National Aeronautics and Space Administration Biological Specimen Repository (Repository) is a storage bank that is used to maintain biological specimens over extended periods of time and under well-controlled conditions. Samples from the International Space Station (ISS), including blood and urine, will be collected, processed and archived during the preflight, inflight and postflight phases of ISS missions. This investigation has been developed to archive biosamples for use as a resource for future space flight related research. The International Space Station (ISS) provides a platform to investigate the effects of microgravity on human physiology prior to lunar and exploration class missions. The storage of crewmember samples from many different ISS flights in a single repository will be a valuable resource with which researchers can study space flight related changes and investigate physiological markers. The development of the National Aeronautics and Space Administration Biological Specimen Repository will allow for the collection, processing, storage, maintenance, and ethical distribution of biosamples to meet goals of scientific and programmatic relevance to the space program. Archiving of the biosamples will provide future research opportunities including investigating patterns of physiological changes, analysis of components unknown at this time or analyses performed by new methodologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tietze-Jaensch, Holger; Schneider, Stephan; Aksyutina, Yuliya
2012-07-01
The German product quality control is inter alia responsible for control of two radioactive waste forms of heat generating waste: a) homogeneous vitrified HLW and b) heterogeneous compacted hulls, end-pieces and technological metallic waste. In either case, significantly different metrology is employed at the site of the conditioning plant for the obligatory nuclide inventory declaration. To facilitate an independent evaluation and checking of the accompanying documentation numerical simulations are carried out. The physical and chemical properties of radioactive waste residues are used to assess the data consistency and uncertainty margins, as well as to predict the long-term behavior of themore » radioactive waste. This is relevant for repository acceptance and safety considerations. Our new numerical approach follows a bottom-up simulation starting from the burn-up behavior of the fuel elements in the reactor core. The output of these burn-up calculations is then coupled with a program that simulates the material separation in the subsequent dissolution and extraction processes normalized to the mass balance. Follow-up simulations of the separated reprocessing lines of a) the vitrification of highly-active liquid and b) the compaction of residual intermediate-active metallic hulls remaining after fuel pellets dissolution, end-pieces and technological waste, allows calculating expectation values for the various repository relevant properties of either waste stream. The principles of the German product quality control of radioactive waste residues from the spent fuel reprocessing have been introduced and explained. Namely, heat generating homogeneous vitrified HLW and heterogeneous compacted metallic MLW have been discussed. The advantages of a complementary numerical property simulation have been made clear and examples of benefits are presented. We have compiled a new program suite to calculate the physical and radio-chemical properties of common nuclear waste residues. The immediate benefit is the independent assessment of radio-active inventory declarations and much facilitated product quality control of waste residues that need to be returned to Germany and submitted to a German HLW-repository requirements. Wherever possible, internationally accepted standard programs are used and embedded. The innovative coupling of burn-up calculations (SCALE) with neutron and gamma transport codes (MCPN-X) allows an application in the world of virtual waste properties. If-then-else scenarios of hypothetical waste material compositions and distributions provide valuable information of long term nuclide property propagation under repository conditions over a very long time span. Benchmarking the program with real residue data demonstrates the power and remarkable accuracy of this numerical approach, boosting the reliability of the confidence aforementioned numerous applications, namely the proof tool set for on-the-spot production quality checking and data evaluation and independent verification. Moreover, using the numerical bottom-up approach helps to avoid the accumulation of fake activities that may gradually build up in a repository from the so-called conservative or penalizing nuclide inventory declarations. The radioactive waste properties and the hydrolytic and chemical stability can be predicted. The interaction with invasive chemicals can be assessed and propagation scenarios can be developed from reliable and sound data and HLW properties. 
Hence, the appropriate design of a future HLW repository can be based upon predictable and quality assured waste characteristics. (authors)
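The coupling just described - burn-up output feeding a separation step normalized to the mass balance - can be pictured with a small glue-code sketch. Everything below (the file-free inventory, the element partition fractions) is an invented stand-in, not the authors' program suite or real reprocessing data:

```python
# Invented stand-in for the coupling step: take a burn-up nuclide inventory
# (as a SCALE-like calculation might export it) and split it into the two
# reprocessing streams by element partition fractions, checking that the
# two streams together honor the overall mass balance.

# Hypothetical burn-up output: nuclide -> grams per tonne of heavy metal.
burnup_inventory = {"Cs-137": 1210.0, "Sr-90": 520.0, "Zr-93": 710.0, "U-238": 941000.0}

# Hypothetical partition fractions: share of each element routed to the
# vitrified HLW stream; the remainder follows the compacted metallic stream.
to_glass = {"Cs": 1.0, "Sr": 1.0, "Zr": 0.02, "U": 0.004}

def split_streams(inventory, glass_fraction):
    glass, metal = {}, {}
    for nuclide, grams in inventory.items():
        element = nuclide.split("-")[0]
        f = glass_fraction.get(element, 0.0)
        glass[nuclide] = grams * f
        metal[nuclide] = grams * (1.0 - f)
    return glass, metal

glass, metal = split_streams(burnup_inventory, to_glass)
total = sum(burnup_inventory.values())
# Mass balance: nothing is created or lost in the separation step.
assert abs(sum(glass.values()) + sum(metal.values()) - total) < 1e-6
print(f"HLW glass stream: {sum(glass.values()):.1f} g/tHM of {total:.1f} g/tHM")
```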
Olney, Richard S.; Ailes, Elizabeth C.; Sontag, Marci K.
2015-01-01
In 2011, statewide newborn screening programs for critical congenital heart defects began in the United States, and subsequently screening has been implemented widely. In this review, we focus on data reports and collection efforts related to both prenatal diagnosis and newborn screening. Defect-specific, maternal, and geographic factors are associated with variations in prenatal detection, so newborn screening provides a population-wide safety net for early diagnosis. A new web-based repository is collecting information on newborn screening program policies, quality indicators related to screening programs, and specific case-level data on infants with these defects. Birth defects surveillance programs also collect data about critical congenital heart defects, particularly related to diagnostic timing, mortality, and services. Individuals from state programs, federal agencies, and national organizations will be interested in these data to further refine algorithms for screening in normal newborn nurseries, neonatal intensive care settings, and other special populations; and ultimately to evaluate the impact of screening on outcomes. PMID:25979782
Olney, Richard S; Ailes, Elizabeth C; Sontag, Marci K
2015-04-01
In 2011, statewide newborn screening programs for critical congenital heart defects began in the United States, and subsequently screening has been implemented widely. In this review, we focus on data reports and collection efforts related to both prenatal diagnosis and newborn screening. Defect-specific, maternal, and geographic factors are associated with variations in prenatal detection, so newborn screening provides a population-wide safety net for early diagnosis. A new web-based repository is collecting information on newborn screening program policies, quality indicators related to screening programs, and specific case-level data on infants with these defects. Birth defects surveillance programs also collect data about critical congenital heart defects, particularly related to diagnostic timing, mortality, and services. Individuals from state programs, federal agencies, and national organizations will be interested in these data to further refine algorithms for screening in normal newborn nurseries, neonatal intensive care settings, and other special populations; and ultimately to evaluate the impact of screening on outcomes. Published by Elsevier Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheldon, S.R.; Muller, E.
Open disclosure and public understanding of major issues surrounding the Yucca Mountain Project is a consistent goal for Clark County, Nevada, which represents nearly 80 percent of Nevada's total population. Recent enhancements to the County's communication methods employ emerging technology as well as traditional public relations tactics. The County's communication methods engage the public through highly visual displays, exhibits, informative and entertaining video programs, school presentations, creative print inserts, public interaction and news media. The program provides information based on the county's research studies and findings on property values, the environment, tourism, public health and safety, increased costs for emergency services and the potential disproportionate effects to Native American tribes and other minority populations in the area. Multi-cultural Dialogue: Nevada, particularly southern Nevada and the Las Vegas area, has experienced explosive growth in the last decade. The fastest growing demographic groups in Nevada are Hispanics (nearly 23% in Las Vegas) and Asians (approx. 8%). Clark County's Nuclear Waste Multi-cultural Program is designed to reach residents from these emerging segments of the population. Educational video programs: While officially opposed to the project, Clark County is committed to providing Nevada residents with accurate, timely and objective information about Yucca Mountain and its potential impacts to the state. Since the actual operation of the repository, if approved by the Nuclear Regulatory Commission, is about a decade away, the program includes presentations for middle and high school students on age-appropriate topics. Work with indigenous tribes: American Indian tribes in Southern Nevada participated in an unprecedented video program presenting the unique views and perspectives of the American Indian tribes directly impacted by the proposed repository. Monitoring program: To track economic, fiscal and social changes over time, the monitoring program comprises indicators in several core areas, including environmental, economic, community well-being, fiscal, developmental, and public health and safety indicators. Its purpose is to highlight and monitor the most meaningful indicators of performance and perception in key service areas. The monitoring program is promoted within the public outreach program to make Nevada residents aware of this important resource of information. Internet activities: Interactive quizzes, informational postings, electronic newsletters and podcasts draw a demographic that prefers getting information from computer sources. Lively, interesting and ethnically diverse podcast episodes provide audio shows that can be downloaded to MP3 players or to a standard computer. (authors)
Lebo, Matthew S; Zakoor, Kathleen-Rose; Chun, Kathy; Speevak, Marsha D; Waye, John S; McCready, Elizabeth; Parboosingh, Jillian S; Lamont, Ryan E; Feilotter, Harriet; Bosdet, Ian; Tucker, Tracy; Young, Sean; Karsan, Aly; Charames, George S; Agatep, Ronald; Spriggs, Elizabeth L; Chisholm, Caitlin; Vasli, Nasim; Daoud, Hussein; Jarinova, Olga; Tomaszewski, Robert; Hume, Stacey; Taylor, Sherryl; Akbari, Mohammad R; Lerner-Ellis, Jordan
2018-03-01
Purpose: The purpose of this study was to develop a national program for Canadian diagnostic laboratories to compare DNA-variant interpretations and resolve discordant-variant classifications, using the BRCA1 and BRCA2 genes as a case study. Methods: BRCA1 and BRCA2 variant data were uploaded and shared through the Canadian Open Genetics Repository (COGR; http://www.opengenetics.ca). A total of 5,554 variant observations were submitted; classification differences were identified and comparison reports were sent to participating laboratories. Each site had the opportunity to reclassify variants. The data were analyzed before and after the comparison report process to track concordant- or discordant-variant classifications by three different models. Results: Variant-discordance rates varied by classification model: 38.9% of variants were discordant when using a five-tier model, 26.7% with a three-tier model, and 5.0% with a two-tier model. After the comparison report process, the proportion of discordant variants dropped to 30.7% with the five-tier model, to 14.2% with the three-tier model, and to 0.9% using the two-tier model. Conclusion: We present a Canadian interinstitutional quality improvement program for DNA-variant interpretations. Sharing of variant knowledge by clinical diagnostic laboratories will allow clinicians and patients to make more informed decisions and lead to better patient outcomes.
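For intuition, the before/after discordance rates amount to a simple computation over shared classifications, and the tier collapse explains why discordance falls monotonically. The toy sketch below uses standard ACMG-style five-tier labels, but the collapse into three and two tiers is an assumed mapping and the submissions are invented, not the paper's data:

```python
# Toy recomputation of discordance under different tier models.
FIVE_TO_THREE = {"benign": "benign", "likely benign": "benign",
                 "VUS": "VUS",
                 "likely pathogenic": "pathogenic", "pathogenic": "pathogenic"}
THREE_TO_TWO = {"benign": "not reportable", "VUS": "not reportable",
                "pathogenic": "reportable"}

# variant -> classifications submitted by different laboratories (invented)
submissions = {
    "BRCA1 c.1A>G":    ["pathogenic", "likely pathogenic"],
    "BRCA2 c.68-7T>A": ["VUS", "likely benign", "benign"],
}

def discordance(subs, mapping=None):
    # A variant is discordant if, after optional tier collapse, the labs
    # still disagree (more than one distinct classification survives).
    mapped = {v: {mapping[c] if mapping else c for c in calls}
              for v, calls in subs.items()}
    return sum(1 for calls in mapped.values() if len(calls) > 1) / len(mapped)

print(discordance(submissions))                 # five-tier: 1.0
print(discordance(submissions, FIVE_TO_THREE))  # three-tier: 0.5
two = {v: [THREE_TO_TWO[FIVE_TO_THREE[c]] for c in calls]
       for v, calls in submissions.items()}
print(discordance(two))                         # two-tier: 0.0
```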
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, S.K.; Cole, C.R.; Bond, F.W.
1979-12-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This document describes FE3DGW (Finite Element, Three-Dimensional Groundwater), the third-level (high-complexity) hydrologic model: a three-dimensional finite-element approach (Galerkin formulation) for saturated groundwater flow.
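For context, the governing equation such a code solves, stated here in standard groundwater-hydrology notation rather than quoted from the FE3DGW documentation, is the transient saturated-flow equation:

```latex
% Transient saturated groundwater flow; S_s: specific storage, h: hydraulic
% head, K: hydraulic conductivity tensor, q: source/sink term.
\[
  S_s \, \frac{\partial h}{\partial t}
  \;=\; \nabla \cdot \left( \mathbf{K} \, \nabla h \right) + q
\]
% A Galerkin finite-element discretization expands h(x,t) over basis
% functions, h \approx \sum_j h_j(t) N_j(x), and requires the residual to be
% orthogonal to every N_i, reducing the PDE to the linear ODE system
%   M \dot{h} + A h = f .
```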
Annual Historical Summary, Defense Documentation Center, 1 July 1968 to 30 June 1969.
ERIC Educational Resources Information Center
Defense Documentation Center, Alexandria, VA.
This summary describes the more significant activities and achievements of the Defense Documentation Center (DDC) including: DDC and the scientific and technical community. The DDC role in the Department of Defense Scientific and Technical Information Program continued to shift from the traditional concept of an archival repository and a…
10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.
Code of Federal Regulations, 2012 CFR
2012-01-01
... retrieval throughout the period during which wastes are being emplaced and, thereafter, until the completion of a performance confirmation program and Commission review of the information obtained from such a... retrievability. (3) For purposes of this paragraph, a reasonable schedule for retrieval is one that would permit...
10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.
Code of Federal Regulations, 2013 CFR
2013-01-01
... retrieval throughout the period during which wastes are being emplaced and, thereafter, until the completion of a performance confirmation program and Commission review of the information obtained from such a... retrievability. (3) For purposes of this paragraph, a reasonable schedule for retrieval is one that would permit...
10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.
Code of Federal Regulations, 2014 CFR
2014-01-01
... retrieval throughout the period during which wastes are being emplaced and, thereafter, until the completion of a performance confirmation program and Commission review of the information obtained from such a... retrievability. (3) For purposes of this paragraph, a reasonable schedule for retrieval is one that would permit...
10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.
Code of Federal Regulations, 2011 CFR
2011-01-01
... retrieval throughout the period during which wastes are being emplaced and, thereafter, until the completion of a performance confirmation program and Commission review of the information obtained from such a... retrievability. (3) For purposes of this paragraph, a reasonable schedule for retrieval is one that would permit...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-27
... Criminal History Information Systems. The Department of Justice (DOJ), Office of Justice Programs, Bureau... collection for which approval has expired. (2) Title of the Form/Collection: Survey of State Criminal History... history records and on the increasing number of operations and services provided by state repositories. (5...
ERIC Educational Resources Information Center
Corlett, Bradly
2014-01-01
Several recent issues and trends in online education have resulted in consolidation of efforts for Massive Open Online Courses (MOOCs), increased Open Educational Resources (OER) in the form of asynchronous course repositories, with noticeable increases in governance and policy amplification. These emerging enrollment trends in alternative online…
Pretest characterization of WIPP experimental waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, J.; Davis, H.; Drez, P.E.
The Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, is an underground repository designed for the storage and disposal of transuranic (TRU) wastes from US Department of Energy (DOE) facilities across the country. The Performance Assessment (PA) studies for WIPP address compliance of the repository with applicable regulations, and include full-scale experiments to be performed at the WIPP site. These experiments are the bin-scale and alcove tests to be conducted by Sandia National Laboratories (SNL). Prior to conducting these experiments, the waste to be used in these tests needs to be characterized to provide data on the initial conditions for these experiments. This characterization is referred to as the Pretest Characterization of WIPP Experimental Waste, and is also expected to provide input to other programmatic efforts related to waste characterization. The purpose of this paper is to describe the pretest waste characterization activities currently in progress for the WIPP bin-scale waste, and to discuss the program plan and specific analytical protocols being developed for this characterization. The relationship between different programs and documents related to waste characterization efforts is also highlighted in this paper.
Wynn, J.C.; Roseboom, E.H.
1987-01-01
Evaluation of potential high-level nuclear waste repository sites is an area where geophysical capabilities and limitations may significantly impact a major governmental program. Since there is concern that extensive exploratory drilling might degrade most potential disposal sites, geophysical methods become crucial as the only nondestructive means to examine large volumes of rock in three dimensions. Characterization of potential sites requires geophysicists to alter their usual mode of thinking: no longer are anomalies being sought, as in mineral exploration, but rather their absence. Thus the size of features that might go undetected by a particular method takes on new significance. Legal and regulatory considerations that stem from this different outlook, most notably the requirements of quality assurance (necessary for any data used in support of a repository license application), are forcing changes in the manner in which geophysicists collect and document their data. -Authors
An Optimal Centralized Carbon Dioxide Repository for Florida, USA
Poiencot, Brandon; Brown, Christopher
2011-01-01
For over a decade, the United States Department of Energy, and engineers, geologists, and scientists from all over the world have investigated the potential for reducing atmospheric carbon emissions through carbon sequestration. Numerous reports exist analyzing the potential for sequestering carbon dioxide at various sites around the globe, but none have identified the potential for a statewide system in Florida, USA. In 2005, 83% of Florida’s electrical energy was produced by natural gas, coal, or oil (e.g., fossil fuels), from power plants spread across the state. In addition, only limited research has been completed on evaluating optimal pipeline transportation networks to centralized carbon dioxide repositories. This paper describes the feasibility and preliminary locations for an optimal centralized Florida-wide carbon sequestration repository. Linear programming optimization modeling is used to plan and route an idealized pipeline network to existing Florida power plants. Further analysis of the subsurface geology in these general locations will provide insight into the suitability of the subsurface conditions and the available capacity for carbon sequestration at selected possible repository sites. The identification of the most favorable site(s) is also presented. PMID:21695024
An optimal centralized carbon dioxide repository for Florida, USA.
Poiencot, Brandon; Brown, Christopher
2011-04-01
For over a decade, the United States Department of Energy, and engineers, geologists, and scientists from all over the world have investigated the potential for reducing atmospheric carbon emissions through carbon sequestration. Numerous reports exist analyzing the potential for sequestering carbon dioxide at various sites around the globe, but none have identified the potential for a statewide system in Florida, USA. In 2005, 83% of Florida's electrical energy was produced by natural gas, coal, or oil (e.g., fossil fuels), from power plants spread across the state. In addition, only limited research has been completed on evaluating optimal pipeline transportation networks to centralized carbon dioxide repositories. This paper describes the feasibility and preliminary locations for an optimal centralized Florida-wide carbon sequestration repository. Linear programming optimization modeling is used to plan and route an idealized pipeline network to existing Florida power plants. Further analysis of the subsurface geology in these general locations will provide insight into the suitability of the subsurface conditions and the available capacity for carbon sequestration at selected possible repository sites. The identification of the most favorable site(s) is also presented.
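As a toy illustration of the linear-programming formulation described above, a minimum-cost assignment of plant CO2 output to candidate repository sites can be posed directly to SciPy's linprog. The plants, sites, capacities, and costs below are invented, not the authors' data:

```python
# Minimum-cost transport LP: route CO2 from power plants to candidate
# repository sites subject to supply and capacity constraints.
import numpy as np
from scipy.optimize import linprog

supply = np.array([5.0, 3.0])      # CO2 output of Plant A, Plant B (invented)
capacity = np.array([6.0, 4.0])    # capacity of Site 1, Site 2 (invented)
cost = np.array([[2.0, 5.0],       # relative cost per unit shipped,
                 [4.0, 1.0]])      # plant i -> site j (invented)

n_plants, n_sites = cost.shape
c = cost.ravel()                   # decision variables x[i, j] >= 0, row-major

# Each plant must ship all of its CO2 (equalities, one row per plant).
A_eq = np.kron(np.eye(n_plants), np.ones(n_sites))
# Each site can absorb at most its capacity (inequalities, one row per site).
A_ub = np.kron(np.ones(n_plants), np.eye(n_sites))

res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=supply,
              method="highs")
print(res.x.reshape(n_plants, n_sites))  # optimal flows
print(res.fun)                           # total transport cost (13.0 here)
```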
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-09-01
Radioactive waste is mounting at U.S. nuclear power plants at a rate of more than 2,000 metric tons a year. Pursuant to statute and anticipating that a geologic repository would be available in 1998, the Department of Energy (DOE) entered into disposal contracts with nuclear utilities. Now, however, DOE does not expect the repository to be ready before 2010. For this reason, DOE now wants to develop a facility for monitored retrievable storage (MRS) by 1998. Concerned about how best to store the waste until a repository is available, congressional requesters asked GAO to review the alternatives of continued storage at utilities' reactor sites or transferring waste to an MRS facility. GAO assessed the likelihood of an MRS facility operating by 1998, the legal implications if DOE is not able to take delivery of wastes in 1998, the propriety of using the Nuclear Waste Fund (from which DOE's waste program costs are paid) to pay utilities for on-site storage capacity added after 1998, the ability of utilities to store their waste on-site until a repository is operating, and the relative costs and safety of the two storage alternatives.
Bellman’s GAP—a language and compiler for dynamic programming in sequence analysis
Sauthoff, Georg; Möhl, Mathias; Janssen, Stefan; Giegerich, Robert
2013-01-01
Motivation: Dynamic programming is ubiquitous in bioinformatics. Developing and implementing non-trivial dynamic programming algorithms is often error prone and tedious. Bellman’s GAP is a new programming system, designed to ease the development of bioinformatics tools based on the dynamic programming technique. Results: In Bellman’s GAP, dynamic programming algorithms are described in a declarative style by tree grammars, evaluation algebras and products formed thereof. This bypasses the design of explicit dynamic programming recurrences and yields programs that are free of subscript errors, modular and easy to modify. The declarative modules are compiled into C++ code that is competitive with carefully hand-crafted implementations. This article introduces the Bellman’s GAP system and its language, GAP-L. It then demonstrates the ease of development and the degree of re-use by creating variants of two common bioinformatics algorithms. Finally, it evaluates Bellman’s GAP as an implementation platform of ‘real-world’ bioinformatics tools. Availability: Bellman’s GAP is available under GPL license from http://bibiserv.cebitec.uni-bielefeld.de/bellmansgap. This Web site includes a repository of re-usable modules for RNA folding based on thermodynamics. Contact: robert@techfak.uni-bielefeld.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23355290
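The grammar/algebra separation can be illustrated outside GAP-L. The plain-Python sketch below (not GAP-L syntax, and using the classic ambiguous-expression example rather than either of the paper's algorithms) defines one recursion over all parenthesizations of an expression and swaps in two evaluation algebras, so the same search space answers two different questions:

```python
# One grammar-like recursion, two pluggable evaluation algebras.
from functools import lru_cache

TOKENS = list("1+2*3+4")  # any alternating digits/+/* string works

# An algebra supplies one function per rule plus a choice function applied
# to the candidate values of a subproblem.
count_algebra = {"digit": lambda d: 1,
                 "plus":  lambda a, b: a * b,   # readings multiply
                 "times": lambda a, b: a * b,
                 "choice": sum}                 # total number of readings
max_algebra   = {"digit": lambda d: int(d),
                 "plus":  lambda a, b: a + b,
                 "times": lambda a, b: a * b,
                 "choice": max}                 # best achievable value

def evaluate(algebra):
    @lru_cache(maxsize=None)
    def go(i, j):  # all parses of TOKENS[i:j]
        if j - i == 1:
            return algebra["choice"]([algebra["digit"](TOKENS[i])])
        candidates = [algebra["plus" if TOKENS[k] == "+" else "times"](
                          go(i, k), go(k + 1, j))
                      for k in range(i + 1, j - 1) if TOKENS[k] in "+*"]
        return algebra["choice"](candidates)
    return go(0, len(TOKENS))

print(evaluate(count_algebra))  # 5  (Catalan(3) parenthesizations)
print(evaluate(max_algebra))    # 21 ((1+2)*(3+4))
```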
NASA Astrophysics Data System (ADS)
Burgasser, Adam
The NASA Infrared Telescope Facility's (IRTF) SpeX spectrograph has been an essential tool in the discovery and characterization of ultracool dwarf (UCD) stars, brown dwarfs and exoplanets. Over ten years of SpeX data have been collected on these sources, and a repository of low-resolution (R~100) SpeX prism spectra has been maintained by the PI at the SpeX Prism Spectral Libraries website since 2008. As the largest existing collection of NIR UCD spectra, this repository has facilitated a broad range of investigations in UCD, exoplanet, Galactic and extragalactic science, contributing to over 100 publications in the past 6 years. However, this repository remains highly incomplete, has not been uniformly calibrated, lacks sufficient contextual data for observations and sources, and most importantly provides no data visualization or analysis tools for the user. To fully realize the scientific potential of these data for community research, we propose a two-year program to (1) calibrate and expand existing repository and archival data, and make it virtual-observatory compliant; (2) serve the data through a searchable web archive with basic visualization tools; and (3) develop and distribute an open-source, Python-based analysis toolkit for users to analyze the data. These resources will be generated through an innovative, student-centered research model, with undergraduate and graduate students building and validating the analysis tools through carefully designed coding challenges and research validation activities. The resulting data archive, the SpeX Prism Library, will be a legacy resource for IRTF and SpeX, and will facilitate numerous investigations using current and future NASA capabilities. These include deep/wide surveys of UCDs to measure Galactic structure and chemical evolution, and probe UCD populations in satellite galaxies (e.g., JWST, WFIRST); characterization of directly imaged exoplanet spectra (e.g., FINESSE); and development of low-temperature theoretical models of UCD and exoplanet atmospheres. Our program will also serve to validate the IRTF data archive during its development, by reducing and disseminating non-proprietary archival observations of UCDs to the community. The proposed program directly addresses NASA's strategic goals of exploring the origin and evolution of stars and planets that make up our universe, and discovering and studying planets around other stars.
Development of DKB ETL module in case of data conversion
NASA Astrophysics Data System (ADS)
Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.
2018-05-01
Modern scientific experiments involve the production of huge volumes of data that require new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a substantial amount of additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy nuclear physics (HENP) is a complex issue that requires looking for solutions outside the box. One of the tasks is to integrate metadata from different repositories into some kind of central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. The Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, aims to provide fast and easy access to significant information about LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of particular pipelines arranging data flow from data sources to the main DKB storage. The data transformation process represented by a single pipeline can be considered as a number of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of the DKB.
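The pipeline-of-modules structure can be sketched generically. The Python below is an invented illustration of the aggregate/transform/load step pattern, not DKB code; module names and record fields are placeholders:

```python
# Each processing step is an independent module (a plain function here);
# a pipeline is just an ordered composition arranging the flow of records
# from a source toward the main storage.
from functools import reduce
from typing import Callable, Iterable

Record = dict
Step = Callable[[Iterable[Record]], Iterable[Record]]

def aggregate(records):   # gather raw metadata from a source
    for r in records:
        yield dict(r)

def transform(records):   # map source fields onto the common data model
    for r in records:
        yield {"dataset": r.get("name"), "source": r.get("repo")}

def load(records):        # hand standardized records to the storage layer
    return list(records)

def pipeline(*steps: Step) -> Step:
    return lambda records: reduce(lambda acc, step: step(acc), steps, records)

etl = pipeline(aggregate, transform, load)
print(etl([{"name": "mc16_13TeV.evgen", "repo": "AMI"}]))
```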
Copper Corrosion in Nuclear Waste Disposal: A Swedish Case Study on Stakeholder Insight
ERIC Educational Resources Information Center
Andersson, Kjell
2013-01-01
The article describes the founding principles, work program, and accomplishments of a Reference Group with both expert and layperson stakeholders for the corrosion of copper canisters in a proposed deep repository in Sweden for spent nuclear fuel. The article sets the Reference Group as a participatory effort within a broader context of…
Poster Puzzler Solution: Chill Out | Poster
A winner has emerged in the most recent Poster Puzzler contest! Congratulations are in order for Rose Bradley, secretary III, Cancer Research Technology Program. The current Poster Puzzler image shows the refrigerant condensers for the two-story freezers in the Building 1073 repository, which are used to store samples at -20°C. Put simply, the condensers act like the outdoor…
ERIC Educational Resources Information Center
Anderson, Talea
2015-01-01
In 2013-2014, Brooks Library at Central Washington University (CWU) launched library content in three systems: a digital asset-management system, an institutional repository (IR), and a web-based discovery layer. In early 2014, the archives at the library began to use these systems to disseminate media recently digitized from legacy formats. As…
Library and Archival Security: Policies and Procedures To Protect Holdings from Theft and Damage.
ERIC Educational Resources Information Center
Trinkaus-Randall, Gregor
1998-01-01
Firm policies and procedures that address the environment, patron/staff behavior, general attitude, and care and handling of materials need to be at the core of the library/archival security program. Discussion includes evaluating a repository's security needs, collections security, security in non-public areas, security in the reading room,…
USDA-ARS?s Scientific Manuscript database
USDA-ARS SHRS is part of the USDA National Germplasm Repository system and houses collections of tropical and subtropical fruit trees such as mango, lychee, and avocado. In addition to maintaining the germplasm collections, our mission is to also identify genetic diversity in the collections, to ev...
Phase Stability Determinations of DWPF Waste Glasses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marra, S.L.
1999-10-22
Liquid high-level nuclear waste will be immobilized at the Savannah River Site (SRS) by vitrification in borosilicate glass. To determine the phase stability of these glasses, samples were heat treated for various times and at various temperatures. The results will provide guidance to the repository program about conditions to be avoided during shipping, handling, and storage of DWPF canistered waste forms.
The PubChem Bioassay database is a non-curated public repository with bioactivity data from 64 sources, including: ChEMBL, BindingDb, DrugBank, Tox21, NIH Molecular Libraries Screening Program, and various academic, government, and industrial contributors. However, this data is d...
NASA Astrophysics Data System (ADS)
Himmelberger, Jeffery J.; Baughman, Mike; Ogneva-Himmelberger, Yelena A.
1995-11-01
Whether the proposed Yucca Mountain nuclear waste repository system will adversely impact tourism in southern Nevada is an open question of particular importance to visitor-oriented rural counties bisected by planned waste transportation corridors (highway or rail). As part of one such county's repository impact assessment program, the tourism implications of Three Mile Island (TMI) and other major hazard events have been revisited to inform ongoing county-wide socioeconomic assessments and contingency planning efforts. This paper summarizes the key implications of this research as applied to Lincoln County, Nevada. Implications for other rural counties are discussed in light of the research findings.
A Discussion of Issues in Integrity Constraint Monitoring
NASA Technical Reports Server (NTRS)
Fernandez, Francisco G.; Gates, Ann Q.; Cooke, Daniel E.
1998-01-01
In the development of large-scale software systems, analysts, designers, and programmers identify properties of data objects in the system. The ability to check those assertions during runtime is desirable as a means of verifying the integrity of the program. Typically, programmers ensure the satisfaction of such properties through some form of manually embedded assertion check. The disadvantage of this approach is that these assertions become entangled within the program code. The goal of the research is to develop an integrity constraint monitoring mechanism whereby software system properties (called integrity constraints) kept in a repository are automatically inserted into the program by the mechanism to check for incorrect program behaviors. Such a mechanism would overcome many of the deficiencies of manually embedded assertion checks. This paper gives an overview of the preliminary work performed toward this goal. The manual instrumentation of constraint checking on a series of test programs is discussed; this review is then used as the basis for a discussion of issues to be considered in developing an automated integrity constraint monitor.
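To make the contrast with hand-embedded assertions concrete, here is a minimal Python sketch (invented names, not the paper's mechanism) in which constraints live in a central registry and are woven into functions at runtime rather than written inline:

```python
# Constraints live in one registry instead of being scattered through code.
import functools

# Maps a function name to the properties its result must satisfy; the
# "average" constraint below is an illustrative example.
CONSTRAINTS = {
    "average": [lambda args, result: min(args[0]) <= result <= max(args[0])],
}

def monitored(fn):
    """Wrap fn so every registered integrity constraint is checked at runtime."""
    checks = CONSTRAINTS.get(fn.__name__, [])
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        for check in checks:
            if not check(args, result):
                raise AssertionError(f"integrity constraint violated in {fn.__name__}")
        return result
    return wrapper

@monitored
def average(xs):
    return sum(xs) / len(xs)

print(average([1.0, 2.0, 3.0]))  # passes: 2.0 lies within [min, max]
```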
Antarctica and global change research
NASA Astrophysics Data System (ADS)
Weller, Gunter; Lange, Manfred
1992-03-01
The Antarctic, including the continent and Southern Ocean with the subantarctic islands, is a critical area in the global change studies under the International Geosphere-Biosphere Program (IGBP) and the World Climate Research Program (WCRP). Major scientific problems include the impacts of climate warming, the ozone hole, and sea level changes. Large-scale interactions between the atmosphere, ice, ocean, and biota in the Antarctic affect the entire global system through feedbacks, biogeochemical cycles, deep-ocean circulation, atmospheric transport of heat, moisture, and pollutants, and changes in ice mass balances. Antarctica is also a rich repository of paleoenvironmental information in its ice sheet and its ocean and land sediments.
The IDL astronomy user's library
NASA Technical Reports Server (NTRS)
Landsman, W. B.
1992-01-01
IDL (Interactive Data Language) is a commercial programming, plotting, and image display language, which is widely used in astronomy. The IDL Astronomy User's Library is a central repository of over 400 astronomy-related IDL procedures accessible via anonymous FTP. The author will overview the use of IDL within the astronomical community and discuss recent enhancements to the IDL astronomy library. These enhancements include a fairly complete I/O package for FITS images and tables, an image deconvolution package, an image mosaic package, and access to the IDL OpenWindows/Motif widgets interface. The IDL Astronomy Library is funded by NASA through the Astrophysics Software and Research Aids Program.
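The library's procedures are written in IDL, but the flavor of its FITS I/O (READFITS/WRITEFITS-style image and header access) can be sketched with a Python analogue using astropy; the file name and header keyword below are placeholders, and the calls are analogues, not the IDL API:

```python
# Write a small FITS image with a header card, then read it back.
from astropy.io import fits
import numpy as np

data = np.zeros((64, 64), dtype=np.float32)
hdu = fits.PrimaryHDU(data)
hdu.header["OBJECT"] = "demo field"   # example keyword, not from the paper
hdu.writeto("demo.fits", overwrite=True)

with fits.open("demo.fits") as hdul:
    hdul.info()                       # HDU summary, similar to IDL's FITS_INFO
    image = hdul[0].data              # pixel array, similar to READFITS
    print(hdul[0].header["OBJECT"], image.shape)
```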
Importance of Data Management in a Long-term Biological Monitoring Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christensen, Sigurd W; Brandt, Craig C; McCracken, Kitty
2011-01-01
The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of the ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program. An existing relational database was adapted and extended to handle biological data. Data modeling enabled the program's database to process, store, and retrieve its data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was establishing standards for sampling site names, taxonomic identification, flagging, and other components. There are limitations: some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.
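The reference-table design mentioned above can be sketched with invented table and column names: controlled vocabularies for site names and taxa live in their own tables, and the main data table may only reference them, which is what enforces the naming standards over decades of sampling:

```python
# Reference tables hold the controlled vocabularies; the sample table can
# only cite them via foreign keys.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")
con.executescript("""
CREATE TABLE site  (site_id  INTEGER PRIMARY KEY, site_name TEXT UNIQUE NOT NULL);
CREATE TABLE taxon (taxon_id INTEGER PRIMARY KEY, sci_name  TEXT UNIQUE NOT NULL);
CREATE TABLE sample (
    sample_id INTEGER PRIMARY KEY,
    site_id   INTEGER NOT NULL REFERENCES site(site_id),
    taxon_id  INTEGER NOT NULL REFERENCES taxon(taxon_id),
    collected TEXT NOT NULL,      -- ISO date
    n_indiv   INTEGER,
    flag      TEXT                -- data-quality flag
);
""")
con.execute("INSERT INTO site(site_name) VALUES ('Upper East Fork')")
con.execute("INSERT INTO taxon(sci_name) VALUES ('Salmo trutta')")
con.execute("INSERT INTO sample(site_id, taxon_id, collected, n_indiv) "
            "VALUES (1, 1, '1995-06-01', 12)")
print(con.execute("SELECT site_name, sci_name, n_indiv FROM sample "
                  "JOIN site USING(site_id) JOIN taxon USING(taxon_id)").fetchall())
```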
Rocky Flats Environmental Technology Site Ecological Monitoring Program 1995 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-05-31
The Ecological Monitoring Program (EcMP) was established at the Rocky Flats Environmental Technology Site (Site) in September 1992. At that time, EcMP staff developed a Program Plan that was peer-reviewed by scientists from western universities before submittal to DOE RFFO in January 1993. The intent of the program is to measure several quantitative variables at different ecological scales in order to characterize the Rocky Flats ecosystem. This information is necessary to document ecological conditions at the Site in impacted and nonimpacted areas and to determine whether Site practices have had ecological impacts, either positive or negative. This information can be used by managers interested in future use scenarios and CERCLA activities. Others interested in impact analysis may also find the information useful. In addition, these measurements are entered into a database that will serve as a long-term information repository documenting long-term trends and potential future changes to the Site, both natural and anthropogenic.
If I Had a Hammer (and Several Million Dollars): The Saga of the AIHEC Cultural Learning Centers.
ERIC Educational Resources Information Center
Edinger, Anne; Ambler, Marjane
2002-01-01
Presents an interview with Gail Bruce and Anne Edinger, who, in the early 1990s, conceived the idea of building cultural centers on 30 tribal college campuses. States that they imagined the centers would simply serve as repositories for Indian artifacts; however, after years of fund-raising efforts and program obstacles, the buildings transformed…
State Assessment Program Item Banks: Model Language for Request for Proposals (RFP) and Contracts
ERIC Educational Resources Information Center
Swanson, Leonard C.
2010-01-01
This document provides recommendations for request for proposal (RFP) and contract language that state education agencies can use to specify their requirements for access to test item banks. An item bank is a repository for test items and data about those items. Item banks are used by state agency staff to view items and associated data; to…
Challenges in Developing XML-Based Learning Repositories
NASA Astrophysics Data System (ADS)
Auksztol, Jerzy; Przechlewski, Tomasz
There is no doubt that modular design has many advantages, including the most important ones: reusability and cost-effectiveness. In e-learning community parlance, the modules are termed Learning Objects (LOs) [11]. An increasing number of learning objects have been created and published online, several standards have been established, and multiple repositories have been developed for them. For example, Cisco Systems, Inc., "recognizes a need to move from creating and delivering large inflexible training courses, to database-driven objects that can be reused, searched, and modified independent of their delivery media" [6]. The learning object paradigm of educational resource authoring is promoted mainly to reduce the cost of content development and to increase its quality. A frequently used metaphor compares Learning Objects to Lego blocks or to objects in object-oriented program design [25]. However, a metaphor is only an abstract idea, which must be turned into something more concrete to be usable. The problem is that many papers on LOs end up solely in metaphors. In our opinion, the Lego and OO metaphors are gross oversimplifications of the problem, as it is much easier to assemble a Lego set or design objects in an OO program than to develop truly interoperable, context-free learning content.
Can shale safely host US nuclear waste?
Neuzil, C.E.
2013-01-01
"Even as cleanup efforts after Japan’s Fukushima disaster offer a stark reminder of the spent nuclear fuel (SNF) stored at nuclear plants worldwide, the decision in 2009 to scrap Yucca Mountain as a permanent disposal site has dimmed hope for a repository for SNF and other high-level nuclear waste (HLW) in the United States anytime soon. About 70,000 metric tons of SNF are now in pool or dry cask storage at 75 sites across the United States [Government Accountability Office, 2012], and uncertainty about its fate is hobbling future development of nuclear power, increasing costs for utilities, and creating a liability for American taxpayers [Blue Ribbon Commission on America’s Nuclear Future, 2012].However, abandoning Yucca Mountain could also result in broadening geologic options for hosting America’s nuclear waste. Shales and other argillaceous formations (mudrocks, clays, and similar clay-rich media) have been absent from the U.S. repository program. In contrast, France, Switzerland, and Belgium are now planning repositories in argillaceous formations after extensive research in underground laboratories on the safety and feasibility of such an approach [Blue Ribbon Commission on America’s Nuclear Future, 2012; Nationale Genossenschaft für die Lagerung radioaktiver Abfälle (NAGRA), 2010; Organisme national des déchets radioactifs et des matières fissiles enrichies, 2011]. Other nations, notably Japan, Canada, and the United Kingdom, are studying argillaceous formations or may consider them in their siting programs [Japan Atomic Energy Agency, 2012; Nuclear Waste Management Organization (NWMO), (2011a); Powell et al., 2010]."
Object linking in repositories
NASA Technical Reports Server (NTRS)
Eichmann, David (Editor); Beck, Jon; Atkins, John; Bailey, Bill
1992-01-01
This topic is covered in three sections. The first section explores some of the architectural ramifications of extending the Eichmann/Atkins lattice-based classification scheme to encompass the assets of the full life cycle of software development. A model is considered that provides explicit links between objects in addition to the edges connecting classification vertices in the standard lattice. The second section gives a description of the efforts to implement the repository architecture using a commercially available object-oriented database management system. Some of the features of this implementation are described, and some of the next steps to be taken to produce a working prototype of the repository are pointed out. In the final section, it is argued that design and instantiation of reusable components have competing criteria (design-for-reuse strives for generality, design-with-reuse strives for specificity) and that providing mechanisms for each can be complementary rather than antagonistic. In particular, it is demonstrated how program slicing techniques can be applied to customization of reusable components.
Tsay, Ming-Yueh; Wu, Tai-Luan; Tseng, Ling-Li
2017-01-01
This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001-2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.
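The sorting, comparison, elimination, and aggregation that the authors' program performs reduce, at their core, to set algebra over normalized bibliographic records. A toy sketch with invented titles, not the study's data:

```python
# Completeness = share of the union a system covers; overlap = Jaccard
# similarity between two systems' record sets.
from itertools import combinations

systems = {
    "Google Scholar":     {"paper a", "paper b", "paper c"},
    "Microsoft Academic": {"paper a", "paper c"},
    "arXiv.org":          {"paper b", "paper c", "paper d"},
}

union = set().union(*systems.values())
for name, recs in systems.items():
    print(f"{name}: completeness {len(recs) / len(union):.0%}")

for (n1, r1), (n2, r2) in combinations(systems.items(), 2):
    overlap = len(r1 & r2) / len(r1 | r2)
    print(f"{n1} vs {n2}: overlap {overlap:.0%}")
```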
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1989-12-01
In the Report of the Senate Committee on Appropriations accompanying the Energy and Water Appropriation Act for 1989, the Committee directed the Department of Energy (DOE) to evaluate the use of lead in the waste packages to be used in geologic repositories for spent nuclear fuel and high-level waste. The evaluation that was performed in response to this directive is presented in this report. This evaluation was based largely on a review of the technical literature on the behavior of lead, reports of work conducted in other countries, and work performed for the waste-management program being conducted by the DOE. The initial evaluation was limited to the potential use of lead in the packages to be used in the repository. Also, the focus of this report is on post-closure performance, not on the retrievability and handling aspects of the waste package. 100 refs., 8 figs., 15 tabs.
The United States Polar Rock Repository: A geological resource for the Earth science community
Grunow, Annie M.; Elliot, David H.; Codispoti, Julie E.
2007-01-01
The United States Polar Rock Repository (USPRR) is a U. S. national facility designed for the permanent curatorial preservation of rock samples, along with associated materials such as field notes, annotated air photos and maps, raw analytic data, paleomagnetic cores, ground rock and mineral residues, thin sections, and microfossil mounts, microslides and residues from Polar areas. This facility was established by the Office of Polar Programs at the U. S. National Science Foundation (NSF) to minimize redundant sample collecting, and also because the extreme cold and hazardous field conditions make fieldwork costly and difficult. The repository provides, along with an on-line database of sample information, an essential resource for proposal preparation, pilot studies and other sample based research that should make fieldwork more efficient and effective. This latter aspect should reduce the environmental impact of conducting research in sensitive Polar Regions. The USPRR also provides samples for educational outreach. Rock samples may be borrowed for research or educational purposes as well as for museum exhibits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slovic, P.; Layman, M.; Kraus, N.N.
1989-07-01
This paper describes a program of research designed to assess the potential impacts of a high-level nuclear waste repository at Yucca Mountain, Nevada, upon tourism, retirement and job-related migration, and business development in Las Vegas and the state. Adverse economic impacts may be expected to result from two related social processes. One has to do with perceptions of risk and socially amplified reactions to "unfortunate events" associated with the repository (major and minor accidents, discoveries of radiation releases, evidence of mismanagement, attempts to sabotage or disrupt the facility, etc.). The second process that may trigger significant adverse impacts is that of stigmatization. The conceptual underpinnings of risk perception, social amplification, and stigmatization are discussed in this paper, and empirical data are presented to demonstrate how nuclear images associated with Las Vegas and the State of Nevada might trigger adverse effects on tourism, migration, and business development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1988-12-01
This site characterization plan (SCP) has been developed for the candidate repository site at Yucca Mountain in the State of Nevada. The SCP includes a description of the Yucca Mountain site (Chapters 1-5), a conceptual design for the repository (Chapter 6), a description of the packaging to be used for the waste to be emplaced in the repository (Chapter 7), and a description of the planned site characterization activities (Chapter 8). The schedules and milestones presented in Sections 8.3 and 8.5 of the SCP were developed to be consistent with the June 1988 draft Amendment to the DOE's Mission Plan for the Civilian Radioactive Waste Management Program. The five-month delay in the scheduled start of exploratory shaft construction that was announced recently is not reflected in these schedules. 68 figs., 102 tabs.
Implementation of the Brazilian National Repository - RBMN Project - 13008
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cassia Oliveira de Tello, Cledola
2013-07-01
Ionizing radiation in Brazil is used in electricity generation, medicine, industry, agriculture, and for research and development purposes. All these activities can generate radioactive waste. At this point, the Brazilian use of nuclear energy and radioisotopes justifies the construction of a national repository for radioactive wastes of low and intermediate level. According to Federal Law No. 10308, the Brazilian National Commission for Nuclear Energy (CNEN) is responsible for designing and constructing the intermediate and final storage facilities for radioactive wastes. Additionally, a condition on the construction of Angra 3 is that the repository be under construction before the plant starts operation, meeting some requirements of the Brazilian Environmental Regulator (IBAMA). Besides this NPP, the National Energy Program foresees the installation of four more plants by 2030. In November 2008, CNEN launched the RBMN Project (Repository for Low- and Intermediate-Level Radioactive Wastes), which aims at the implementation of a national repository for the disposal of low- and intermediate-level radioactive wastes. This Project has some aspects that are unique in the Brazilian context, especially the time between its construction and the end of its institutional period. This time is about 360 years, after which the area will be released for unrestricted use. It means that the Repository must be safe and secure for more than three hundred years, which is longer than half of the whole of Brazilian history. This aspect is very new for the Brazilian people, bringing a new dimension to public acceptance. Another point is that this will be the first repository in South America, bringing a real challenge for the continent. The current status of the Project is summarized. (authors)
Kamal, Jyoti; Liu, Jianhua; Ostrander, Michael; Santangelo, Jennifer; Dyta, Ravi; Rogers, Patrick; Mekhjian, Hagop S
2010-11-13
Since its inception in 1997, the IW (Information Warehouse) at the Ohio State University Medical Center (OSUMC) has gradually transformed itself from a single-purpose business decision support system into a comprehensive informatics platform supporting basic, clinical, and translational research. The IW today is the combination of four integrated components: a clinical data repository containing over a million patients; a research data repository housing various research-specific data; an application development platform for building business and research enabling applications; and a business intelligence environment assisting in reporting across all functional areas. The IW is structured and encoded using standard terminologies such as SNOMED-CT, ICD, and CPT. The IW is an important component of OSUMC's Clinical and Translational Science Award (CTSA) informatics program.
Utilizing the Antarctic Master Directory to find orphan datasets
NASA Astrophysics Data System (ADS)
Bonczkowski, J.; Carbotte, S. M.; Arko, R. A.; Grebas, S. K.
2011-12-01
While most Antarctic data are housed at an established disciplinary-specific data repository, there are data types for which no suitable repository exists. In some cases, these "orphan" data, without an appropriate national archive, are served from local servers by the principal investigators who produced the data. There are many pitfalls with data served privately, including the frequent lack of adequate documentation to ensure the data can be understood by others for re-use, and the impermanence of personal web sites. For example, if an investigator leaves an institution and the data moves, the published link is no longer accessible. To ensure continued availability of data, submission to long-term national data repositories is needed. As stated in the National Science Foundation Office of Polar Programs (NSF/OPP) Guidelines and Award Conditions for Scientific Data, investigators are obligated to submit their data for curation and long-term preservation; this includes the registration of a dataset description in the Antarctic Master Directory (AMD), http://gcmd.nasa.gov/Data/portals/amd/. The AMD is a Web-based, searchable directory of thousands of dataset descriptions, known as DIF records, submitted by scientists from over 20 countries. It serves as a node of the International Directory Network/Global Change Master Directory (IDN/GCMD). The US Antarctic Program Data Coordination Center (USAP-DCC), http://www.usap-data.org/, funded through NSF/OPP, was established in 2007 to help streamline the process of data submission and DIF record creation. When data do not quite fit within any existing disciplinary repository, they can be registered with the USAP-DCC as the fallback data repository. Within the scope of the USAP-DCC we undertook the challenge of discovering and "rescuing" orphan datasets currently registered within the AMD. To find which DIF records led to data served privately, all records relating to US data within the AMD were parsed. After identifying the records containing a URL leading to a national data center or other disciplinary data repository, the remaining records were individually inspected for data type, format, and quality of metadata, and then assessed to determine how best to preserve them. Of the records reviewed, those for which appropriate repositories could be identified were submitted. An additional 35 were deemed acceptable in quality of metadata to register in the USAP-DCC. The content of these datasets was varied in nature, ranging from penguin counts to paleo-geologic maps to results of meteorological models, all of which are discoverable through our search interface, http://www.usap-data.org/search.php. The remaining 40 records linked either to no data or to data with inadequate documentation for preservation, highlighting the danger of serving datasets on local servers, where minimal metadata standards cannot be enforced and long-term access cannot be ensured.
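The parsing-and-triage step lends itself to a short sketch. The repository domains and record fields below are invented stand-ins, not the actual AMD parsing code:

```python
# Classify each DIF record's data URL as archived (known repository domain)
# or a privately served candidate orphan.
from urllib.parse import urlparse

KNOWN_REPOSITORIES = {"usap-data.org", "nsidc.org", "ncdc.noaa.gov"}  # illustrative

records = [
    {"id": "DIF-001", "url": "http://nsidc.org/data/g01234"},
    {"id": "DIF-002", "url": "http://faculty.example.edu/~pi/penguins.csv"},
]

def classify(record):
    host = urlparse(record["url"]).netloc.lower()
    archived = any(host == d or host.endswith("." + d) for d in KNOWN_REPOSITORIES)
    return "archived" if archived else "candidate orphan"

for r in records:
    print(r["id"], classify(r))
```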
Mont Terri Underground Rock Laboratory, Switzerland - Research Program and Key Results
NASA Astrophysics Data System (ADS)
Nussbaum, C. O.; Bossart, P. J.
2012-12-01
Argillaceous formations generally act as aquitards because of their low hydraulic conductivities. This property, together with the large retention capacity of clays for cationic contaminants and the potential for self-sealing, has brought clay formations into focus as potential host rocks for the geological disposal of radioactive waste. Excavated in the Opalinus Clay formation, the Mont Terri underground rock laboratory in the Jura Mountains of NW Switzerland is an important international test site for researching clay formations. Research is carried out in the underground facility, which is located adjacent to the security gallery of the Mont Terri motorway tunnel. Fifteen partners from European countries, USA, Canada and Japan participate in the project. The objectives of the research program are to analyze the hydrogeological, geochemical and rock mechanical properties of the Opalinus Clay, to determine the changes induced by the excavation of galleries and by heating of the rock formation, to test sealing and container emplacement techniques and to evaluate and improve suitable investigation techniques. For the safety of deep geological disposal, it is of key importance to understand the processes occurring in the undisturbed argillaceous environment, as well as the processes in a disturbed system, during the operation of the repository. The objectives are related to: 1. Understanding processes and mechanisms in undisturbed clays and 2. Experiments related to repository-induced perturbations. Experiments of the first group are dedicated to: i) Improvement of drilling and excavation technologies and sampling methods; ii) Estimation of hydrogeological, rock mechanical and geochemical parameters of the undisturbed Opalinus Clay. Upscaling of parameters from laboratory to in situ scale; iii) Geochemistry of porewater and natural gases; evolution of porewater over time scales; iv) Assessment of long-term hydraulic transients associated with erosion and thermal scenarios and v) Evaluation of diffusion and retention parameters for long-lived radionuclides. Experiments related to repository-induced perturbations are focused on: i) Influence of rock liner on the disposal system and the buffering potential of the host rock; ii) Self-sealing processes in the excavation damaged zone; iii) Hydro-mechanical coupled processes (e.g. stress redistributions and pore pressure evolution during excavation); iv) Thermo-hydro-mechanical-chemical coupled processes (e.g. heating of bentonite and host rock) and v) Gas-induced transport of radionuclides in porewater and along interfaces in the engineered barrier system. A third research direction is to demonstrate the feasibility of repository construction and long-term safety after repository closure. Demonstration experiments can contribute to improving the reliability of the scientific basis for the safety assessment of future geological repositories, particularly if they are performed on a large scale and with a long duration. These experiments include the construction and installation of engineered barriers on a 1:1 scale: i) Horizontal emplacement of canisters; ii) Evaluation of the corrosion of container materials; repository re-saturation; iii) Sealing of boreholes and repository access tunnels and iv) Long-term monitoring of the repository. References Bossart, P. & Thury, M. (2008): Mont Terri Rock Laboratory. Project, Programme 1996 to 2007 and Results. - Rep. Swiss Geol. Surv. 3.
Network Configuration of Oracle and Database Programming Using SQL
NASA Technical Reports Server (NTRS)
Davis, Melton; Abdurrashid, Jibril; Diaz, Philip; Harris, W. C.
2000-01-01
A database can be defined as a collection of information organized in such a way that it can be retrieved and used. A database management system (DBMS) can further be defined as the tool that enables us to manage and interact with the database. The Oracle 8 Server is a state-of-the-art information management environment. It is a repository for very large amounts of data, and gives users rapid access to that data. The Oracle 8 Server allows for sharing of data between applications; the information is stored in one place and used by many systems. My research will focus primarily on SQL (Structured Query Language) programming. SQL is the way you define and manipulate data in Oracle's relational database. SQL is the industry standard adopted by all database vendors. When programming with SQL, you work on sets of data (i.e., information is not processed one record at a time).
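The set-orientation described here can be demonstrated with any SQL engine. The sketch below uses Python's sqlite3 in place of an Oracle connection, with invented table and column names: one declarative UPDATE operates on every matching row at once, with no record-at-a-time loop in the client program:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE employee (name TEXT, dept TEXT, salary REAL)")
con.executemany("INSERT INTO employee VALUES (?, ?, ?)",
                [("a", "eng", 100.0), ("b", "eng", 110.0), ("c", "ops", 90.0)])

# Set-based: the WHERE clause selects the set, the engine applies the change.
con.execute("UPDATE employee SET salary = salary * 1.05 WHERE dept = 'eng'")

print(con.execute("SELECT name, salary FROM employee ORDER BY name").fetchall())
# -> [('a', 105.0), ('b', 115.5...), ('c', 90.0)]
```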
Zaver, Fareen; Battaglioli, Nicole; Denq, William; Messman, Anne; Chung, Arlene; Lin, Michelle; Liu, Emberlynn L
2018-03-01
Burnout, depression, and suicidality among residents of all specialties have become a critical focus for the medical education community, especially among learners in graduate medical education. In 2017 the Accreditation Council for Graduate Medical Education (ACGME) updated the Common Program Requirements to focus more on resident wellbeing. To address this issue, one working group from the 2017 Resident Wellness Consensus Summit (RWCS) focused on wellness program innovations and initiatives in emergency medicine (EM) residency programs. Over a seven-month period leading up to the RWCS event, the Programmatic Initiatives workgroup convened virtually in the Wellness Think Tank, an online, resident community consisting of 142 residents from 100 EM residencies in North America. A 15-person subgroup (13 residents, two faculty facilitators) met at the RWCS to develop a public, central repository of initiatives for programs, as well as tools to assist programs in identifying gaps in their overarching wellness programs. An online submission form and central database of wellness initiatives were created and accessible to the public. Wellness Think Tank members collected an initial 36 submissions for the database by the time of the RWCS event. Based on general workplace, needs-assessment tools on employee wellbeing and Kern's model for curriculum development, a resident-based needs-assessment survey and an implementation worksheet were created to assist residency programs in wellness program development. The Programmatic Initiatives workgroup from the resident-driven RWCS event created tools to assist EM residency programs in identifying existing initiatives and gaps in their wellness programs to meet the ACGME's expanded focus on resident wellbeing.
Charting a Path to Location Intelligence for STD Control.
Gerber, Todd M; Du, Ping; Armstrong-Brown, Janelle; McNutt, Louise-Anne; Coles, F Bruce
2009-01-01
This article describes the New York State Department of Health's GeoDatabase project, which developed new methods and techniques for designing and building a geocoding and mapping data repository for sexually transmitted disease (STD) control. The GeoDatabase development was supported through the Centers for Disease Control and Prevention's Outcome Assessment through Systems of Integrated Surveillance workgroup. The design and operation of the GeoDatabase relied upon commercial-off-the-shelf tools that other public health programs may also use for disease-control systems. This article provides a blueprint of the structure and software used to build the GeoDatabase and integrate location data from multiple data sources into the everyday activities of STD control programs.
Continental Scientific Drilling Program Data Base
NASA Astrophysics Data System (ADS)
Pawloski, Gayle
The Continental Scientific Drilling Program (CSDP) data base at Lawrence Livermore National Laboratory is a central repository, cataloguing information from United States drill holes. Most holes have been drilled or proposed by various federal agencies. Some holes have been commercially funded. This data base is funded by the Office of Basic Energy Sciences of the Department of Energy (OBES/DOE) to serve the entire scientific community. Through the unrestricted use of the data base, it is possible to reduce drilling costs and maximize the scientific value of current and planned efforts of federal agencies and industry by offering the opportunity for add-on experiments and supplementing knowledge with additional information from existing drill holes.
McClennen, Seth; Nathanson, Larry A; Safran, Charles; Goldberger, Ary L
2003-12-01
To create a multimedia Internet-based ECG teaching tool with the ability to rapidly incorporate new clinical cases, we created ECG Wave-Maven ( http://ecg.bidmc.harvard.edu ), a novel teaching tool with a direct link to an institution-wide clinical repository. We analyzed usage data from the web between December 2000 and May 2002. In 17 months, there were 4105 distinct uses of the program. A majority of users are physicians or medical students (2605, 63%), and almost half report use as an educational tool. The internet offers an opportunity to provide easily expandable, open-access resources for ECG pedagogy, which may be used to complement traditional methods of instruction.
Disease management programs for the underserved.
Horswell, Ronald; Butler, Michael K; Kaiser, Michael; Moody-Thomas, Sarah; McNabb, Shannon; Besse, Jay; Abrams, Amir
2008-06-01
Disease management has become an important tool for improving patient outcomes at the population level. The Louisiana State University Health Care Services Division (HCSD) has used this tool to provide care to a largely uninsured population for approximately 10 years. Eight programs currently exist within the HCSD, focusing on diabetes, asthma, congestive heart failure, HIV, cancer screening, smoking cessation, chronic kidney disease, and diet, exercise, and weight control. These programs operate at hospital and clinic sites located in 8 population centers throughout southern Louisiana. The programs are structured to be managed at the system level, with a clinical expert for each area guiding the scope of the program and defining new goals. Care largely adheres to evidence-based guidelines set forth by professional organizations. To monitor quality of care, indicators are defined within each area and benchmarked to achieve the most effective measures in our population. For example, hemoglobin A1c levels have shown improvement, with nearly 54% of the population at <7.0%. To support these management efforts, HCSD utilizes an electronic data repository that allows physicians to track patient labs and other tests, as well as reminders. To ensure appropriate treatment, patients are able to enroll in the Medication Assistance Program, which largely improves adherence to medications for those patients unable to afford them otherwise.
Importance of Data Management in a Long-Term Biological Monitoring Program
NASA Astrophysics Data System (ADS)
Christensen, Sigurd W.; Brandt, Craig C.; McCracken, Mary K.
2011-06-01
The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of the ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to meeting this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program, when an existing relational database was adapted and extended to handle biological data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was establishing standards for sampling site names, taxonomic identification, flagging, and other components. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. We also discuss some limitations of our implementation. Some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.
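As an illustration of how reference tables and naming standards keep long-term data consistent, here is a hypothetical miniature of such a schema in Python/SQLite. The table names, site code, and species are invented examples, not BMAP's actual design.

```python
import sqlite3

# Hypothetical miniature of a BMAP-style design: reference tables hold the
# controlled vocabularies (site names, taxa), and foreign keys force every
# measurement row to use a standardized value.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only with this pragma
conn.execute("CREATE TABLE site (site_id TEXT PRIMARY KEY, description TEXT)")
conn.execute("CREATE TABLE taxon (taxon_id TEXT PRIMARY KEY, scientific_name TEXT)")
conn.execute("""
    CREATE TABLE fish_sample (
        sample_id INTEGER PRIMARY KEY,
        site_id   TEXT NOT NULL REFERENCES site(site_id),
        taxon_id  TEXT NOT NULL REFERENCES taxon(taxon_id),
        collected DATE NOT NULL,
        flag      TEXT              -- data-quality flag, per program standards
    )""")

conn.execute("INSERT INTO site VALUES ('EFK24', 'stream site, km 24 (invented)')")
conn.execute("INSERT INTO taxon VALUES ('LEPMAC', 'Lepomis macrochirus')")
conn.execute("INSERT INTO fish_sample VALUES (1, 'EFK24', 'LEPMAC', '2011-06-01', NULL)")

try:
    # A misspelled site name is rejected instead of silently fragmenting the data.
    conn.execute("INSERT INTO fish_sample VALUES (2, 'EFK-24', 'LEPMAC', '2011-06-02', NULL)")
except sqlite3.IntegrityError as err:
    print("rejected:", err)
```

The design choice the abstract highlights is exactly this: pushing naming standards into reference tables makes them machine-enforced rather than a matter of investigator discipline.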
Depleted UF6 Management Information Network - A resource for the public,
The Depleted UF6 Management Information Network Web Site is an online repository of information about DU and DUF6, including research and development efforts for beneficial uses of DU, DOE's program for management of DUF6, and an online DUF6 Guide with introductory information about depleted uranium and how it is created.
Characterization of Heat-treated Clay Minerals in the Context of Nuclear Waste Disposal
NASA Astrophysics Data System (ADS)
Matteo, E. N.; Wang, Y.; Kruichak, J. N.; Mills, M. M.
2015-12-01
Clay minerals are likely candidates to aid in nuclear waste isolation due to their low permeability, favorable swelling properties, and high cation sorption capacities. Establishing the thermal limit for clay minerals in a nuclear waste repository is a potentially important component of repository design, as flexibility in the heat load within the repository can have a major impact on the selection of repository design. For example, the thermal limit plays a critical role in the time that waste packages would need to cool before being transferred to the repository. Understanding the chemical and physical changes, if any, that occur in clay minerals at various temperatures above the current thermal limit (of 100 °C) can provide decision-makers with information critical to evaluating the potential trade-offs of increasing the thermal limit within the repository. Most critical is gaining an understanding of how varying thermal conditions in the repository will impact radionuclide sorption and transport in clay materials, either as engineered barriers or as disposal media. A variety of repository-relevant clay minerals (illite, mixed-layer illite/smectite, and montmorillonite) were heated at a range of temperatures between 100 and 1000 °C. These samples were characterized to determine surface area, mineralogical alteration, and cation exchange capacity (CEC). Our results show that for conditions up to 500 °C, no significant change occurs, so long as the clay mineral remains mineralogically intact. At temperatures above 500 °C, transformation of the layered silicates into silica phases leads to alteration that impacts important clay characteristics. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND Number: SAND2015-6524 A
Perspectives of Future R and D on HLW Disposal in Germany
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steininger, W.J.
2008-07-01
The 5th Energy Research Program of the Federal Government, 'Innovation and New Technology', is the general framework for R and D activities in radioactive waste disposal. The Ministry of Economics and Technology (BMWi), the Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (BMU) and the Ministry of Education and Research (BMBF) apply the Research Program according to their respective responsibilities and competences. With regard to the Government's obligation to provide repositories for HLW (spent fuel and vitrified HAW), basic and applied R and D is needed in order to make adequate knowledge available to implementers, decision makers and stakeholders in general. Non-site-specific R and D projects are funded by BMWi on the basis of its Research Concept. In the first stage (1998-2001), most R and D issues were focused on activities related to HLW disposal in rock salt. By that time the R and D program had to be revised and some prioritization was demanded due to changes in politics. In the current version (2001-2006), emphasis was put on non-saline rocks. The current Research Concept of BMWi is presently undergoing revision, evaluation, and discussion, inter alia, by experts from several German research institutions. This activity is of special importance against the background of streamlining and focusing the research activities on future demands, priorities and perspectives with regard to the salt concept and the option of disposing of HLW in argillaceous media. Because the status of knowledge on disposal in rock salt is well advanced, it is necessary to take stock of the current state of the art. In this framework some key projects are currently being carried out. The results may contribute to future decisions to be made in Germany with respect to HLW disposal. The first project deals with the development of an advanced safety concept for a HLW repository in rock salt. The second project (also carried out in the frame of the 6th Framework Programme of the European Commission) aims at completing and optimizing the direct disposal concept for spent fuel by a full-scale demonstration of the technology of emplacement in vertical boreholes. The third project is devoted to the development of a reference concept for disposing of HLW in a deep geological repository in clay in Germany. In the following, a brief overview is given of the achievements, the projects, and ideas about the consequences for HLW disposal in Germany. (author)
The National Diabetes Education Program at 20 Years: Lessons Learned and Plans for the Future.
Siminerio, Linda M; Albright, Ann; Fradkin, Judith; Gallivan, Joanne; McDivitt, Jude; Rodríguez, Betsy; Tuncer, Diane; Wong, Faye
2018-02-01
The National Diabetes Education Program (NDEP) was established to translate findings from diabetes research studies into clinical and public health practice. Over 20 years, NDEP has built a program with partnership engagement that includes science-based resources for multiple population and stakeholder audiences. Throughout its history, NDEP has developed strategies and messages based on communication research and relied on established behavior change models from health education, communication, and social marketing. The program's success in continuing to engage diverse partners after 20 years has led to time-proven and high-quality resources that have been sustained. Today, NDEP maintains a national repository of diabetes education tools and resources that are high quality, science- and audience-based, culturally and linguistically appropriate, and available free of charge to a wide variety of audiences. This review looks back and describes NDEP's evolution in transforming and communicating diabetes management and type 2 diabetes prevention strategies through partnerships, campaigns, educational resources, and tools and identifies future opportunities and plans. © 2018 by the American Diabetes Association.
National Survey of US academic anesthesiology chairs on clinician wellness.
Vinson, Amy E; Zurakowski, David; Randel, Gail I; Schlecht, Kathy D
2016-11-01
The prevalence of anesthesiology department wellness programs is unknown. A database of wellness programs is needed as a resource for departments attempting to respond to the Accreditation Council for Graduate Medical Education Anesthesiology Milestones Project. The purpose of this study was to survey academic anesthesiology chairs on wellness issues, characterize initiatives, and establish wellness contacts for a Wellness Initiative Database (WID). An Internet-based survey instrument was distributed online to academic anesthesiology department chairs in the United States. Analysis for continuous variables used standard means, modes, and averages for individual responses; 95% confidence intervals for proportions were calculated by Wilson's method. Seventy-five responses (56.4% of a potential 133 programs) were obtained. Forty-one (of 71 responders; 57.8%) expressed interest in participating in a WID, and 33 (44%) provided contact information. Most (74.7%) had recently referred staff for counseling or wellness resources, yet many (79.5% and 67.1%, respectively) had never surveyed their department's interest in wellness resources. Thirty-four percent had a wellness resources repository. Of 22 wellness topics, 8 garnered >60% strong interest from respondents: Addiction Counseling, Sleep Hygiene, Peer Support Program, Stress Management, Conflict Management, Burnout Counseling, Time Management, and Dealing with Adverse Events Training. There was a statistically significant difference in interest between those willing to participate or not in the WID across most topics, but no significant difference based on need for recent staff referral. The majority of chairs had recently needed to refer a department member to wellness resources or counseling. Most were interested in participating in a WID, whereas a minority had gauged staff interest in wellness topics or had a wellness resource repository. Highest interest was in topics most related to function as an anesthesiologist. Those willing to participate in the database had statistically significant differences in interest across most wellness topics. Copyright © 2016 Elsevier Inc. All rights reserved.
Operations research applications in nuclear energy
NASA Astrophysics Data System (ADS)
Johnson, Benjamin Lloyd
This dissertation consists of three papers; the first is published in Annals of Operations Research, the second is nearing submission to INFORMS Journal on Computing, and the third is the predecessor of a paper nearing submission to Progress in Nuclear Energy. We apply operations research techniques to nuclear waste disposal and nuclear safeguards. Although these fields are different, they allow us to showcase some benefits of using operations research techniques to enhance nuclear energy applications. The first paper, "Optimizing High-Level Nuclear Waste Disposal within a Deep Geologic Repository," presents a mixed-integer programming model that determines where to place high-level nuclear waste packages in a deep geologic repository to minimize heat load concentration. We develop a heuristic that increases the size of solvable model instances. The second paper, "Optimally Configuring a Measurement System to Detect Diversions from a Nuclear Fuel Cycle," introduces a simulation-optimization algorithm and an integer-programming model to find the best, or near-best, resource-limited nuclear fuel cycle measurement system with a high degree of confidence. Given location-dependent measurement method precisions, we (i) optimize the configuration of n methods at n locations of a hypothetical nuclear fuel cycle facility, (ii) find the most important location at which to improve method precision, and (iii) determine the effect of measurement frequency on near-optimal configurations and objective values. Our results correspond to existing outcomes but we obtain them at least an order of magnitude faster. The third paper, "Optimizing Nuclear Material Control and Accountability Measurement Systems," extends the integer program from the second paper to locate measurement methods in a larger, hypothetical nuclear fuel cycle scenario given fixed purchase and utilization budgets. This paper also presents two mixed-integer quadratic programming models to increase the precision of existing methods given a fixed improvement budget and to reduce the measurement uncertainty in the system while limiting improvement costs. We quickly obtain similar or better solutions compared to several intuitive analyses that take much longer to perform.
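The first paper's placement problem lends itself to a compact illustration. The following is a toy greedy heuristic in Python, not the dissertation's mixed-integer program; the one-dimensional layout and all package heat values are invented for illustration.

```python
# Illustrative greedy heuristic for the waste-placement idea described above:
# spread heat-generating packages along a line of emplacement positions so that
# no neighborhood concentrates too much thermal load. This is NOT the paper's
# mixed-integer model, only a toy stand-in with invented numbers.
package_heat = [1500, 1200, 900, 850, 400, 300]  # watts per package (hypothetical)
n_positions = len(package_heat)
placement = [None] * n_positions

def neighborhood_load(placement, pos):
    """Sum of heat already placed at pos and its immediate neighbors."""
    total = 0.0
    for p in (pos - 1, pos, pos + 1):
        if 0 <= p < len(placement) and placement[p] is not None:
            total += placement[p]
    return total

# Place the hottest packages first, each into the currently coolest neighborhood.
for heat in sorted(package_heat, reverse=True):
    best = min(
        (pos for pos in range(n_positions) if placement[pos] is None),
        key=lambda pos: neighborhood_load(placement, pos),
    )
    placement[best] = heat

print(placement)  # hot packages end up interleaved with cooler ones
```

A real formulation would, as the paper describes, encode placement as binary decision variables with an objective on local heat concentration and let a solver certify optimality; the greedy pass above only conveys the flavor of the trade-off.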
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Bauman; S. Burian; M. Deo
The Utah Heavy Oil Program (UHOP) was established in June 2006 to provide multidisciplinary research support to federal and state constituents for addressing the wide-ranging issues surrounding the creation of an industry for unconventional oil production in the United States. Additionally, UHOP was to serve as an on-going source of unbiased information to the nation surrounding technical, economic, legal and environmental aspects of developing heavy oil, oil sands, and oil shale resources. UHOP fulGilled its role by completing three tasks. First, in response to the Energy Policy Act of 2005 Section 369(p), UHOP published an update report to the 1987more » technical and economic assessment of domestic heavy oil resources that was prepared by the Interstate Oil and Gas Compact Commission. The UHOP report, entitled 'A Technical, Economic, and Legal Assessment of North American Heavy Oil, Oil Sands, and Oil Shale Resources' was published in electronic and hard copy form in October 2007. Second, UHOP developed of a comprehensive, publicly accessible online repository of unconventional oil resources in North America based on the DSpace software platform. An interactive map was also developed as a source of geospatial information and as a means to interact with the repository from a geospatial setting. All documents uploaded to the repository are fully searchable by author, title, and keywords. Third, UHOP sponsored Give research projects related to unconventional fuels development. Two projects looked at issues associated with oil shale production, including oil shale pyrolysis kinetics, resource heterogeneity, and reservoir simulation. One project evaluated in situ production from Utah oil sands. Another project focused on water availability and produced water treatments. The last project considered commercial oil shale leasing from a policy, environmental, and economic perspective.« less
Gorgolewski, Krzysztof J; Varoquaux, Gael; Rivera, Gabriel; Schwartz, Yannick; Sochat, Vanessa V; Ghosh, Satrajit S; Maumet, Camille; Nichols, Thomas E; Poline, Jean-Baptiste; Yarkoni, Tal; Margulies, Daniel S; Poldrack, Russell A
2016-01-01
NeuroVault.org is dedicated to storing outputs of analyses in the form of statistical maps, parcellations and atlases, a unique strategy that contrasts with most neuroimaging repositories that store raw acquisition data or stereotaxic coordinates. Such maps are indispensable for performing meta-analyses, validating novel methodology, and deciding on precise outlines for regions of interest (ROIs). NeuroVault is open to maps derived from both healthy and clinical populations, as well as from various imaging modalities (sMRI, fMRI, EEG, MEG, PET, etc.). The repository uses modern web technologies such as interactive web-based visualization, cognitive decoding, and comparison with other maps to provide researchers with efficient, intuitive tools to improve the understanding of their results. Each dataset and map is assigned a permanent Universal Resource Locator (URL), and all of the data is accessible through a REST Application Programming Interface (API). Additionally, the repository supports the NIDM-Results standard and has the ability to parse outputs from popular FSL and SPM software packages to automatically extract relevant metadata. This ease of use, modern web-integration, and pioneering functionality holds promise to improve the workflow for making inferences about and sharing whole-brain statistical maps. Copyright © 2015 Elsevier Inc. All rights reserved.
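Programmatic access of the kind the abstract describes might look like the following Python sketch. The endpoint path and JSON field names are assumptions based on the abstract's description of the REST API, not a verified client.

```python
import json
import urllib.request

# Minimal sketch of programmatic access to the NeuroVault REST API described
# above. The endpoint path and the 'results'/'url'/'map_type' field names are
# assumptions for illustration, not a verified client.
url = "https://neurovault.org/api/images/?format=json"
with urllib.request.urlopen(url) as resp:
    page = json.load(resp)

# A paginated listing is assumed: a 'results' array of statistical-map records,
# each carrying a permanent URL and metadata usable in meta-analysis pipelines.
for image in page.get("results", [])[:5]:
    print(image.get("url"), image.get("map_type"))
```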
Thayer, Erin K.; Rathkey, Daniel; Miller, Marissa Fuqua; Palmer, Ryan; Mejicano, George C.; Pusic, Martin; Kalet, Adina; Gillespie, Colleen; Carney, Patricia A.
2016-01-01
Issue: Medical educators and educational researchers continue to improve their processes for managing medical student and program evaluation data using sound ethical principles. This is becoming even more important as curricular innovations are occurring across undergraduate and graduate medical education. Dissemination of findings from this work is critical, and peer-reviewed journals often require an institutional review board (IRB) determination. Approach: IRB data repositories, originally designed for the longitudinal study of biological specimens, can be applied to medical education research. The benefits of such an approach include obtaining expedited review for multiple related studies within a single IRB application and allowing for more flexibility when conducting complex longitudinal studies involving large datasets from multiple data sources and/or institutions. In this paper, we inform educators and educational researchers on our analysis of the use of the IRB data repository approach to manage ethical considerations as part of best practices for amassing, pooling, and sharing data for educational research, evaluation, and improvement purposes. Implications: Fostering multi-institutional studies while following sound ethical principles in the study of medical education is needed, and the IRB data repository approach has many benefits, especially for longitudinal assessment of complex multi-site data. PMID:27443407
DOE Office of Scientific and Technical Information (OSTI.GOV)
Himmelberger, J.J.; Ogneva-Himmelberger, Y.A.; Baughman, M.
Whether the proposed Yucca Mountain nuclear waste repository system will adversely impact tourism in southern Nevada is an open question of particular importance to visitor-oriented rural counties bisected by planned waste transportation corridors (highway or rail). As part of one such county's repository impact assessment program, the tourism implications of Three Mile Island (TMI) and other major hazard events have been revisited to inform ongoing county-wide socioeconomic assessments and contingency planning efforts. This paper summarizes the key implications of this research as applied to Lincoln County, Nevada. Implications for other rural counties are discussed in light of the research findings. 29 refs., 3 figs., 1 tab.
A clinical data repository enhances hospital infection control.
Samore, M.; Lichtenberg, D.; Saubermann, L.; Kawachi, C.; Carmeli, Y.
1997-01-01
We describe the benefits of a relational database of hospital clinical data (Clinical Data Repository; CDR) for an infection control program. The CDR consists of >40 Sybase tables and is directly accessible for ad hoc queries by members of the infection control unit who have been granted access privileges by the Information Systems Department. The data elements and functional requirements most useful for surveillance of nosocomial infections, antibiotic use, and resistant organisms are characterized. Specific applications of the CDR are presented, including the use of automated definitions of nosocomial infection, graphical monitoring of resistant organisms with quality control limits, and prospective detection of inappropriate antibiotic use. Hospital surveillance and quality improvement activities benefit significantly from the availability of a queryable set of tables containing diverse clinical data. PMID:9357588
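The "quality control limits" mentioned above can be illustrated with a small sketch. The Python fragment below computes 3-sigma control limits for monthly resistant-organism counts under a Poisson assumption; the baseline data and threshold rule are invented for illustration and are not taken from the CDR.

```python
import math

# Sketch of control-limit monitoring of resistant organisms: monthly isolate
# counts (invented data) are checked against 3-sigma limits around a baseline
# mean, treating counts as approximately Poisson (variance == mean).
baseline_counts = [4, 6, 5, 3, 7, 5, 4, 6]      # historical monthly counts
mean = sum(baseline_counts) / len(baseline_counts)
sigma = math.sqrt(mean)                          # Poisson approximation
upper_limit = mean + 3 * sigma
lower_limit = max(0.0, mean - 3 * sigma)

for month, count in [("Jan", 5), ("Feb", 13), ("Mar", 4)]:
    status = "ALERT" if count > upper_limit else "ok"
    print(f"{month}: {count} isolates ({status}; UCL={upper_limit:.1f})")
```

With the invented baseline (mean 5.0, upper limit ~11.7), the February count of 13 trips the alert, which is the kind of signal an infection control unit would then investigate.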
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S.Y.; Hyder, L.K.; Alley, P.D.
1988-01-01
Five shales were examined as part of the Sedimentary Rock Program evaluation of this medium as a potential host for a US civilian nuclear waste repository. The units selected for characterization were the Chattanooga Shale from Fentress County, Tennessee; the Pierre Shale from Gregory County, South Dakota; the Green River Formation from Garfield County, Colorado; and the Nolichucky Shale and Pumpkin Valley Shale from Roane County, Tennessee. The micromorphology and structure of the shales were examined by petrographic, scanning electron, and high-resolution transmission electron microscopy. Chemical and mineralogical compositions were studied through the use of energy-dispersive x-ray, neutron activation, atomic absorption, thermal, and x-ray diffraction analysis techniques. 18 refs., 12 figs., 2 tabs.
Torres, Leticia; Hu, E.; Tiersch, Terrence R.
2017-01-01
Cryopreservation in aquatic species in general has been constrained to research activities for more than 60 years. Although the need for application and commercialisation pathways has become clear, the lack of comprehensive quality assurance and quality control programs has impeded the progress of the field, delaying the establishment of germplasm repositories and commercial-scale applications. In this review we focus on the opportunities for standardisation in the practices involved in the four main stages of the cryopreservation process: (1) source, housing and conditioning of fish; (2) sample collection and preparation; (3) freezing and cryogenic storage of samples; and (4) egg collection and use of thawed sperm samples. In addition, we introduce some key factors that would assist the transition to commercial-scale, high-throughput application. PMID:26739583
Kamal, Jyoti; Liu, Jianhua; Ostrander, Michael; Santangelo, Jennifer; Dyta, Ravi; Rogers, Patrick; Mekhjian, Hagop S.
2010-01-01
Since its inception in 1997, the IW (Information Warehouse) at the Ohio State University Medical Center (OSUMC) has gradually transformed itself from a single-purpose business decision support system into a comprehensive informatics platform supporting basic, clinical, and translational research. The IW today is the combination of four integrated components: a clinical data repository containing records on over a million patients; a research data repository housing various research-specific data; an application development platform for building business and research-enabling applications; and a business intelligence environment assisting with reporting in all functional areas. The IW is structured and encoded using standard terminologies such as SNOMED-CT, ICD, and CPT. The IW is an important component of OSUMC's Clinical and Translational Science Award (CTSA) informatics program. PMID:21347019
U. S. Geological Survey programs in Wisconsin
,
1996-01-01
The U.S. Geological Survey (USGS) has served as the Nation’s principal collector, repository, and interpreter of earth science data for more than a century. In this capacity, the USGS in Wisconsin works in partnership with State, county, municipal public works departments, public health agencies, water and sanitation districts, Indian agencies, and other Federal agencies. This Fact Sheet describes some of the current USGS activities in Wisconsin.
Report on International Collaboration Involving the FE Heater and HG-A Tests at Mont Terri
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houseworth, Jim; Rutqvist, Jonny; Asahina, Daisuke
Nuclear waste programs outside of the US have focused on different host rock types for geological disposal of high-level radioactive waste. Several countries, including France, Switzerland, Belgium, and Japan, are exploring the possibility of waste disposal in shale and other clay-rich rock that fall within the general classification of argillaceous rock. This rock type is also of interest for the US program because the US has extensive sedimentary basins containing large deposits of argillaceous rock. LBNL, as part of the DOE-NE Used Fuel Disposition Campaign, is collaborating on some of the underground research laboratory (URL) activities at the Mont Terri URL near Saint-Ursanne, Switzerland. The Mont Terri project, which began in 1995, has developed a URL at a depth of about 300 m in a stiff clay formation called the Opalinus Clay. Our current collaboration efforts include two test modeling activities, for the FE heater test and the HG-A leak-off test. This report documents results concerning our current modeling of these field tests. The overall objectives of these activities include an improved understanding of, and advanced relevant modeling capabilities for, excavation damaged zone (EDZ) evolution in clay repositories and the associated coupled processes, and the development of a technical basis for the maximum allowable temperature for a clay repository.
The Galileo Teacher Training Programme
NASA Astrophysics Data System (ADS)
Doran, Rosa
The Galileo Teacher Training Program is a global effort to empower teachers all over the world to embark on a new trend in science teaching, using new technologies and real research methods to teach curriculum content. The GTTP goal is to create a worldwide network of "Galileo Ambassadors", promoters of GTTP training sessions, and a legion of "Galileo Teachers", educators engaged in the use of innovative resources, sharing experiences, and supporting their peers worldwide. Through workshops, online training tools and resources, the products and techniques promoted by this program can be adapted to reach locations with few resources of their own, as well as network-connected areas that can take advantage of access to robotic, optical and radio telescopes, webcams, astronomy exercises, cross-disciplinary resources, image processing and digital universes (web and desktop planetariums). Promoters of GTTP are expert astronomy educators connected to universities or EPO institutions who facilitate the consolidation of active support to newcomers and act as a 24-hour helpdesk for teachers all over the world. GTTP will also engage in the creation of a repository of astronomy education resources and science research projects, ViRoS (Virtual Repository of resources and Science Projects), in order to simplify the task of educators willing to enrich classroom activities.
Examination of Data Accession at the National Snow and Ice Data Center
NASA Astrophysics Data System (ADS)
Scott, D. J.; Booker, L.
2017-12-01
The National Snow and Ice Data Center (NSIDC) stewards nearly 750 publicly available snow and ice data sets that support research into our world's frozen realms. NSIDC data management is primarily supported by the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF) and the National Oceanic and Atmospheric Administration (NOAA), and most of the data we archive and distribute is assigned to NSIDC through the funding agency programs. In addition to these mandates, NSIDC has historically offered data stewardship to researchers wanting to properly preserve and increase visibility of their research data under our primary programs (NASA, NSF, NOAA). With publishers now requiring researchers to deliver data to a repository prior to the publication of their data-related papers, we have seen an increase in researcher-initiated data accession requests. This increase is pushing us to reexamine our process to ensure timeliness in the acquisition and release of these data. In this presentation, we will discuss the support and value a researcher receives by submitting data to a trustworthy repository. We will examine NSIDC's data accession practices, and the challenges of a consistent process across NSIDC's multiple funding sponsors. Finally, we will share recent activities related to improving our process and ideas we have for enhancing the overall data accession experience.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-10-01
This EIS analyzes the significant environmental impacts that could occur if various technologies for management and disposal of high-level and transuranic wastes from commercial nuclear power reactors were to be developed and implemented. This EIS will serve as the environmental input for the decision on which technology, or technologies, will be emphasized in further research and development activities in the commercial waste management program. The action proposed in this EIS is to (1) adopt a national strategy to develop mined geologic repositories for disposal of commercially generated high-level and transuranic radioactive waste (while continuing to examine subseabed and very deep hole disposal as potential backup technologies) and (2) conduct an R and D program to develop such facilities and the necessary technology to ensure the safe long-term containment and isolation of these wastes. The Department has considered in this statement: development of conventionally mined deep geologic repositories for disposal of spent fuel from nuclear power reactors and/or radioactive fuel reprocessing wastes; balanced development of several alternative disposal methods; and no waste disposal action. This volume contains written public comments and hearing board responses and reports offered on the draft statement.
Benchmarking transportation logistics practices for effective system planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thrower, A.W.; Dravo, A.N.; Keister, M.
2007-07-01
This paper presents preliminary findings of an Office of Civilian Radioactive Waste Management (OCRWM) benchmarking project to identify best practices for logistics enterprises. The results will help OCRWM's Office of Logistics Management (OLM) design and implement a system to move spent nuclear fuel (SNF) and high-level radioactive waste (HLW) to the Yucca Mountain repository for disposal when that facility is licensed and built. This report also suggests topics for additional study. The project team looked at three Federal radioactive material logistics operations that are widely viewed to be successful: (1) the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico; (2) the Naval Nuclear Propulsion Program (NNPP); and (3) domestic and foreign research reactor (FRR) SNF acceptance programs. (authors)
Wu, Tai-luan; Tseng, Ling-li
2017-01-01
This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001–2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system. PMID:29267327
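The analytical tasks described (sorting, comparison, elimination, aggregation) can be sketched compactly. Below is a minimal Python illustration of computing completeness and overlap across systems; the normalization rule and all records are invented, and the actual program presumably matched on richer bibliographic fields than title alone.

```python
# Sketch of the comparison/aggregation tasks the abstract describes: normalize
# bibliographic titles, then measure each system's completeness and pairwise
# overlap against the union of all records. All records here are invented.
def normalize(title):
    """Crude match key: lowercase, alphanumerics only."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

systems = {
    "Google Scholar": {"Topological insulators", "Graphene transport."},
    "arXiv.org": {"Topological Insulators", "Neutrino oscillations"},
}
keys = {name: {normalize(t) for t in titles} for name, titles in systems.items()}
union = set().union(*keys.values())  # deduplicated pool of all known records

for name, k in keys.items():
    print(f"{name}: completeness {len(k) / len(union):.0%}")
print("overlap:", len(keys["Google Scholar"] & keys["arXiv.org"]), "records")
```

The set operations do the work: completeness is each system's share of the deduplicated union, and overlap is a plain intersection of the normalized keys.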
Cross-Cutting Risk Framework: Mining Data for Common Risks Across the Portfolio
NASA Technical Reports Server (NTRS)
Klein, Gerald A., Jr.; Ruark, Valerie
2017-01-01
The National Aeronautics and Space Administration (NASA) defines risk management as an integrated framework, combining risk-informed decision making and continuous risk management to foster forward thinking and decision making from an integrated risk perspective. Decision makers must therefore have access to risks outside of their own project to gain the knowledge that provides the integrated risk perspective. Through the Goddard Space Flight Center (GSFC) Flight Projects Directorate (FPD) Business Change Initiative (BCI), risks were integrated into one repository to facilitate access to risk data between projects. With the centralized repository, communications between the FPD, project managers, and risk managers improved, and GSFC created the cross-cutting risk framework (CCRF) team. The creation of the consolidated risk repository, in parallel with the initiation of monthly FPD risk managers and risk governance board meetings, now provides a complete risk management picture spanning the entire directorate. This paper describes the challenges, methodologies, tools, and techniques used to develop the CCRF, and the lessons learned as the team collectively worked to identify risks that FPD programs and projects had in common, both past and present.
Social and Emotional Learning and Academic Achievement in Portuguese Schools: A Bibliometric Study
Cristóvão, Ana M.; Candeias, Adelinda A.; Verdasca, José
2017-01-01
Social and Emotional Learning (SEL) is an educational movement that is gaining ground throughout the world. We can define SEL as the capacity to recognize and manage emotions, solve problems effectively, and establish positive relationships with others. Research has demonstrated the significant role of SEL in promoting healthy student development and academic achievement. Extensive research confirms that SEL competencies can be taught, that they promote positive development and reduce problem behaviors, and that they improve students' academic achievement and citizenship. At the international level, several rigorous studies have identified programs and practices that promote SEL. In Portugal, however, no review has yet been published regarding the implementation of SEL programs. Such a study would elucidate the current panorama of SEL programs in Portugal. This study aims to identify research on SEL programs implemented in Portuguese schools and the relationship of those programs with academic achievement. To this end, we consulted the following databases: Scientific Repository of Open Access of Portugal (RCAAP), Online Knowledge Library (b-on), and Web of Science (WoS). The criteria were: (a) all time frames; (b) publications in either Portuguese or English; (c) programs that developed socio-emotional competencies in Portuguese schools; (d) academic levels including elementary, middle, and high school; and (e) students in regular education. Few publications on SEL programs implemented in Portugal were found, although the recent decade has witnessed an upsurge of interest in the topic, principally arising from academic research. PMID:29167650
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harwell, M. A.; Brandstetter, A.; Benson, G. L.
1982-06-01
As a methodology demonstration for the Office of Nuclear Waste Isolation (ONWI), the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program conducted an initial reference site analysis of the long-term effectiveness of a salt dome repository. The Hainesville Salt Dome in Texas was chosen to be representative of the Gulf Coast interior salt domes; however, the Hainesville Site has been eliminated as a possible nuclear waste repository site. The data used for this exercise are not adequate for an actual assessment, nor have all the parametric analyses been made that would adequately characterize the response of the geosystem surrounding the repository. Additionally, because this was the first exercise of the complete AEGIS and Waste Rock Interaction Technology (WRIT) methodology, this report provides the initial opportunity for the methodology, specifically applied to a site, to be reviewed by the community outside the AEGIS. The scenario evaluation, as part of the methodology demonstration, involved consideration of a large variety of potentially disruptive phenomena, which alone or in concert could lead to a breach in a salt dome repository and to a subsequent transport of the radionuclides to the environment. Without waste- and repository-induced effects, no plausible natural geologic events or processes that would compromise the repository integrity could be envisioned over the one-million-year time frame after closure. Near-field (waste- and repository-induced) effects were excluded from consideration in this analysis, but they can be added in future analyses when that methodology development is more complete. The potential for consequential human intrusion into salt domes within a million-year time frame led to the consideration of a solution mining intrusion scenario. The AEGIS staff developed a specific human intrusion scenario at 100 years and 1000 years post-closure, one of a whole suite of possible scenarios. This scenario resulted in the delivery of radionuclide-contaminated brine to the surface, where a portion was diverted to culinary salt for direct ingestion by the existing population. Consequence analyses indicated calculated human doses that would be highly deleterious. Additional analyses indicated that doses well above background would occur from such a scenario even if it occurred a million years into the future. The way to preclude such an intrusion is continued control over the repository site, either through direct institutional control or through the effective passive transfer of information. A secondary aspect of the specific human intrusion scenario involved a breach through the side of the salt dome, through which radionuclides migrated via the ground-water system to the accessible environment. This provided a demonstration of the geotransport methodology that AEGIS can use in actual site evaluations, as well as the WRIT program's capabilities with respect to defining the source term and retardation rates of the radionuclides in the repository. This reference site analysis was initially published as a Working Document in December 1979. That version was distributed for a formal peer review by individuals and organizations not involved in its development. The present report represents a revision, based in part on the responses received from the external reviewers. Summaries of the comments from the reviewers and responses to these comments by the AEGIS staff are presented.
The exercise of the AEGIS methodology was successful in demonstrating the methodology and, thus, in providing a basis for substantive peer review, in terms of further development of the AEGIS site-applications capability and in terms of providing insight into the potential for consequential human intrusion into a salt dome repository.
Where Will All Your Samples Go?
NASA Astrophysics Data System (ADS)
Lehnert, K.
2017-12-01
Even in the digital age, physical samples remain an essential component of Earth and space science research. Geoscientists collect samples, sometimes locally, often in remote locations during expensive field expeditions, or at sample repositories and museums. They take these samples to their labs to describe and analyze them. When the analyses are completed and the results are published, the samples get stored away in sheds, basements, or desk drawers, where they remain unknown and inaccessible to the broad science community. In some cases, they will get re-analyzed or shared with other researchers, who know of their existence through personal connections. The sad end comes when the researcher retires: There are many stories of samples and entire collections being discarded to free up space for new samples or other purposes, even though these samples may be unique and irreplaceable. Institutions do not feel obligated and do not have the resources to store samples in perpetuity. Only samples collected in large sampling campaigns such as the Ocean Discovery Program or cores taken on ships find a home in repositories that curate and preserve them for reuse in future science endeavors. In the era of open, transparent, and reproducible science, preservation and persistent access to samples must be considered a mandate. Policies need to be developed that guide investigators, institutions, and funding agencies to plan and implement solutions for reliably and persistently curating and providing access to samples. Registration of samples in online catalogs and use of persistent identifiers such as the International Geo Sample Number are first steps to ensure discovery and access of samples. But digital discovery and access loses its value if the physical objects are not preserved and accessible. It is unreasonable to expect that every sample ever collected can be archived. Selection of those samples that are worth preserving requires guidelines and policies. We also need to define standards that institutions must comply with to function as a trustworthy sample repository similar to trustworthy digital repositories. The iSamples Research Coordination Network of the EarthCube program aims to address some of these questions in workshops planned for 2018. This panel session offers an opportunity to ignite the discussion.
Clark county monitoring program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conway, Sheila; Auger, Jeremy; Navies, Irene
2007-07-01
Available in abstract form only. Full text of publication follows: Since 1988, Clark County has been one of the counties designated by the United States Department of Energy (DOE) as an 'Affected Unit of Local Government' (AULG). The AULG designation is an acknowledgement by the federal government that the county could be negatively impacted to a considerable degree by activities associated with the Yucca Mountain High Level Nuclear Waste Repository. These negative effects would have an impact on residents as individuals and on the community as a whole. As an AULG, Clark County is authorized to identify 'any potential economic, social, public health and safety, and environmental impacts' of the potential repository (42 USC Section 10135(C)(1)(B)(1)). Toward this end, Clark County has conducted numerous studies of potential impacts, many of which are summarized in Clark County's Impact Assessment Report, submitted to the DOE and the President of the United States in February 2002. Given the unprecedented magnitude and duration of the DOE's proposal, as well as the many unanswered questions about the number of shipments and the modal mix, the estimates of impacts described in these studies are preliminary. In order to refine these estimates, the Clark County Comprehensive Planning Department's Nuclear Waste Division is continuing to assess potential impacts. In addition, the County has implemented a Monitoring Program designed to capture changes to the social, environmental, and economic well-being of its residents resulting from the Yucca Mountain project and other significant events within the County. The Monitoring Program acts as an 'early warning system' that allows Clark County decision makers to proactively respond to impacts from the Yucca Mountain Project. (authors)
Newman, John H; Rich, Stuart; Abman, Steven H; Alexander, John H; Barnard, John; Beck, Gerald J; Benza, Raymond L; Bull, Todd M; Chan, Stephen Y; Chun, Hyung J; Doogan, Declan; Dupuis, Jocelyn; Erzurum, Serpil C; Frantz, Robert P; Geraci, Mark; Gillies, Hunter; Gladwin, Mark; Gray, Michael P; Hemnes, Anna R; Herbst, Roy S; Hernandez, Adrian F; Hill, Nicholas S; Horn, Evelyn M; Hunter, Kendall; Jing, Zhi-Cheng; Johns, Roger; Kaul, Sanjay; Kawut, Steven M; Lahm, Tim; Leopold, Jane A; Lewis, Greg D; Mathai, Stephen C; McLaughlin, Vallerie V; Michelakis, Evangelos D; Nathan, Steven D; Nichols, William; Page, Grier; Rabinovitch, Marlene; Rich, Jonathan; Rischard, Franz; Rounds, Sharon; Shah, Sanjiv J; Tapson, Victor F; Lowy, Naomi; Stockbridge, Norman; Weinmann, Gail; Xiao, Lei
2017-06-15
The Division of Lung Diseases of the NHLBI and the Cardiovascular Medical Education and Research Fund held a workshop to discuss how to leverage the anticipated scientific output from the recently launched "Redefining Pulmonary Hypertension through Pulmonary Vascular Disease Phenomics" (PVDOMICS) program to develop newer approaches to pulmonary vascular disease. PVDOMICS is a collaborative, protocol-driven network to analyze all patient populations with pulmonary hypertension to define novel pulmonary vascular disease (PVD) phenotypes. Stakeholders, including basic, translational, and clinical investigators; clinicians; patient advocacy organizations; regulatory agencies; and pharmaceutical industry experts, joined to discuss the application of precision medicine to PVD clinical trials. Recommendations were generated for discussion of research priorities in line with NHLBI Strategic Vision Goals that include: (1) A national effort, involving all the stakeholders, should seek to coordinate biosamples and biodata from all funded programs to a web-based repository so that information can be shared and correlated with other research projects. Example programs sponsored by NHLBI include PVDOMICS, Pulmonary Hypertension Breakthrough Initiative, the National Biological Sample and Data Repository for PAH, and the National Precision Medicine Initiative. (2) A task force to develop a master clinical trials protocol for PVD to apply precision medicine principles to future clinical trials. Specific features include: (a) adoption of smaller clinical trials that incorporate biomarker-guided enrichment strategies, using adaptive and innovative statistical designs; and (b) development of newer endpoints that reflect well-defined and clinically meaningful changes. (3) Development of updated and systematic variables in imaging, hemodynamic, cellular, genomic, and metabolic tests that will help precisely identify individual and shared features of PVD and serve as the basis of novel phenotypes for therapeutic interventions.
McHugh, Seamus Mark; Corrigan, Mark; Dimitrov, Borislav; Cowman, Seamus; Tierney, Sean; Humphreys, Hilary; Hill, Arnold
2010-01-01
Surgical site infection accounts for 20% of all health care-associated infections (HCAIs); however, a program incorporating the education of surgeons has yet to be established across the specialty. An audit of surgical practice in infection prevention was carried out in Beaumont Hospital from July to November 2009. An educational Web site was developed targeting deficiencies highlighted in the audit. Interactive clinical cases were constructed using PHP, an HTML-embedded language, and then linked to a MySQL relational database. PowerPoint tutorials were produced as online Flash audiovisual movies. An online repository of streaming videos demonstrating best practice was made available, and weekly podcasts were made available on the iTunes© store for free download. Usage of the e-learning program was assessed quantitatively over 6 weeks in May and June 2010 using the commercial web-analytics service Hitslink. During the 5-month audit, deficiencies in practice were highlighted, including the timing of surgical prophylaxis (33% noncompliance) and intravascular catheter care in surgical patients (38% noncompliance regarding necessity). Over the 6-week assessment of the educational material, the SurgInfection.com Web pages were accessed more than 8000 times; 77.9% of the visitors were from Ireland. The most commonly accessed modality was the repository of interactive clinical cases, accounting for 3463 (43%) of the Web site visits. The average user spent 57 minutes per visit, with 30% of users visiting the Web site multiple times. Interactive virtual cases mirroring real-life clinical scenarios are likely to be successful as an e-learning modality. User-friendly interfaces and 24-hour accessibility will increase uptake by surgical trainees.
Extreme ground motions and Yucca Mountain
Hanks, Thomas C.; Abrahamson, Norman A.; Baker, Jack W.; Boore, David M.; Board, Mark; Brune, James N.; Cornell, C. Allin; Whitney, John W.
2013-01-01
Yucca Mountain is the designated site of the underground repository for the United States' high-level radioactive waste (HLW), consisting of commercial and military spent nuclear fuel, HLW derived from reprocessing of uranium and plutonium, surplus plutonium, and other nuclear-weapons materials. Yucca Mountain straddles the western boundary of the Nevada Test Site, where the United States has tested nuclear devices since the 1950s, and is situated in an arid, remote, and thinly populated region of Nevada, ~100 miles northwest of Las Vegas. Yucca Mountain was originally considered as a potential underground repository of HLW because of its thick units of unsaturated rocks, with the repository horizon being not only ~300 m above the water table but also ~300 m below the Yucca Mountain crest. The fundamental rationale for a geologic (underground) repository for HLW is to securely isolate these materials from the environment and its inhabitants to the greatest extent possible and for very long periods of time. Given the present climate conditions and what is known about the current hydrologic system and conditions around and in the mountain itself, one would anticipate that the rates of infiltration, corrosion, and transport would be very low—except for the possibility that repository integrity might be compromised by low-probability disruptive events, which include earthquakes, strong ground motion, and (or) a repository-piercing volcanic intrusion/eruption. Extreme ground motions (ExGM), as we use the phrase in this report, refer to the extremely large amplitudes of earthquake ground motion that arise at extremely low probabilities of exceedance (hazard). They first came to our attention when the 1998 probabilistic seismic hazard analysis for Yucca Mountain was extended to a hazard level of 10⁻⁸/yr (a 10⁻⁴ probability over a 10⁴-year repository "lifetime"). The primary purpose of this report is to summarize the principal results of the ExGM research program as they have developed over the past 5 years; what follows is focused on Yucca Mountain, but not restricted to it.
CMR Catalog Service for the Web
NASA Technical Reports Server (NTRS)
Newman, Doug; Mitchell, Andrew
2016-01-01
With the impending retirement of Global Change Master Directory (GCMD) Application Programming Interfaces (APIs) the Common Metadata Repository (CMR) was charged with providing a collection-level Catalog Service for the Web (CSW) that provided the same level of functionality as GCMD. This talk describes the capabilities of the CMR CSW API with particular reference to the support of the Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) Integrated Catalog (CWIC).
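Because CSW is an OGC standard, any conforming catalog can be queried with standard key-value parameters. The sketch below issues a CSW 2.0.2 GetCapabilities request with Python's requests library; the endpoint URL is a placeholder, since the abstract does not give one.

```python
import requests

# Placeholder endpoint: substitute the actual CMR CSW service URL.
CSW_ENDPOINT = "https://example.gov/csw"

# Standard OGC CSW 2.0.2 key-value-pair request; the response is an XML
# document advertising the service's operations and supported schemas.
params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetCapabilities",
}
response = requests.get(CSW_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])  # beginning of the capabilities document
```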
2016-06-01
Despite the proliferation of technology and near-global Internet accessibility, a web-based program incorporating interactive maps to record personal combat experiences does not exist. The Combat Stories Map addresses this deficiency. The Combat Stories Map is a web-based Geographic Information System specifically designed...
Wang, Dongwen; Luque, Amneris E
2016-01-01
The New York State HIV-HCV-STD Clinical Education Initiative (CEI) has developed a large repository of online resources and disseminated them to a wide range of healthcare providers. To evaluate the CEI online education program, and in particular to compare the self-reported measures by clinicians from different disciplines, we analyzed the data from 1,558 course completions in a study period of three months. The results have shown that the overall evaluations by the clinicians were very positive. Meanwhile, there were significant differences across the clinical disciplines. In particular, physicians and nurse practitioners were the most satisfied. In contrast, pharmacists and case/care managers recorded lower than average responses. Nurses and counselors had mixed results. Nurse practitioners' responses were very similar to physicians' on most measures, but significantly different from nurses' in many aspects. For more effective knowledge dissemination, online education programs should consider the unique needs of clinicians from specific disciplines.
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process, however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.
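The bounding idea described above can be illustrated with a toy traversal. The sketch below is not iDiSE or DeCAF itself; it simply restricts a transitive dependence walk to the methods in a developer's workspace, with an invented example graph.

```python
from collections import deque

# Toy program dependence graph: method -> methods that depend on it.
DEPENDENTS = {
    "parser.read":   ["parser.parse", "cli.main"],
    "parser.parse":  ["cli.main", "report.render"],
    "report.render": ["cli.main"],
}

def bounded_impact(changed, workspace_scope):
    """Transitively collect methods impacted by `changed`, but only
    follow methods inside `workspace_scope` (the development context)."""
    impacted, queue = set(), deque(changed)
    while queue:
        method = queue.popleft()
        for dep in DEPENDENTS.get(method, []):
            if dep in workspace_scope and dep not in impacted:
                impacted.add(dep)
                queue.append(dep)
    return impacted

# The workspace scope contains only parser.parse and report.render, so
# cli.main stays outside the bounded analysis even though it transitively
# depends on the change.
print(bounded_impact({"parser.read"}, {"parser.parse", "report.render"}))
# -> {'parser.parse', 'report.render'}
```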
An overview of tools for the validation of protein NMR structures.
Vuister, Geerten W; Fogh, Rasmus H; Hendrickx, Pieter M S; Doreleijers, Jurgen F; Gutmanas, Aleksandras
2014-04-01
Biomolecular structures at atomic resolution present a valuable resource for the understanding of biology. NMR spectroscopy accounts for 11% of all structures in the PDB repository. In response to serious problems with the accuracy of some NMR-derived structures, and to facilitate proper analysis of the experimental models, a number of program suites have been developed. We discuss nine of these tools in this review: PROCHECK-NMR, PSVS, GLM-RMSD, CING, MolProbity, Vivaldi, ResProx, NMR constraints analyzer, and QMEAN. We evaluate these programs for their ability to assess the structural quality, restraints and their violations, chemical shifts, peaks, and the handling of multi-model NMR ensembles. We document both the input required by the programs and the output they generate. To discuss their relative merits we have applied the tools to two representative examples from the PDB: a small, globular monomeric protein (Staphylococcal nuclease from S. aureus, PDB entry 2kq3) and a small, symmetric homodimeric protein (a region of human myosin-X, PDB entry 2lw9).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sutton, M; Blink, J A; Greenberg, H R
2012-04-25
The Used Fuel Disposition (UFD) Campaign within the Department of Energy's Office of Nuclear Energy (DOE-NE) Fuel Cycle Technology (FCT) program has been tasked with investigating the disposal of the nation's spent nuclear fuel (SNF) and high-level nuclear waste (HLW) for a range of potential waste forms and geologic environments. The planning, construction, and operation of a nuclear disposal facility is a long-term process that involves engineered barriers that are tailored to both the geologic environment and the waste forms being emplaced. The UFD Campaign is considering a range of fuel cycles that in turn produce a range of waste forms. The UFD Campaign is also considering a range of geologic media. These ranges could be thought of as adding uncertainty to what the disposal facility design will ultimately be; however, it may be preferable to think of the ranges as adding flexibility to the design of a disposal facility. For example, as the overall DOE-NE program and industrial actions result in the fuel cycles that will produce waste to be disposed, and the characteristics of those wastes become clear, the disposal program retains flexibility in both the choice of geologic environment and the specific repository design. Of course, other factors also play a major role, including local and State-level acceptance of the specific site that provides the geologic environment. In contrast, the Yucca Mountain Project (YMP) repository license application (LA) is based on waste forms from an open fuel cycle (PWR and BWR assemblies). These waste forms were about 90% of the total waste, and they were the determining waste form in developing the engineered barrier system (EBS) design for the Yucca Mountain Repository. About 10% of the repository capacity was reserved for waste from a full recycle fuel cycle in which some actinides were extracted for weapons use, and the remaining fission products and some minor actinides were encapsulated in borosilicate glass. Because the heat load of the glass was much less than that of the PWR and BWR assemblies, the glass waste form was able to be co-disposed with the open cycle waste, by interspersing glass waste packages among the spent fuel assembly waste packages. In addition, the Yucca Mountain repository was designed to include some research reactor spent fuel and naval reactor spent fuel, within the envelope that was set using the commercial reactor assemblies as the design basis waste form. This milestone report supports Sandia National Laboratory milestone M2FT-12SN0814052, and is intended to be a chapter in that milestone report. The independent technical review of this LLNL milestone was performed at LLNL and is documented in the electronic Information Management (IM) system at LLNL. The objective of this work is to investigate what aspects of quantifying, characterizing, and representing the uncertainty associated with the engineered barrier are affected by implementing different advanced nuclear fuel cycles (e.g., partitioning and transmutation scenarios) together with corresponding designs and thermal constraints.
Whitney, J.W.; Keefer, W.R.
2000-01-01
In recognition of a critical national need for permanent radioactive-waste storage, Yucca Mountain in southwestern Nevada has been investigated by Federal agencies since the 1970s as a potential geologic disposal site. In 1987, Congress selected Yucca Mountain for an expanded and more detailed site characterization effort. As an integral part of this program, the U.S. Geological Survey began a series of detailed geologic, geophysical, and related investigations designed to characterize the tectonic setting, fault behavior, and seismicity of the Yucca Mountain area. This document presents the results of 13 studies of the tectonic environment of Yucca Mountain, in support of a broad goal to assess the effects of future seismic and fault activity in the area on design, long-term performance, and safe operation of the potential surface and subsurface repository facilities.
Rural migration in Nevada: Lincoln County. Phase 1, 1992--1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soden, D.L.; Carns, D.E.; Mosser, D.
1993-12-31
The principal objective of this project was to develop insight into the scope of migration of working-age Nevadans out of their county of birth, including the collection of data on their skill levels, desire to out-migrate or in-migrate, interactions between families of migratory persons, and the impact that the proposed high-level nuclear waste repository at Yucca Mountain might have on their individual, and collective, decisions to migrate and return. The initial phase of this project reported here was conducted in 1992 and 1993 in Lincoln County, Nevada, one of the counties designated as "affected" by the proposed repository program. The findings suggest that a serious out-migration problem exists in Lincoln County, and that the Yucca Mountain project will likely affect decisions relating to migration patterns in the future.
The siting program of geological repository for spent fuel/high-level waste in Czech Republic
DOE Office of Scientific and Technical Information (OSTI.GOV)
Novotny, P.
1993-12-31
The management of high-level waste in the Czech Republic has a very short history, because before 1989 spent nuclear fuel was re-exported back to the USSR. The project "Geological research of HLW repository in Czech Republic" was initiated during 1990 by the Ministry of the Environment of the Czech Republic, which delegated it to the Czech Geological Survey (CGU) Prague. The first CGU project, late in 1990, proposed a multibarrier-concept geological repository located at a depth of about 500 m. Screening and studies of potential sites for the repository started in 1991. The first stage comprised regional screening of the Czech Republic for prospective rock types and massifs. In cooperation with GEOPHYSICS Co., the Geophysical Institute of the Czech Academy of Sciences, and Charles University Prague, 27 prospective regions were selected using IAEA criteria. This work in the Czech Republic was possible thanks to the detailed geological studies done in the past and to the numerous archive data concentrated in the central geological archive GEOFOND. Selection of prospective sites also respected nature conservation regions and regions protecting water and mineral water resources. CGU opened contacts with countries with similar geological situations and started cooperation with SKB (Swedish Nuclear Fuel and Waste Management Co.). The project of geological research for the next 10 years is a result of these activities.
Investigating the Thermal Limit of Clay Minerals for Applications in Nuclear Waste Repository Design
NASA Astrophysics Data System (ADS)
Matteo, E. N.; Miller, A. W.; Kruichak, J.; Mills, M.; Tellez, H.; Wang, Y.
2013-12-01
Clay minerals are likely candidates to aid in nuclear waste isolation due to their low permeability, favorable swelling properties, and high cation sorption capacities. Establishing the thermal limit for clay minerals in a nuclear waste repository is a potentially important component of repository design, as flexibility in the heat load within the repository can have a major impact on the selection of repository design. For example, the thermal limit plays a critical role in the time that waste packages would need to cool before being transferred to the repository. Understanding the chemical and physical changes that occur in clay minerals at various temperatures above the current thermal limit (of 100 °C) can provide decision-makers with information critical to evaluating the potential trade-offs of increasing the thermal limit within the repository. Most critical is gaining understanding of how varying thermal conditions in the repository will impact radionuclide sorption and transport in clay materials, either as engineered barriers or as disposal media. A variety of clays (illite, mixed-layer illite/smectite, montmorillonite, and palygorskite) were heated at temperatures between 100 and 500 °C. These samples were characterized by a variety of methods, including nitrogen adsorption, X-ray diffraction, thermogravimetric analysis, barium chloride exchange for cation exchange capacity (CEC), and iodide sorption. The nitrogen porosimetry shows that for all the clays, thermally-induced changes in BET surface area are dominated by collapse/creation of the microporosity, i.e., pore diameters < 17 angstroms. Changes in microporosity (relative to no heat treatment) are most significant for heat treatments of 300 °C and above. Alterations are also seen in the chemical properties (CEC, XRD, iodide sorption) of the clays and, like the pore size distribution changes, are most significant above 300 °C. Overall, the results imply that changes seen in pore size distribution correlate with cation exchange capacity and cation exchange processes. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND Number: 2013-6352A.
Fenwick, Matthew; Sesanker, Colbert; Schiller, Martin R.; Ellis, Heidi JC; Hinman, M. Lee; Vyas, Jay; Gryk, Michael R.
2012-01-01
Scientists are continually faced with the need to express complex mathematical notions in code. The renaissance of functional languages such as LISP and Haskell is often credited to their ability to implement complex data operations and mathematical constructs in an expressive and natural idiom. The slow adoption of functional computing in the scientific community does not, however, reflect the congeniality of these fields. Unfortunately, the learning curve for adoption of functional programming techniques is steeper than that for more traditional languages in the scientific community, such as Python and Java, and this is partially due to the relative sparseness of available learning resources. To fill this gap, we demonstrate and provide applied, scientifically substantial examples of functional programming. We present a multi-language source-code repository for software integration and algorithm development, which generally focuses on the fields of machine learning, data processing, and bioinformatics. We encourage scientists who are interested in learning the basics of functional programming to adopt, reuse, and learn from these examples. The source code is available at: https://github.com/CONNJUR/CONNJUR-Sandbox (see also http://www.connjur.org). PMID:25328913
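For flavor, here is a small functional-style pipeline in Python, one of the "traditional" languages the abstract mentions. The data and transformations are invented for illustration and are not taken from the CONNJUR-Sandbox repository.

```python
from functools import reduce

# A tiny data-processing pipeline written functionally: no loops,
# no mutation, just composed transformations over an input sequence.
readings = [0.8, 2.3, -1.0, 4.2, 3.7, -0.5]

cleaned = filter(lambda x: x >= 0, readings)         # drop invalid values
scaled = map(lambda x: x * 10.0, cleaned)            # unit conversion
total = reduce(lambda acc, x: acc + x, scaled, 0.0)  # fold into a sum

print(total)  # 110.0
```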
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marietta, Melvin Gary; Anderson, D. Richard; Bonano, Evaristo J.
2011-11-01
Sandia National Laboratories (SNL) is the world leader in the development of the detailed science underpinning the application of a probabilistic risk assessment methodology, referred to in this report as performance assessment (PA), for (1) understanding and forecasting the long-term behavior of a radioactive waste disposal system, (2) estimating the ability of the disposal system and its various components to isolate the waste, (3) developing regulations, (4) implementing programs to estimate the safety that the system can afford to individuals and to the environment, and (5) demonstrating compliance with the attendant regulatory requirements. This report documents the evolution of the SNL PA methodology from inception in the mid-1970s, summarizing major SNL PA applications including: the Subseabed Disposal Project PAs for high-level radioactive waste; the Waste Isolation Pilot Plant PAs for disposal of defense transuranic waste; the Yucca Mountain Project total system PAs for deep geologic disposal of spent nuclear fuel and high-level radioactive waste; PAs for the Greater Confinement Borehole Disposal boreholes at the Nevada National Security Site; and PA evaluations for disposal of high-level wastes and Department of Energy spent nuclear fuels stored at Idaho National Laboratory. In addition, the report summarizes smaller PA programs for long-term cover systems implemented for the Monticello, Utah, mill-tailings repository; a PA for the SNL Mixed Waste Landfill in support of environmental restoration; PA support for radioactive waste management efforts in Egypt, Iraq, and Taiwan; and, most recently, PAs for analysis of alternative high-level radioactive waste disposal strategies, including deep borehole disposal and geologic repositories in shale and granite. Finally, this report summarizes the extension of the PA methodology for radioactive waste disposal toward development of an enhanced PA system for carbon sequestration and storage systems. These efforts have produced a generic PA methodology for the evaluation of waste management systems that has gained wide acceptance within the international community. This report documents how this methodology has been used as an effective management tool to evaluate different disposal designs and sites; inform development of regulatory requirements; identify, prioritize, and guide research aimed at reducing uncertainties for objective estimations of risk; and support safety assessments.
Best Practices in NASA's Astrophysics Education and Public Outreach Projects
NASA Astrophysics Data System (ADS)
Hasan, H.; Smith, D.
2015-11-01
NASA's Astrophysics Education and Public Outreach (EPO) program has partnered scientists and educators since its inception almost twenty years ago, leading to authentic STEM experiences and products widely used by the education and outreach community. We present examples of best practices and representative projects. Keys to success include effective use of unique mission science/technology, attention to audience needs, coordination of effort, robust partnerships and publicly accessible repositories of EPO products. Projects are broadly targeted towards audiences in formal education, informal education, and community engagement. All NASA programs are evaluated for quality and impact. New technology is incorporated to engage young students being raised in the digital age. All projects focus on conveying the excitement of scientific discoveries from NASA's Astrophysics missions, advancing scientific literacy, and engaging students in science and technology careers.
Recent technology products from Space Human Factors research
NASA Technical Reports Server (NTRS)
Jenkins, James P.
1991-01-01
The goals of the NASA Space Human Factors program and the research carried out concerning human factors are discussed, with emphasis given to the development of human performance models, data, and tools. The major products from this program are described, which include the Laser Anthropometric Mapping System; a model of the human body for evaluating the kinematics and dynamics of human motion and strength in the microgravity environment; an operational experience database for verifying and validating the data repository of manned space flights; the Operational Experience Database Taxonomy; and a human-computer interaction laboratory whose products include display software and requirements, and guideline documents and standards for human-computer interaction applications. Special attention is given to the 'Convoltron', a prototype version of a signal processor for synthesizing head-related transfer functions.
Project characteristics monitoring report: BWIP (Basalt Waste Isolation Program) repository project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedli, E.A.; Herborn, D.I.; Taylor, C.D.
1988-03-01
This monitoring report has been prepared to show compliance with provisions of the Nuclear Waste Policy Act of 1982 (NWPA) and to provide local and state government agencies with information concerning the Basalt Waste Isolation Program (BWIP). This report contains data for the time period May 26, 1986 to February 1988. The data include employment figures, salaries, project purchases, taxes and fees paid, worker survey results, and project closedown personal interview summaries. This information has become particularly important since the decision in December 1987 to stop all BWIP activities except those for site reclamation. The Nuclear Waste Policy Amendments Act of 1987 requires nonreclamation work at the Hanford Site to stop as of March 22, 1988. 7 refs., 6 figs., 28 tabs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malone, C.R.
1995-09-01
The US Department of Energy (DOE) is proposing to develop a geologic repository for disposing of high-level nuclear waste at Yucca Mountain, Nevada. In this commentary, the ecology program for the DOE's Yucca Mountain Project is discussed from the perspective of state-of-the-art ecosystem analysis, environmental ethics, and standards of professional practice. Specifically at issue is the need for the Yucca Mountain ecology program to adopt an ecosystem approach that encompasses the current strategy based on population biology and community ecology alone. The premise here is that an ecosystem approach is essential for assessing the long-term potential environmental impacts at Yucca Mountain in light of the thermal effects expected to be associated with heat from radioactive decay.
NASA Astrophysics Data System (ADS)
Goto, J.; Miwa, T.; Tsuchi, H.; Karasaki, K.
2009-12-01
The Nuclear Waste Management Organization of Japan (NUMO), after volunteer municipalities come forward, will start a three-staged program for selecting a HLW and TRU waste repository site. Experience from various site characterization programs around the world shows that the hydrologic property of faults is one of the most important parameters in the early stage of such a program. Numerous faults of interest can be expected in an investigation area of several tens of square kilometers. It is, however, impossible to characterize all these faults in a limited time and budget. This raises the problem, for repository design and safety assessment, that we may have to accept unrealistic or overly conservative results by using a single model or parameter set for all the faults in the area. We therefore seek to develop an efficient and practical methodology to characterize the hydrologic property of faults. This project is a five-year program started in 2007, and comprises the basic methodology development through a literature study and its verification through field investigations. The literature study tries to classify faults by correlating their geological features with hydraulic properties, to search for the most efficient technology for fault characterization, and to develop a work flow diagram. The field investigation starts from selection of a site and fault(s), followed by analyses of existing site data, surface geophysics, geological mapping, trenching, water sampling, a series of borehole investigations, and modeling/analyses. Based on the results of the field investigations, we plan to develop a systematic hydrologic characterization methodology for faults. A classification method that correlates combinations of geological features (rock type, fault displacement, fault type, position in a fault zone, fracture zone width, damage zone width) with widths of high-permeability zones around a fault zone was proposed through a survey of available documents from the site characterization programs. The field investigation started in 2008, selecting as the target the Wildcat Fault, which cuts across the Lawrence Berkeley National Laboratory (LBNL) site. Analyses of site-specific data, surface geophysics, geological mapping, and trenching have confirmed the approximate location and characteristics of the fault (see Session H48, Onishi, et al.). The plan for the remaining years includes borehole investigations at LBNL, and another series of investigations in the northern part of the Wildcat Fault.
Generic repository design concepts and thermal analysis (FY11).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howard, Robert; Dupont, Mark; Blink, James A.
2011-08-01
Reference concepts for geologic disposal of used nuclear fuel and high-level radioactive waste in the U.S. are developed, including geologic settings and engineered barriers. Repository thermal analysis is demonstrated for a range of waste types from projected future, advanced nuclear fuel cycles. The results show significant differences among the geologic media considered (clay/shale, crystalline rock, salt), and also that waste package size and waste loading must be limited to meet targeted maximum temperature values. In this study, the UFD R&D Campaign has developed a set of reference geologic disposal concepts for a range of waste types that could potentially be generated in advanced nuclear fuel cycles. A disposal concept consists of three components: waste inventory, geologic setting, and concept of operations. Mature repository concepts have been developed in other countries for disposal of spent LWR fuel and HLW from reprocessing UNF, and these serve as starting points for developing this set. Additional design details and EBS concepts will be considered as the reference disposal concepts evolve. The waste inventory considered in this study includes: (1) direct disposal of SNF from the LWR fleet, including Gen III+ advanced LWRs being developed through the Nuclear Power 2010 Program, operating in a once-through cycle; (2) waste generated from reprocessing of LWR UOX UNF to recover U and Pu, and subsequent direct disposal of used Pu-MOX fuel (also used in LWRs) in a modified-open cycle; and (3) waste generated by continuous recycling of metal fuel from fast reactors operating in a TRU burner configuration, with additional TRU material input supplied from reprocessing of LWR UOX fuel. The geologic setting provides the natural barriers, and establishes the boundary conditions for performance of engineered barriers. The composition and physical properties of the host medium dictate design and construction approaches, and determine hydrologic and thermal responses of the disposal system. Clay/shale, salt, and crystalline rock media are selected as the basis for reference mined geologic disposal concepts in this study, consistent with advanced international repository programs and previous investigations in the U.S. The U.S. pursued deep geologic disposal programs in crystalline rock, shale, salt, and volcanic rock in the years leading up to the Nuclear Waste Policy Act, or NWPA (Rechard et al. 2011). The 1987 NWPA amendment act focused the U.S. program on unsaturated, volcanic rock at the Yucca Mountain site, culminating in the 2008 license application. Additional work on unsaturated, crystalline rock settings (e.g., volcanic tuff) is not required to support this generic study. Reference disposal concepts are selected for the media listed above and for deep borehole disposal, drawing from recent work in the U.S. and internationally. The main features of the repository concepts are discussed in Section 4.5 and summarized in Table ES-1. Temperature histories at the waste package surface and at a specified distance into the host rock are calculated for combinations of waste types and reference disposal concepts, specifying waste package emplacement modes. Target maximum waste package surface temperatures are identified, enabling a sensitivity study to inform the tradeoff between the quantity of waste per disposal package and decay storage duration, with respect to peak temperature at the waste package surface. For surface storage durations on the order of 100 years or less, waste package sizes for direct disposal of SNF are effectively limited to 4-PWR configurations (or equivalent size and output). Thermal results are summarized, along with recommendations for follow-on work, including adding reference concepts, verification and uncertainty analysis for thermal calculations, developing descriptions of surface facilities and other system details, and cost estimation to support system-level evaluations.
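The package-loading tradeoff described above (more assemblies per package versus longer decay storage, subject to a surface-temperature target) can be sketched with a deliberately simplified lumped model. Every number and the decay-heat curve below are placeholder assumptions for illustration, not values from the study.

```python
# Toy tradeoff: assemblies per package vs. surface-storage time, under a
# waste-package surface temperature target. All numbers are placeholders.
AMBIENT_C = 25.0    # assumed host-rock ambient temperature, deg C
R_THERMAL = 0.08    # assumed lumped package-to-rock resistance, deg C per W
T_TARGET_C = 200.0  # assumed maximum package surface temperature, deg C

def decay_heat_per_assembly(years_out_of_reactor):
    """Placeholder decay-heat curve (W per assembly), decaying with time."""
    return 2000.0 * (years_out_of_reactor / 10.0) ** -0.75

def max_assemblies(years):
    """Largest package loading keeping the lumped surface-temperature
    estimate T = ambient + R * n * Q(t) below the target."""
    q = decay_heat_per_assembly(years)
    return int((T_TARGET_C - AMBIENT_C) / (R_THERMAL * q))

for years in (10, 50, 100, 200):
    print(years, "years out of reactor ->", max_assemblies(years), "assemblies")
```

Even with these toy inputs, the qualitative behavior matches the abstract: longer decay storage lowers per-assembly heat, which permits larger package loadings for the same surface-temperature target.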
Getting Beyond Yucca Mountain - 12305
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halstead, Robert J.; Williams, James M.
2012-07-01
The U.S. Department of Energy has terminated the Yucca Mountain repository project. The U.S. Nuclear Regulatory Commission has indefinitely suspended the Yucca Mountain licensing proceeding. The presidentially-appointed Blue Ribbon Commission (BRC) on America's Nuclear Future is preparing a report, due in January 2012, to the Secretary of Energy on recommendations for a new national nuclear waste management and disposal program. The BRC Draft Report published in July 2011 provides a compelling critique of the past three decades' failed efforts in the United States to site storage and disposal facilities for spent nuclear fuel (SNF) and high-level radioactive waste (HLW). However, the BRC Draft Report fails to provide detailed guidance on how to implement an alternative, successful approach to facility site selection. The comments submitted to the BRC by the State of Nevada Agency for Nuclear Projects provide useful details on how the US national nuclear waste program can get beyond the failed Yucca Mountain repository project. A detailed siting process, consisting of legislative elements, procedural elements, and 'rules' for volunteer sites, could meet the objectives of the BRC and the Western Governors Association (WGA), while promoting and protecting the interests of potential host states. The recent termination of the proposed Yucca Mountain repository provides both an opportunity and a need to re-examine the United States' nuclear waste management program. It is anticipated that the BRC final report in January 2012 will recommend a new general course of action, but there will likely continue to be a need for detailed guidance on how to implement an alternative, successful approach to facility site selection. Getting the nation's nuclear waste program back on track requires, among other things, new principles for siting: principles based on partnership between the federal implementing agency and prospective host states. These principles apply to the task of developing an integrated waste management strategy, to interactions between the federal government and prospective host states for consolidated storage and disposal facilities, and to the logistically and politically complicated task of transportation system design. Lessons from the past 25 years, in combination with fundamental parameters of the nuclear waste management task in the US, suggest the new principles for partnership outlined in this paper. These principles will work better if well-grounded and firm guidelines are set out beforehand and if the challenge of maintaining competence, transparency, and integrity in the new organization is treated as a problem to be addressed rather than a result to be expected. (authors)
Telecommunications Network Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1989-05-01
The Office of Civilian Radioactive Waste Management (OCRWM) must, among other things, be equipped to readily produce, file, store, access, retrieve, and transfer a wide variety of technical and institutional data and information. The data and information regularly produced by members of the OCRWM Program supports, and will continue to support, a wide range of program activities. Some of the more important of these information communication-related activities include: supporting the preparation, submittal, and review of a license application to the Nuclear Regulatory Commission (NRC) to authorize the construction of a geologic repository; responding to requests for information from parties affected by and/or interested in the program; and providing evidence of compliance with all relevant Federal, State, local, and Indian Tribe regulations, statutes, and/or treaties. The OCRWM Telecommunications Network Plan (TNP) is intended to identify, as well as to present the current strategy for satisfying, the telecommunications requirements of the civilian radioactive waste management program. The TNP will set forth the plan for integrating OCRWM's information resources among major program sites. Specifically, this plan will introduce a telecommunications network designed to establish communication linkages across the program's Washington, DC; Chicago, Illinois; and Las Vegas, Nevada, sites. The linkages across these and associated sites will comprise Phase I of the proposed OCRWM telecommunications network. The second phase will focus on the modification and expansion of the Phase I network to fully accommodate access to the OCRWM Licensing Support System (LSS). The primary components of the proposed OCRWM telecommunications network include local area networks; extended local area networks; and remote extended (wide) area networks. 10 refs., 6 figs.
2010-04-01
isotopes. Laboratory analysis for general chemistry included Na, Ca, Mg, K, Fe, Cl, HCO3, CO3, SO4, F, B, NO3, arsenic (As), hardness, alkalinity ... used for interpretations within the project. Prior to this effort, a single-location repository for isotopic data related to IWV investigations ... canyons of importance to this study (Indian Wells Canyon, Freeman Canyon, and the upgradient canyons of Cow Haven, Sage, and Horse). Single samples
Configuration Analysis Tool (CAT). System Description and users guide (revision 1)
NASA Technical Reports Server (NTRS)
Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.
1982-01-01
A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.
User-Computer Interactions: Some Problems for Human Factors Research
1981-09-01
accessibility, from the workplace or home, of information stored in major repositories; two-way real-time communication between broadcasting facilities ... R. Miller and R.W. Pew (BBN Inc.), contract MDA 903-80-C-0551 ... the average U.S. home has gone from about 10 in 1940 to about 100 in 1960 to a few thousand in 1980. Collectively, these trends represent an enormous
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robitz, E.S. Jr.; McAninch, M.D. Jr.; Edmonds, D.P.
1990-09-01
This report summarizes Phase 1 activities for closure development of the high-level nuclear waste package task for the tuff repository. Work was conducted under U.S. Department of Energy (DOE) Contract 9172105, administered through the Lawrence Livermore National Laboratory (LLNL), as part of the Yucca Mountain Project (YMP), funded through the DOE Office of Civilian Radioactive Waste Management (OCRWM). The goal of this phase was to select five closure processes for further evaluation in later phases of the program. A decision tree methodology was utilized to perform an objective evaluation of 15 potential closure processes. Information was gathered via a literature survey, industrial contacts, and discussions with project team members, other experts in the field, and the LLNL waste package task staff. The five processes selected were friction welding, electron beam welding, laser beam welding, gas tungsten arc welding, and plasma arc welding. These are felt to represent the best combination of weldment material properties and process performance in a remote, radioactive environment. Conceptual designs have been generated for these processes to illustrate how they would be implemented in practice. Homopolar resistance welding was included in the Phase 1 analysis, and developments in this process will be monitored via literature in Phases 2 and 3. Work was conducted in accordance with the YMP Quality Assurance Program. 223 refs., 20 figs., 9 tabs.
Wei, Yifeng; Kutcher, Stan; LeBlanc, John C.
2015-01-01
Introduction: Youth suicide is highly related to mental disorders. While communities and schools are marketed a plethora of suicide prevention programs, they often lack the capacity to choose evidence-based programs. Methods: We conducted a systematic review of two youth suicide prevention programs to help determine if the quality of evidence available justifies their widespread dissemination. We searched Medline, PsycINFO, EMBASE, CINAHL, the Cochrane Library, Campbell Collaboration SPECTR database, SocIndex, Sociological Abstracts, Social Services Abstracts, ERIC, Social Work Abstracts, Research Library, and Web of Science for relevant studies. We included studies/systematic reviews/meta-analyses that evaluated the effectiveness, cost-effectiveness, and/or safety of the Signs of Suicide (SOS) and Yellow Ribbon (YR) suicide prevention programs that target adolescents. We applied the Office of Justice Programs What Works Repository (OJP-R) to rank the quality of the included studies as effective, effective with reservation, promising, inconclusive evidence, insufficient evidence, or ineffective. Two SOS studies were ranked as "inconclusive evidence" based on the OJP-R. One SOS study was ranked as having "insufficient evidence" on the OJP-R. The YR study was ranked as "ineffective" using the OJP-R. We only included studies in peer-reviewed journals in English and therefore may have missed reports in grey literature or non-English publications. Results: We cannot recommend that schools and communities implement either the SOS or YR suicide prevention programs. Purchasers of these programs should be aware that there is no evidence that their use prevents suicide. Conclusions: Academics and organizations should not overstate the positive impacts of suicide prevention interventions when the evidence is lacking. PMID:26336375
Site characterization report for the basalt waste isolation project. Volume II
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1982-11-01
The reference location for a repository in basalt for the terminal storage of nuclear wastes on the Hanford Site and the candidate horizons within this reference repository location have been identified, and the preliminary characterization work in support of the site screening process has been completed. Fifteen technical questions regarding the qualification of the site were identified to be addressed during the detailed site characterization phase of the US Department of Energy-National Waste Terminal Storage Program site selection process. Resolution of these questions will be provided in the final site characterization progress report, currently planned to be issued in 1987, and in the safety analysis report to be submitted with the License Application. The additional information needed to resolve these questions and the plans for obtaining the information have been identified. This Site Characterization Report documents the results of the site screening process, the preliminary site characterization data, the technical issues that need to be addressed, and the plans for resolving these issues. Volume 2 contains chapters 6 through 12: geochemistry; surface hydrology; climatology, meteorology, and air quality; environmental, land-use, and socioeconomic characteristics; repository design; waste package; and performance assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Heui-Joo; Lee, Jong Youl; Choi, Jongwon
2007-07-01
The development of a Korean Reference Disposal System for the spent fuels from PWR and CANDU reactors is outlined in this paper. Around 36,000 tU of spent fuel is projected based on the lifetimes of 28 nuclear power reactors in Korea. Since the site for geological disposal has not yet been decided, a hypothetical site with representative Korean geologic conditions is proposed for the conceptual design of the repository. The disposal rates of the spent fuels are determined according to the total operation time of 55 years. The canisters are optimized by considering natural Korean conditions, and the buffer is designed with domestic Ca-bentonite. The depth of the repository is determined to be 500 m below the ground surface. The canister separation distances are determined through a thermal analysis. The main features of the repository are presented from the layout to the closure. A computer program has been developed to calculate and analyze the volume and the area of the disposal system to help in the cost analysis. The final output of the design is presented as a unit disposal cost of US $315/kgU. (authors)
Geotechnical support and topical studies for nuclear waste geologic repositories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1989-01-01
The present report lists the technical reviews and comments made during fiscal year 1988 and summarizes the technical progress of the topical studies. In the area of technical assistance, there were numerous activities, detailed in the next section. These included 24 geotechnical support activities: reviews of 6 Study Plans (SP) and participation in 6 SP Review Workshops; review of one whole-document Site Characterization Plan (SCP) and participation in the Assembled Document SCP Review Workshops by 6 LBL reviewers; the hosting of a DOE program review; the rewriting of the project statement of work; 2 trips to technical and planning meetings; preparation of proposed work statements for two new topics for DOE; and 5 instances of technical assistance to DOE. These activities are described in a table in the following section, entitled "Geoscience Technical Support for Nuclear Waste Geologic Repositories."
The NIH BD2K center for big data in translational genomics
Paten, Benedict; Diekhans, Mark; Druker, Brian J; Friend, Stephen; Guinney, Justin; Gassner, Nadine; Guttman, Mitchell; James Kent, W; Mantey, Patrick; Margolin, Adam A; Massie, Matt; Novak, Adam M; Nothaft, Frank; Pachter, Lior; Patterson, David; Smuga-Otto, Maciej; Stuart, Joshua M; Van’t Veer, Laura; Haussler, David
2015-01-01
The world’s genomics data will never be stored in a single repository – rather, it will be distributed among many sites in many countries. No one site will have enough data to explain genotype to phenotype relationships in rare diseases; therefore, sites must share data. To accomplish this, the genetics community must forge common standards and protocols to make sharing and computing data among many sites a seamless activity. Through the Global Alliance for Genomics and Health, we are pioneering the development of shared application programming interfaces (APIs) to connect the world’s genome repositories. In parallel, we are developing an open source software stack (ADAM) that uses these APIs. This combination will create a cohesive genome informatics ecosystem. Using containers, we are facilitating the deployment of this software in a diverse array of environments. Through benchmarking efforts and big data driver projects, we are ensuring ADAM’s performance and utility. PMID:26174866
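As a hedged illustration of what a shared genomics API looks like in practice, the sketch below posts a JSON search query to a GA4GH-style HTTP endpoint. The base URL, path, and payload fields are hypothetical placeholders, not the actual Global Alliance schema.

```python
import requests

# Hypothetical GA4GH-style repository endpoint; real deployments publish
# their own base URLs and schema versions.
BASE_URL = "https://genomics.example.org/api"

# A search-style request: POST a JSON query, read back JSON results.
query = {
    "datasetId": "example-dataset",  # placeholder identifier
    "referenceName": "chr1",
    "start": 100000,
    "end": 101000,
    "pageSize": 100,
}
response = requests.post(f"{BASE_URL}/variants/search", json=query, timeout=30)
response.raise_for_status()
for variant in response.json().get("variants", []):
    print(variant.get("id"), variant.get("start"))
```

The design point the abstract makes is that once every repository speaks the same request/response schema, a client like this can page through data from many sites without site-specific code.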
The NASA Ames Life Sciences Data Archive: Biobanking for the Final Frontier
NASA Technical Reports Server (NTRS)
Rask, Jon; Chakravarty, Kaushik; French, Alison J.; Choi, Sungshin; Stewart, Helen J.
2017-01-01
The NASA Ames Institutional Scientific Collection comprises the Ames Life Sciences Data Archive (ALSDA) and a biospecimen repository, which are responsible for archiving information and non-human biospecimens collected from spaceflight and matching ground-control experiments. The ALSDA also manages a biospecimen sharing program, performs curation and long-term storage operations, and facilitates distribution of biospecimens for research purposes via a public website (https://lsda.jsc.nasa.gov). As part of our best practices, a tissue viability testing plan has been developed for the repository, which will assess the quality of samples subjected to long-term storage. We expect that the test results will confirm usability of the samples, enable broader science community interest, and verify operational efficiency of the archives. This work will also support NASA open science initiatives and guide development of NASA directives and policy for curation of biological collections.
Registered File Support for Critical Operations Files at (Space Infrared Telescope Facility) SIRTF
NASA Technical Reports Server (NTRS)
Turek, G.; Handley, Tom; Jacobson, J.; Rector, J.
2001-01-01
The SIRTF Science Center's (SSC) Science Operations System (SOS) must manage nearly one hundred critical operations files through comprehensive file management services. The management is accomplished via the registered file system (otherwise known as TFS), which manages these files in a registered file repository composed of a virtual file system accessible via a TFS server and a file registration database. The TFS server provides controlled, reliable, and secure file transfer and storage by registering all file transactions and meta-data in the file registration database. An API is provided for application programs to communicate with TFS servers and the repository. A command line client implementing this API has been developed as a client tool. This paper describes the architecture and current implementation, but more importantly, the evolution of these services based on evolving community use cases and emerging information system technology.
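The registered-file idea, in which every transfer is recorded with its metadata in a registration database, can be sketched compactly. This is not the SSC's TFS implementation; the schema and fields below are assumptions chosen to mirror the description above.

```python
import hashlib
import sqlite3
import time
from pathlib import Path

# Stand-in registration database (the TFS design pairs a file store with
# a registration database; sqlite3 plays that role in this sketch).
db = sqlite3.connect(":memory:")
db.execute("""
CREATE TABLE file_registry (
    name       TEXT PRIMARY KEY,
    sha256     TEXT NOT NULL,
    size_bytes INTEGER NOT NULL,
    registered REAL NOT NULL    -- UNIX timestamp of the transaction
)
""")

def register(path: Path) -> None:
    """Record a file transfer: checksum and metadata go into the registry,
    making every transaction auditable and verifiable later."""
    data = path.read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    db.execute(
        "INSERT INTO file_registry VALUES (?, ?, ?, ?)",
        (path.name, digest, len(data), time.time()),
    )

# Usage (hypothetical file name): call register(Path("sequence_plan.txt"))
# after copying the file into the repository's virtual file system.
```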
Consolidated Storage Facilities: Camel's Nose or Shared Burden? - 13112
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, James M.
2013-07-01
The Blue Ribbon Commission (BRC) made a strong argument why the reformulated nuclear waste program should make prompt efforts to develop one or more consolidated storage facilities (CSFs), and recommended the amendment of NWPA Section 145(b) (linking 'monitored retrievable storage' to repository development) as an essential means to that end. However, other than recommending that the siting of CSFs should be 'consent-based' and that spent nuclear fuel (SNF) at stranded sites should be first in line for removal, the Commission made few recommendations regarding how CSF development should proceed. Working with three other key Senators, Jeff Bingaman attempted in the 112th Congress to craft legislation (S. 3469) to put the BRC recommendations into legislative language. The key reason why the Nuclear Waste Administration Act of 2012 did not proceed was the inability of the four senators to agree on whether and how to amend NWPA Section 145(b). A brief review of efforts to site consolidated storage since the Nuclear Waste Policy Amendments Act of 1987 suggests a strong and consistent motivation to shift the burden to someone (anyone) else. This paper argues that modification of NWPA Section 145(b) should be accompanied by guidelines for regional development and operation of CSFs. After reviewing the BRC recommendations regarding CSFs, and the 'camel's nose' prospects if implementation is not accompanied by further guidelines, the paper outlines a proposal for implementation of CSFs on a regional basis, including priorities for removal from reactor sites and subsequently from CSFs to repositories. Rather than allowing repository siting to be prejudiced by the location of a single remote CSF, the regional approach limits transport for off-site acceptance and storage, increases the efficiency of removal operations, provides a useful basis for compensation to states and communities that accept CSFs, and gives states with shared circumstances a shared stake in storage and disposal in an integrated national program. (authors)
NA-42 TI Shared Software Component Library FY2011 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.
The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts, with the long-term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. First, PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy-to-use web-based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed from two different geographic locations, and continues to be used. The knowledge gained from the collaboration and hosting of this repository, in conjunction with PNNL software development and systems engineering capabilities, was used in the selection of a package for the implementation of the software component library on behalf of NA-42 TI. The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI in documenting their development activities. When sufficiently completed, the questionnaire illustrates that the software development activities recorded incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams, and revised versions are distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID/AMS-specific questionnaire are being used as the initial content to be established in the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and install the software needed to host the component library. Additionally, it defines the process by which users request access for the contribution and retrieval of library content.
PAZAR: a framework for collection and dissemination of cis-regulatory sequence annotation.
Portales-Casamar, Elodie; Kirov, Stefan; Lim, Jonathan; Lithwick, Stuart; Swanson, Magdalena I; Ticoll, Amy; Snoddy, Jay; Wasserman, Wyeth W
2007-01-01
PAZAR is an open-access and open-source database of transcription factor and regulatory sequence annotation with an associated web interface and programming tools for data submission and extraction. Curated boutique data collections can be maintained and disseminated through the unified schema of the mall-like PAZAR repository. The Pleiades Promoter Project collection of brain-linked regulatory sequences is introduced to demonstrate the depth of annotation possible within PAZAR. PAZAR, located at http://www.pazar.info, is open for business. PMID:17916232
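The abstract mentions programming tools for data extraction. As a hedged sketch only (the export endpoint and query parameters below are hypothetical placeholders, not PAZAR's documented interface), scripted extraction could reduce to a simple HTTP download of an annotation set:

```python
# Hypothetical extraction sketch; the endpoint and parameters are illustrative
# placeholders, not PAZAR's actual programming interface.
import urllib.parse
import urllib.request

base = "http://www.pazar.info/export"                  # hypothetical endpoint
params = {"collection": "pleiades", "format": "gff"}   # hypothetical parameters

url = base + "?" + urllib.parse.urlencode(params)
with urllib.request.urlopen(url) as resp:
    annotations = resp.read().decode("utf-8")
print(annotations[:200])  # first few lines of the downloaded annotation set
```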
Depleted Uranium Program: Repository and Chemical Analysis of Biological Samples
2010-11-01
Chemical samples: chemical pathology and analytical assessment of U and DU in tissues, urine, whole blood, semen, and embedded fragments...preparation for determination of total uranium and isotopic uranium ratios; semen total uranium dry-ashed with concentrated nitric acid in a muffle furnace... [Figure: total uranium and DU measurements in blood and semen, measured vs. theoretical U, in ng U per sample by sample number.]
ASV3 dial-in interface recommendation for the Repository Based Software Engineering (RBSE) program
NASA Technical Reports Server (NTRS)
1992-01-01
The purpose of this report is to provide insight into the approach and design of the Cooperative User Interface (CUI). The CUI is being developed based on HyperCard technology and will provide the same look and feel as the NASA Electronic Library System (NELS) X-Window interface. The interaction between the user and ASCII-LIB is presented, as well as the set of HyperCard cards with which the user will work.
Analysis of the Navy’s Humanitarian Assistance and Disaster Relief Program Performance
2014-12-01
mortar and wood supports. (1) U.S. Government Response: Shortly after the earthquake, the president of Pakistan, President Musharraf, made a formal...complicating coordination efforts. 3. Lessons Learned: The USN has created and recently updated an online system for use as a repository of after-action...I guess the military could somehow post online a list of projects they are doing and also put up a list of projects they want groups to do. This way
Selection of Batteries and Fuel Cells for Yucca Mountain Robots
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upadhye, R S
2003-12-08
The Performance Confirmation program of the Yucca Mountain Repository Development Project needs to employ remotely operated robots to work inside the emplacement drifts, which will have an environment unsuitable for humans (a radiation environment of up to 200 rad/hour, mostly gamma rays with some neutrons, and maximum temperatures of 180 °C). The robots will be required to operate inside the drifts for up to 8 hours per mission. Based on available functional requirements, we have developed specifications for the power needed by the robots.
OWLing Clinical Data Repositories With the Ontology Web Language.
Lozano-Rubí, Raimundo; Pastor, Xavier; Lozano, Esther
2014-08-01
The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools to answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. The objective was to design and implement a framework for the development of clinical data repositories capable of facing continuous change in the biomedicine domain and minimizing the technical knowledge required of final users. We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with solid relational storage and a Web graphical user interface. Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All information required to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. OntoCRF is a complete framework for building data repositories with solid relational storage. Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems and does not require highly skilled technical staff, facilitating the engineering of clinical software systems. PMID:25599697
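To make the ontology-driven idea concrete, here is a minimal sketch (using Python's rdflib; the namespace, class names, and form fields are invented for illustration and are not OntoCRF's actual schema) of describing a form declaratively and deriving its UI fields from the ontology rather than from code:

```python
# Minimal sketch of the ontology-driven idea, not OntoCRF's actual schema:
# a form is described declaratively as RDF triples, and the UI fields are
# derived from the ontology rather than hard-coded. Requires rdflib.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/crf#")   # hypothetical namespace
g = Graph()

# Declare a blood-pressure form with two fields, purely as ontology statements.
g.add((EX.BloodPressureForm, RDF.type, EX.Form))
for name, label in [("systolic", "Systolic (mmHg)"), ("diastolic", "Diastolic (mmHg)")]:
    g.add((EX[name], RDF.type, EX.Field))
    g.add((EX[name], EX.partOf, EX.BloodPressureForm))
    g.add((EX[name], RDFS.label, Literal(label)))

# "Build the UI on the fly": enumerate the fields from the graph, not from code,
# so adding a field to the ontology changes the rendered form with no programming.
for f in g.subjects(EX.partOf, EX.BloodPressureForm):
    print("render input:", g.value(f, RDFS.label))
```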
Zhang, Melvyn W B; Ho, Roger C M
2017-01-01
Dementia is known to be an illness that brings marked disability among elderly individuals. At times, patients living with dementia also experience non-cognitive symptoms, including hallucinations, delusional beliefs, emotional lability, sexualized behaviours, and aggression. According to the National Institute for Clinical Excellence (NICE) guidelines, non-pharmacological techniques are typically the first-line option prior to consideration of adjuvant pharmacological options. Reminiscence and music therapy are thus viable options. Lazar et al. [3] previously performed a systematic review of the use of technology to deliver reminiscence-based therapy to individuals living with dementia and highlighted that technology does have benefits in the delivery of reminiscence therapy. However, to date, there has been a paucity of M-health innovations in this area. In addition, most current innovations are not personalized for each person living with dementia. Prior research has highlighted the utility of open-source repositories in bioinformatics research. The authors explain how they tapped into and made use of an open-source repository in the development of a personalized M-health reminiscence therapy innovation for patients living with dementia. The availability of open-source code repositories has changed the way healthcare professionals and developers develop smartphone applications today. Conventionally, a long iterative process is needed in the development of a native application, mainly because of the need for native programming and coding, especially if the application needs interactive features or features that can be personalized. Such repositories enable the rapid and cost-effective development of applications. Moreover, developers are also able to innovate further, as less time is spent in the iterative process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slovic, P.; Layman, M.; Flynn, J.H.
1990-11-01
In July 1989 the authors produced a report titled Perceived Risk, Stigma, and Potential Economic Impacts of a High-Level Nuclear-Waste Repository in Nevada (Slovic et al., 1989). That report described a program of research designed to assess the potential impacts of a high-level nuclear waste repository at Yucca Mountain, Nevada, upon tourism, retirement and job-related migration, and business development in Las Vegas and the state. It was concluded that adverse economic impacts could potentially result from two related social processes. Specifically, the study by Slovic et al. employed analyses of imagery in order to overcome concerns about the validity of direct questions regarding the influence of a nuclear-waste repository at Yucca Mountain upon a person's future behaviors. During the latter months of 1989, data were collected in three major telephone surveys designed to achieve the following objectives: (1) to replicate the results from the Phoenix, Arizona, surveys using samples from other populations that contribute to tourism, migration, and development in Nevada; (2) to retest the original Phoenix respondents to determine the stability of their images across an 18-month time period and to determine whether their vacation choices subsequent to the first survey were predictable from the images they produced in that original survey; (3) to elicit additional word-association images for the stimulus underground nuclear waste repository in order to determine whether the extreme negative images generated by the Phoenix respondents would occur with other samples of respondents; and (4) to develop and test a new method for imagery elicitation, based upon a rating technique rather than on word associations. 2 refs., 8 figs., 13 tabs.
NASA Astrophysics Data System (ADS)
Hawley, Chadwick T.
2009-05-01
The Signatures Support Program (SSP) leverages the full spectrum of signature-related activities (collections, processing, development, storage, maintenance, and dissemination) within the Department of Defense (DOD), the intelligence community (IC), other Federal agencies, and civil institutions. The Enterprise encompasses acoustic, seismic, radio frequency, infrared, radar, nuclear radiation, and electro-optical signatures. The SSP serves the war fighter, the IC, and civil institutions by supporting military operations, intelligence operations, homeland defense, disaster relief, acquisitions, and research and development. Data centers host and maintain signature holdings, collectively forming the national signatures pool. The geographically distributed organizations are the authoritative sources and repositories for signature data; the centers are responsible for data content and quality. The SSP proactively engages DOD, IC, other Federal entities, academia, and industry to locate signatures for inclusion in the distributed national signatures pool and provides world-wide 24/7 access via the SSP application.
Evaluation of Used Fuel Disposition in Clay-Bearing Rock
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jové Colón, Carlos F.; Weck, Philippe F.; Sassani, David H.
2014-08-01
Radioactive waste disposal in shale/argillite rock formations has been widely considered given their desirable isolation properties (low permeability), geochemically reduced conditions, anomalous groundwater pressures, and widespread geologic occurrence. Clay/shale rock formations are characterized by their high content of clay minerals such as smectites and illites, in which diffusive transport and chemisorption phenomena predominate. These, in addition to low permeability, are key attributes of shale for impeding radionuclide mobility. Shale host media have been comprehensively studied in international nuclear waste repository programs as part of underground research laboratory (URL) programs in Switzerland, France, Belgium, and Japan. These investigations, in some cases a decade or more long, have produced a large but fundamental body of information spanning from site characterization data (geological, hydrogeological, geochemical, geomechanical) to controlled experiments on the engineered barrier system (EBS) (barrier clay and seal materials). Evaluation of nuclear waste disposal in shale formations in the USA was conducted in the late 1970s and mid-1980s. Most of these studies evaluated the potential for shale to host a nuclear waste repository, but not at the programmatic level of URLs in international repository programs. This report covers various R&D work and capabilities relevant to disposal of heat-generating nuclear waste in shale/argillite media. Integration and cross-fertilization of these capabilities will be utilized in the development and implementation of the shale/argillite reference case planned for FY15. Disposal R&D activities under the UFDC in the past few years have produced state-of-the-art modeling capabilities for coupled thermal-hydrological-mechanical-chemical (THMC) processes, used fuel degradation (source term), and thermodynamic modeling and database development to evaluate generic disposal concepts. The THMC models have been developed for a shale repository, leveraging in large part the information garnered in URLs and laboratory data to test and demonstrate model prediction capability and to accurately represent the behavior of the EBS and the natural (barrier) system (NS). In addition, experimental work to improve our understanding of clay barrier interactions and TM couplings at high temperatures is key to evaluating thermal effects resulting from relatively high heat loads from waste and the extent of sacrificial zones in the EBS. To assess the latter, experiments and modeling approaches have provided important information on the stability and fate of barrier materials under high heat loads. This information is central to the assessment of thermal limits and the implementation of the reference case when constraining EBS properties and the repository layout (e.g., waste package and drift spacing). This report comprises various parts, each describing R&D activities applicable to shale/argillite media: for example, progress made on modeling and experimental approaches to analyze physical and chemical interactions affecting clay in the EBS, the NS, and used nuclear fuel (source term) in support of R&D objectives. It also describes the development of a reference case for shale/argillite media.
The accomplishments of these activities are summarized as follows: Development of a reference case for shale/argillite; Investigation of Reactive Transport and Coupled THM Processes in EBS: FY14; Update on Experimental Activities on Buffer/Backfill Interactions at Elevated Pressure and Temperature; Thermodynamic Database Development: Evaluation Strategy, Modeling Tools, First-Principles Modeling of Clay, and Sorption Database Assessment; and ANL Mixed Potential Model for Used Fuel Degradation: Application to Argillite and Crystalline Rock Environments.
NASA Astrophysics Data System (ADS)
Gordon, S.; Dattore, E.; Williams, S.
2014-12-01
Even when a data center makes its datasets accessible, they can still be hard to discover if the user is unaware of the laboratory or organization the data center supports. NCAR's Earth Observing Laboratory (EOL) is no exception. In response to this problem, and as an inquiry into the feasibility of inter-connecting all of NCAR's repositories at a discovery layer, ESRI's Geoportal was researched. It was determined that an implementation of Geoportal would be a good foundation for a proof-of-concept model of inter-repository discovery. This collaborative project between the University of Illinois and NCAR is coordinated through the Data Curation Education in Research Centers program, which is funded by the Institute of Museum and Library Services.
Geoportal is open-source software. It serves as an aggregation point for metadata catalogs of earth science datasets, with a focus on geospatial information. EOL's metadata is in static THREDDS catalogs, but Geoportal can only create records from a THREDDS Data Server. The first step was therefore to make EOL metadata more accessible by adopting the ISO 19115-2 standard. It was also decided to create DIF records so EOL datasets could be ingested into NASA's Global Change Master Directory (GCMD).
To offer records for harvest, it was decided to develop an OAI-PMH server. To make a compliant server, the OAI_DC standard was also implemented. A server was written in Perl to serve a set of static records. We created a sample set of records in ISO 19115-2, FGDC, DIF, and OAI_DC. We utilized GCMD shared vocabularies to enhance discoverability and precision. The proof of concept was tested and verified by having another NCAR laboratory's Geoportal harvest our sample set.
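For context, the OAI-PMH exchange described above boils down to simple HTTP requests with standard verbs. The sketch below shows the harvesting side in Python (the base URL is a hypothetical placeholder; the verb, metadataPrefix, and namespaces are standard OAI-PMH and Dublin Core):

```python
# Client-side sketch of an OAI-PMH harvest: request records in the oai_dc
# format and print their titles. The base URL is a hypothetical placeholder;
# the verb, metadataPrefix, and namespaces are standard OAI-PMH / Dublin Core.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

BASE = "https://data.eol.example.edu/oai"   # placeholder endpoint
params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}

with urllib.request.urlopen(BASE + "?" + urllib.parse.urlencode(params)) as resp:
    tree = ET.parse(resp)

ns = {"oai": "http://www.openarchives.org/OAI/2.0/",
      "dc": "http://purl.org/dc/elements/1.1/"}
for record in tree.iterfind(".//oai:record", ns):
    title = record.find(".//dc:title", ns)
    print(title.text if title is not None else "(no title)")
```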
To prepare for production, templates for each standard were developed and mapped to the database. These templates will support the automated creation of records. Once the OAI-PMH server is rewritten in the Grails framework, a dynamic representation of EOL's metadata will be available for harvest.
EOL will need to develop an implementation of a Geoportal and point GCMD to the OAI-PMH server. We will also seek out partnerships with other earth science and related discipline repositories that can communicate by OAI-PMH or Geoportal so that the scientific community will benefit from more discoverable data.
Life Sciences Data Archives (LSDA) in the Post-Shuttle Era
NASA Technical Reports Server (NTRS)
Fitts, Mary A.; Johnson-Throop, Kathy; Havelka, Jacque; Thomas, Diedre
2010-01-01
Now, more than ever before, NASA is realizing the value and importance of its intellectual assets. Principles of knowledge management (the systematic use and reuse of information, experience, and expertise to achieve a specific goal) are being applied throughout the agency. LSDA is also applying these solutions, which rely on a combination of content and collaboration technologies, to enable research teams to create, capture, share, and harness knowledge to do the things they do well, even better. In the early days of spaceflight, space life sciences data were collected and stored in numerous databases, formats, media types, and geographical locations. These data were largely unknown or unavailable to the research community. The Biomedical Informatics and Health Care Systems Branch of the Space Life Sciences Directorate at JSC and the Data Archive Project at ARC, with funding from the Human Research Program through the Exploration Medical Capability Element, are fulfilling these requirements through the systematic population of the Life Sciences Data Archive. This project constitutes a formal system for the acquisition, archival, and distribution of data for HRP-related experiments and investigations. The general goal of the archive is to acquire, preserve, and distribute these data and to be responsive to inquiries from the science communities. Information about experiments and data, as well as non-attributable human data and data from other species, is available on our public Web site http://lsda.jsc.nasa.gov. The Web site also includes a repository for biospecimens and a utilization process. NASA has undertaken an initiative to develop a Shuttle Data Archive repository. The Shuttle program is nearing its end in 2010, and it is critical that the medical and research data related to the Shuttle program be captured, retained, and usable for research, lessons learned, and future mission planning. Communities of practice are groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly. LSDA works with the HRP community of practice to ensure that we are preserving the relevant research and data they need in the LSDA repository. An evidence-based approach to risk management is required in space life sciences, and evidence changes over time. LSDA has a pilot project with Collexis, a new type of Web-based search engine. Collexis differentiates itself from full-text search engines by making use of thesauri for information retrieval. The high-quality search is based on semantics defined in a life sciences ontology. Additionally, Collexis' matching technology is unique, allowing discovery of partially matching documents. Users do not have to construct a complicated (Boolean) search query, but can simply enter a free-text search without the risk of getting "no results". Collexis may address these issues by virtue of its retrieval and discovery capabilities across multiple repositories.
Data Stewardship throughout the Ocean Research Data Life Cycle
NASA Astrophysics Data System (ADS)
Chandler, Cynthia; Groman, Robert; Allison, Molly; Wiebe, Peter; Glover, David
2013-04-01
The Biological and Chemical Oceanography Data Management Office (BCO-DMO) works in partnership with ocean science investigators to publish data from research projects funded by the Biological and Chemical Oceanography Sections and the Office of Polar Programs Antarctic Organisms & Ecosystems Program (OPP ANT) at the U.S. National Science Foundation. Since 2006, researchers have been contributing data to the BCO-DMO data system, and it has developed into a rich repository of data from ocean, coastal and Great Lakes research programs. The end goals of the BCO-DMO are to ensure preservation of NSF funded project data and to provide open access to those data; achievement of those goals is attained through successful completion of a series of related phases. BCO-DMO has developed an end-to-end data stewardship process that includes all phases of the data life cycle: (1) providing data management advice to investigators during the proposal writing stage; (2) registering their funded project at BCO-DMO; (3) adding data and supporting documentation to the BCO-DMO data repository; (4) providing geospatial and text-based data access systems that support data discovery, access, display, assessment, integration, and export of data resources; (5) exploring mechanisms for exchange of data with complementary repositories; (6) publication of data sets to provide publishers of the peer-reviewed literature with citable references (Digital Object Identifiers) and to encourage proper citation and attribution of data sets in the future and (7) submission of final data sets for preservation in the appropriate long-term data archive. Strategic development of collaborative partnerships with complementary data management organizations is essential to sustainable coverage of the full data life cycle from research proposal through preservation of the final data products. Development and incorporation of controlled vocabularies, domain-specific ontologies and globally unique, persistent identifiers to unambiguously identify resources of interest curated by and available from BCO-DMO have significantly enabled progress toward interoperability with partner systems. Several important components have emerged from early collaborative relationships: (1) identifying a trusted authoritative source of complementary content and the appropriate contact; (2) determining the globally unique, persistent identifier for resources of interest and (3) negotiating the requisite syntactic and semantic exchange systems. An added benefit is the ability to use globally unique, persistent resource identifiers to identify and compare related content in other repositories, thus enabling us to improve the accuracy of content in the BCO-DMO data collection. Results from a recent community discussion at the January 2013 Federation of Earth Science Information Partners (ESIP) meeting will be presented. Mindful of the NSF EarthCube initiative in the United States, the ESIP discussion was an effort to identify commonalities and differences in the way different communities meet the challenges of data stewardship throughout the full data life cycle and to determine any gaps that currently exist. BCO-DMO: http://bco-dmo.org ESIP: http://esipfed.org/
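Since the workflow above hinges on citable persistent identifiers (DOIs) for data sets, a small illustration may help; the sketch below dereferences a DOI through the public doi.org resolver to reach a data set's landing page (the DOI shown is a placeholder, not a real BCO-DMO identifier):

```python
# Illustrative sketch, not BCO-DMO's own service: dereference a data-set DOI
# through the public doi.org resolver to find its landing page. The DOI below
# is a placeholder, not a real identifier.
import urllib.error
import urllib.request

doi = "10.0000/example-dataset"   # placeholder DOI
req = urllib.request.Request("https://doi.org/" + doi, method="HEAD")
try:
    with urllib.request.urlopen(req) as resp:
        print("resolves to:", resp.url)   # landing page after redirects
except urllib.error.HTTPError as err:
    print("resolution failed:", err.code)
```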
Semantic Repositories for eGovernment Initiatives: Integrating Knowledge and Services
NASA Astrophysics Data System (ADS)
Palmonari, Matteo; Viscusi, Gianluigi
In recent years, public sector investments in eGovernment initiatives have focused on making existing governmental ICT systems and infrastructures more reliable. Furthermore, we are witnessing a change in the focus of public sector management, from the disaggregation, competition, and performance measurement typical of the New Public Management (NPM) to new models of governance aiming at the reintegration of services under a new perspective on bureaucracy, namely a holistic approach to policy making that exploits the extensive digitalization of administrative operations. In this scenario, major challenges relate to supporting effective access to information both at the front-end level, by means of highly modular and customizable content provision, and at the back-end level, by means of information integration initiatives. Repositories of information about data and services that exploit semantic models and technologies can support these goals by bridging the gap between data-level representations and the human-level knowledge involved in accessing information and in searching for services. Moreover, semantic repository technologies can reach a new level of automation for different tasks involved in interoperability programs, related both to data integration techniques and to service-oriented computing approaches. In this chapter, we discuss the above topics by referring to techniques and experiences in which repositories based on conceptual models and ontologies are used at different levels in eGovernment initiatives: at the back-end level, to produce a comprehensive view of the information managed in public administrations' (PA) information systems, and at the front-end level, to support effective service delivery.
Test Plan: WIPP bin-scale CH TRU waste tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molecke, M.A.
1990-08-01
The WIPP Bin-Scale CH TRU Waste Test program described herein will provide relevant composition and kinetic rate data on gas generation and consumption resulting from TRU waste degradation, as impacted by synergistic interactions due to multiple degradation modes, waste form preparation, long-term repository environmental effects, engineered barrier materials, and, possibly, engineered modifications to be developed. Similar data on waste-brine leachate compositions and potentially hazardous volatile organic compounds released by the wastes will also be provided. The quantitative data output from these tests and the associated technical expertise are required by the WIPP Performance Assessment (PA) program studies and for the scientific benefit of the overall WIPP project. This Test Plan describes the scientific and technical aspects, justifications, and rationale necessary for successfully initiating and conducting the WIPP Bin-Scale CH TRU Waste Test program. This Test Plan is the controlling scientific design definition and overall requirements document for this WIPP in situ test, as defined by Sandia National Laboratories (SNL), scientific advisor to the US Department of Energy, WIPP Project Office (DOE/WPO). 55 refs., 16 figs., 19 tabs.
NASA Astrophysics Data System (ADS)
Stall, S.
2016-12-01
To be trustworthy is to be reliable, dependable, honest, principled, ethical, incorruptible, and more. A trustworthy person demonstrates these qualities over time and under all circumstances. A trustworthy repository demonstrates these qualities through the team that manages the repository and its responsible organization. The requirements of a Trusted Digital Repository (TDR) in ISO 16363 can be tough to reach and tough to maintain. Challenges include limited funds, limited resources and/or skills, and an unclear path to successfully achieving the requirements. The ISO standard defines each requirement separately, but a successful certification recognizes that there are many cross-dependencies among the requirements. Understanding these dependencies leads to a more efficient path toward success. At AGU we recognize that reaching the goal of the TDR ISO standard, or any set of data management objectives defined by an organization, has a better chance at success if the organization clearly knows its current capability, the improvements that are needed, and the best way to make (and maintain) those changes. AGU has partnered with the CMMI Institute to adapt their Data Management Maturity (DMM) model to the Earth and space sciences. Using the DMM, AGU developed a new Data Management Assessment Program aimed at helping data repositories, large and small, domain-specific to general, assess and improve data management practices to meet their goals, including becoming a Trustworthy Digital Repository. The requirements for achieving the TDR ISO standard are aligned with the data management best practices defined in the DMM model. Using the DMM as a process improvement tool in conjunction with the Data Management Assessment method, a team seeking the TDR ISO standard receives a clear road map to achieving its goal as an outcome of the assessment. Publishers and agencies are beginning to recommend or even require that repositories demonstrate that they follow best practices or meet certain standards. Data preserved in a data facility that is working toward a TDR standard will have the level of care desired by the publishing community as well as the science community. Better Data Management results in Better Science.
Sanchez, Ana M; Denny, Thomas N; O'Gorman, Maurice
2014-07-01
This Special Issue of the Journal of Immunological Methods includes 16 manuscripts describing quality assurance activities related to virologic and immunologic monitoring of six global laboratory resource programs that support international HIV/AIDS clinical trial studies: Collaboration for AIDS Vaccine Discovery (CAVD); Center for HIV/AIDS Vaccine Immunology (CHAVI); External Quality Assurance Program Oversight Laboratory (EQAPOL); HIV Vaccine Trial Network (HVTN); International AIDS Vaccine Initiative (IAVI); and Immunology Quality Assessment (IQA). The reports from these programs address the many components required to develop comprehensive quality control activities and subsequent quality assurance programs for immune monitoring in global clinical trials, including: all aspects of processing, storing, and quality assessment of PBMC preparations used ubiquitously in HIV clinical trials; the development and optimization of assays for CD8 HIV responses and HIV neutralization; a comprehensive global HIV virus repository; and reports on the development and execution of novel external proficiency testing programs for immunophenotyping, intracellular cytokine staining, ELISPOT, and Luminex-based cytokine measurements. In addition, there are articles describing the implementation of Good Clinical Laboratory Practices (GCLP) in a large quality assurance laboratory, the development of statistical methods specific for external proficiency testing assessment, a discussion on the ability to set objective thresholds for measuring rare events by flow cytometry, and finally, a manuscript that addresses a framework for the structured reporting of T cell immune function-based assays. It is anticipated that this series of manuscripts covering a wide range of quality assurance activities associated with the conduct of global clinical trials will provide a resource for individuals and programs involved in improving the harmonization, standardization, accuracy, and sensitivity of virologic and immunologic testing. Copyright © 2014 Elsevier B.V. All rights reserved.
Earley, Marie C; Laxova, Anita; Farrell, Philip M; Driscoll-Dunn, Rena; Cordovado, Suzanne; Mogayzel, Peter J; Konstan, Michael W; Hannon, W Harry
2011-07-15
CDC's Newborn Screening Quality Assurance Program collaborated with several U.S. Cystic Fibrosis Care Centers to collect specimens for development of a molecular CFTR proficiency testing program using dried-blood spots for newborn screening laboratories. Adult and adolescent patients or carriers donated whole blood that was aliquoted onto filter paper cards. Five blind-coded specimens were sent to participating newborn screening laboratories quarterly. Proficiency testing results were evaluated based on presumptive clinical assessment. Individual evaluations and summary reports were sent to each participating laboratory, and technical consultations were offered if incorrect assessments were reported. The current CDC repository contains specimens with 39 different CFTR mutations. Up to 45 laboratories have participated in the program. Three years of data showed that correct assessments were reported 97.7% of the time overall when both mutations could be determined. Incorrect assessments that could have led to a missed case occurred 0.9% of the time, and no information was reported 1.1% of the time due to sample failure. Results show that laboratories using molecular assays to detect CFTR mutations are performing satisfactorily. The programmatic results presented demonstrate the importance and complexity of providing proficiency testing for DNA-based assays. Published by Elsevier B.V.
Judson, Richard S.; Martin, Matthew T.; Egeghy, Peter; Gangwal, Sumit; Reif, David M.; Kothiya, Parth; Wolf, Maritja; Cathey, Tommy; Transue, Thomas; Smith, Doris; Vail, James; Frame, Alicia; Mosher, Shad; Cohen Hubal, Elaine A.; Richard, Ann M.
2012-01-01
Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and for predicting toxicity of new chemicals and products. A key feature of such approaches is their reliance on knowledge extracted from large collections of data and data sets in computable formats. The U.S. Environmental Protection Agency (EPA) has developed a large data resource called ACToR (Aggregated Computational Toxicology Resource) to support these data-intensive efforts. ACToR comprises four main repositories: core ACToR (chemical identifiers and structures, and summary data on hazard, exposure, use, and other domains), ToxRefDB (Toxicity Reference Database, a compilation of detailed in vivo toxicity data from guideline studies), ExpoCastDB (detailed human exposure data from observational studies of selected chemicals), and ToxCastDB (data from high-throughput screening programs, including links to underlying biological information related to genes and pathways). The EPA DSSTox (Distributed Structure-Searchable Toxicity) program provides expert-reviewed chemical structures and associated information for these and other high-interest public inventories. Overall, the ACToR system contains information on about 400,000 chemicals from 1100 different sources. The entire system is built using open source tools and is freely available to download. This review describes the organization of the data repository and provides selected examples of use cases. PMID:22408426
An Assessment of a Science Discipline Archive Against ISO 16363
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Downs, R. R.
2016-12-01
The Planetary Data System (PDS) is a federation of science discipline nodes formed in response to the findings of the Committee on Data Management and Computing (CODMAC, 1986) that a "wealth of science data would ultimately cease to be useful and probably lost if a process was not developed to ensure that the science data were properly archived." Starting operations in 1990, the PDS has as its stated mission to "facilitate achievement of NASA's planetary science goals by efficiently collecting, archiving, and making accessible digital data and documentation produced by or relevant to NASA's planetary missions, research programs, and data analysis programs." In 2008 the PDS initiated a transition to a more modern system based on key principles found in the Open Archival Information System (OAIS) Reference Model (ISO 14721), a set of functional requirements provided by the designated community, and about twenty years of lessons learned. With science digital data now being archived under the new PDS4, the PDS is a good use case to be assessed as a trusted repository against ISO 16363, a recommended practice for assessing the trustworthiness of digital repositories. This presentation will summarize the OAIS principles adopted for PDS4 and the findings of a desk assessment of the PDS against ISO 16363. Also presented will be specific items of evidence, for example the PDS mission statement above, and how they impact the level of certainty that the ISO 16363 metrics are being met.
[Research resource network and Parkinson disease brain bank donor registration program in Japan].
Arima, Kunimasa
2010-10-01
In spite of the increasing need for brain tissue in biomedical research, overall brain banking activities in Japan have been lagging behind. On the initiative of the National Center of Neurology and Psychiatry, two projects have been carried out: the Research Resource Network (RRN) and the Parkinson's Disease Brain Bank (PDBB) donor registration program. RRN is a nationwide network that links 15 brain repositories, and 1,463 autopsy brains had been registered in this network as of December 2009. The brain donor registration program for PDBB was established in 2006. A donor without cognitive impairment can enroll in this PDBB donor registration program. When the donor dies, the next of kin contacts the PDBB coordinators for subsequent autopsy services and brain retention. On obtaining the next of kin's consent at the time of the donor's death, autopsy is performed at the PDBB collaborating hospitals: the National Center of Neurology and Psychiatry, Juntendo University Hospital, and Tokyo Metropolitan Geriatric Hospital. In order to arouse public interest, lecture meetings for citizens have been held on a regular basis. As of December 2009, fifty individuals had registered in the PDBB donor registration program, including 27 patients with PD, 4 patients with Parkinson syndrome, 1 patient with progressive supranuclear palsy, and 18 individuals without PD or related disorders. Autopsies have been performed for 2 of these donors. To promote brain banking activities, it is necessary to establish legal and ethical guidelines for the use of autopsied materials in biomedical research.
mHealthApps: A Repository and Database of Mobile Health Apps.
Xu, Wenlong; Liu, Yin
2015-03-18
The market for mobile health (mHealth) apps has rapidly evolved in the past decade. With more than 100,000 mHealth apps currently available, there is no centralized resource that collects information on these health-related apps to help researchers in this field effectively evaluate their strengths and weaknesses. The objective of this study was to create a centralized mHealth app repository. We expect the analysis of information in this repository to provide insights for future mHealth research developments. We focused on apps from the two most established app stores, the Apple App Store and the Google Play Store. We extracted detailed information on each health-related app from these two app stores via our Python crawling program, and then stored the information in both a user-friendly array format and standard JavaScript Object Notation (JSON) format. We have developed a centralized resource that provides detailed information on more than 60,000 health-related apps from the Apple App Store and the Google Play Store. Using this information resource, we analyzed thousands of apps systematically and provide an overview of the trends for mHealth apps. This unique database allows meta-analysis of health-related apps and provides guidance for research designs of future apps in the mHealth field.
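In the spirit of the crawl-and-store pipeline the authors describe, here is a minimal sketch (the record fields, identifiers, and file name are illustrative placeholders, not the study's actual schema) of writing one crawled app record as JSON in Python:

```python
# Minimal sketch of storing a crawled app record as JSON; all field names
# and values are hypothetical placeholders, not the study's actual schema.
import json

app_record = {
    "store": "Google Play",                   # or "Apple App Store"
    "app_id": "com.example.healthtracker",    # hypothetical identifier
    "title": "Example Health Tracker",
    "category": "Health & Fitness",
    "rating": 4.2,
    "num_reviews": 1234,
}

# Append each crawled record to a JSON-lines file for later meta-analysis.
with open("mhealth_apps.jsonl", "a") as fh:
    fh.write(json.dumps(app_record) + "\n")
```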
Historical Topographic Map Collection bookmark
Fishburn, Kristin A.; Allord, Gregory J.
2017-06-29
The U.S. Geological Survey (USGS) National Geospatial Program is scanning published USGS 1:250,000-scale and larger topographic maps printed between 1884, the inception of the topographic mapping program, and 2006. The goal of this project, which began publishing the historical scanned maps in 2011, is to provide a digital repository of USGS topographic maps, available to the public at no cost. For more than 125 years, USGS topographic maps have accurately portrayed the complex geography of the Nation. The USGS is the Nation’s largest producer of printed topographic maps, and prior to 2006, USGS topographic maps were created using traditional cartographic methods and printed using a lithographic printing process. As the USGS continues the release of a new generation of topographic maps (US Topo) in electronic form, the topographic map remains an indispensable tool for government, science, industry, land management planning, and leisure.
International Collaboration Activities in Different Geologic Disposal Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birkholzer, Jens
This report describes the current status of international collaboration regarding geologic disposal research in the Used Fuel Disposition (UFD) Campaign. Since 2012, in an effort coordinated by Lawrence Berkeley National Laboratory, UFD has advanced active collaboration with several international geologic disposal programs in Europe and Asia. Such collaboration allows the UFD Campaign to benefit from a deep knowledge base with regard to alternative repository environments developed over decades, and to utilize international investments in research facilities (such as underground research laboratories), saving millions of R&D dollars that have been and are being provided by other countries. To date, UFD's International Disposal R&D Program has established formal collaboration agreements with five international initiatives and several international partners, and national laboratory scientists associated with UFD have conducted specific collaborative R&D activities that align well with its R&D priorities.
Scanning and georeferencing historical USGS quadrangles
Fishburn, Kristin A.; Davis, Larry R.; Allord, Gregory J.
2017-06-23
The U.S. Geological Survey (USGS) National Geospatial Program is scanning published USGS 1:250,000-scale and larger topographic maps printed between 1884, the inception of the topographic mapping program, and 2006. The goal of this project, which began publishing the Historical Topographic Map Collection in 2011, is to provide access to a digital repository of USGS topographic maps that is available to the public at no cost. For more than 125 years, USGS topographic maps have accurately portrayed the complex geography of the Nation. The USGS is the Nation’s largest producer of traditional topographic maps, and, prior to 2006, USGS topographic maps were created using traditional cartographic methods and printed using a lithographic process. The next generation of topographic maps, US Topo, is being released by the USGS in digital form, and newer technologies make it possible to also deliver historical maps in the same electronic format that is more publicly accessible.
Nuclear energy related capabilities at Sandia National Laboratories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pickering, Susan Y.
2014-02-01
Sandia National Laboratories' technology solutions are relied upon to address national and global threats to peace and freedom. Through science and technology, people, infrastructure, and partnerships, part of Sandia's mission is to meet national needs in the areas of energy, climate, and infrastructure security. Within this mission to ensure clean, abundant, and affordable energy and water are the Nuclear Energy and Fuel Cycle Programs. These programs have a broad range of capabilities, with both physical facilities and intellectual expertise. These resources are brought to bear upon the key scientific and engineering challenges facing the nation and can be made available to address the research needs of others. Sandia can support the safe, secure, reliable, and sustainable use of nuclear power worldwide by incorporating state-of-the-art technologies in safety, security, nonproliferation, transportation, modeling, repository science, and system demonstrations.
ARACHNE: A neural-neuroglial network builder with remotely controlled parallel computing
Rusakov, Dmitri A.; Savtchenko, Leonid P.
2017-01-01
Creating and running realistic models of neural networks has hitherto been a task for computing professionals rather than experimental neuroscientists, mainly because such networks usually engage substantial computational resources, the handling of which requires specific programming skills. Here we put forward a newly developed simulation environment, ARACHNE: it enables an investigator to build and explore cellular networks of arbitrary biophysical and architectural complexity using the logic of NEURON and a simple interface on a local computer or a mobile device. The interface can control, through the internet, an optimized computational kernel installed on a remote computer cluster. ARACHNE can combine neuronal (wired) and astroglial (extracellular volume-transmission driven) network types and adopt realistic cell models from the NEURON library. The program and documentation (current version) are available at the GitHub repository https://github.com/LeonidSavtchenko/Arachne under the MIT License. PMID:28362877
The Particle-in-Cell and Kinetic Simulation Software Center
NASA Astrophysics Data System (ADS)
Mori, W. B.; Decyk, V. K.; Tableman, A.; Fonseca, R. A.; Tsung, F. S.; Hu, Q.; Winjum, B. J.; An, W.; Dalichaouch, T. N.; Davidson, A.; Hildebrand, L.; Joglekar, A.; May, J.; Miller, K.; Touati, M.; Xu, X. L.
2017-10-01
The UCLA Particle-in-Cell and Kinetic Simulation Software Center (PICKSC) aims to support an international community of PIC and plasma kinetic software developers, users, and educators; to increase the use of this software for accelerating the rate of scientific discovery; and to be a repository of knowledge and history for PIC. We discuss progress towards making available and documenting illustrative open-source software programs and distinct production programs; developing and comparing different PIC algorithms; coordinating the development of resources for the educational use of kinetic software; and the outcomes of our first sponsored OSIRIS users workshop. We also welcome input and discussion from anyone interested in using or developing kinetic software, in obtaining access to our codes, in collaborating, in sharing their own software, or in commenting on how PICKSC can better serve the DPP community. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.
Haverkamp, Jacqueline J; Vogt, Marjorie
2015-01-01
Portfolios have been used in higher education for the past several years for assessment of student learning and growth and serve as the basis for summative and formative evaluations. While there is some information in the literature on how undergraduate and graduate medical, nursing, and allied health students might use portfolios to showcase acquired knowledge and skills, there is a dearth of information on the use of e-Portfolios with students in doctor of nursing practice programs. There are also limited findings regarding the creative use of technology (that includes infographics and other multimedia tools) to enhance learning outcomes (Stephens & Parr, 2013). This article presents engaging and meaningful ways technology can be used within e-Portfolios. Thus, e-Portfolios become more than a repository for academic evidence; they become unique stories that reflect the breadth and depth of students' learner-centered outcomes. Copyright © 2015 Elsevier Inc. All rights reserved.
Puzzler Solution: Perfect Weather for a Picnic | Poster
It looks like we stumped you. We did not receive any correct guesses for the current Poster Puzzler, which is an image of the top of the Building 434 picnic table, with a view looking towards Building 472. This picnic table and others across campus were supplied by the NCI at Frederick Campus Improvement Committee. Building 434, located on Wood Street, is home to the staff of Scientific Publications, Graphics & Media (SPGM), the Central Repository, and the NCI Experimental Therapeutics Program support group, Applied and Developmental Research Directorate.
2017-08-01
This large repository of climate model results for North America (Wang and Kotamarthi 2013, 2014, 2015) is stored in Network Common Data Form (NetCDF; UCAR/Unidata Program Center, Boulder, CO; available at http://www.unidata.ucar.edu/software/netcdf)...emissions diverge from each other regarding fossil fuel use, technology, and other socioeconomic factors. As a result, the estimated emissions for each of
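For readers unfamiliar with the format, a hedged sketch of reading one variable from a NetCDF file follows; the file name and variable name are placeholders, not the archive's actual layout:

```python
# Hedged sketch of reading a variable from a NetCDF file in such a repository.
# The file name and variable name are placeholders, not the archive's layout.
# Requires the netCDF4 package (Unidata's Python interface to NetCDF).
from netCDF4 import Dataset

with Dataset("climate_na_example.nc") as nc:   # placeholder file name
    print(list(nc.dimensions))                 # e.g., time, lat, lon
    var = nc.variables["T2"]                   # placeholder variable name
    print(var.shape, getattr(var, "units", "(no units attribute)"))
```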
Louisiana: a model for advancing regional e-Research through cyberinfrastructure.
Katz, Daniel S; Allen, Gabrielle; Cortez, Ricardo; Cruz-Neira, Carolina; Gottumukkala, Raju; Greenwood, Zeno D; Guice, Les; Jha, Shantenu; Kolluru, Ramesh; Kosar, Tevfik; Leger, Lonnie; Liu, Honggao; McMahon, Charlie; Nabrzyski, Jarek; Rodriguez-Milla, Bety; Seidel, Ed; Speyrer, Greg; Stubblefield, Michael; Voss, Brian; Whittenburg, Scott
2009-06-28
Louisiana researchers and universities are leading a concentrated, collaborative effort to advance statewide e-Research through a new cyberinfrastructure: computing systems, data storage systems, advanced instruments and data repositories, visualization environments and people, all linked together by software programs and high-performance networks. This effort has led to a set of interlinked projects that have started making a significant difference in the state, and has created an environment that encourages increased collaboration, leading to new e-Research. This paper describes the overall effort, the new projects and environment and the results to date.
2013-09-01
the order from the DLA; convenes a meeting with tech librarians, engineers, machinists, quality assurance (QA) inspectors, and mechanics to assess...created, begins the in-house process. 3. Research of Technical Drawings: The tech librarian reviews the applicable repository for any tech drawings...applicable to Widget A. If none are found, the tech librarian contacts the OEM and other D-Level activities to find out whether the tech drawing is out
Steinkampf, W.C.
2000-01-01
Yucca Mountain, located ~100 mi northwest of Las Vegas, Nevada, has been designated by Congress as a site to be characterized for a potential mined geologic repository for high-level radioactive waste. This field trip will examine the regional geologic and hydrologic setting for Yucca Mountain, as well as specific results of the site characterization program. The first day focuses on the regional setting, with emphasis on current and paleo hydrology, both of critical concern for predicting future performance of a potential repository. Morning stops will be in southern Nevada and afternoon stops will be in Death Valley. The second day will be spent at Yucca Mountain. The field trip will visit the underground testing sites in the "Exploratory Studies Facility" and the "Busted Butte Unsaturated Zone Transport Field Test", plus several surface-based testing sites. Much of the work at the site has concentrated on studies of the unsaturated zone, an element of the hydrologic system that historically has received little attention. Discussions during the second day will cover selected topics of Yucca Mountain geology, including seismic hazard in the Yucca Mountain area. Evening discussions will address modeling of regional groundwater flow and the relationship of the geology and hydrology of Yucca Mountain to the performance of a potential repository. Day 3 will examine the geologic framework and hydrology of the Pahute Mesa-Oasis Valley Groundwater Basin and then will continue to Reno via Hawthorne, Nevada, and the Walker Lake area.
Shea, Katheryn E; Wagner, Elizabeth L; Marchesani, Leah; Meagher, Kevin; Giffen, Carol
2017-02-01
Reducing costs by improving storage efficiency has been a focus of the National Heart, Lung, and Blood Institute (NHLBI) Biologic Specimen Repository (Biorepository) and Biologic Specimen and Data Repositories Information Coordinating Center (BioLINCC) programs for several years. Study specimen profiles were compiled using the BioLINCC collection catalog. Cost assessments, and calculations of the return on investment from consolidating or reducing a collection, were developed and implemented. Over the course of 8 months, the NHLBI Biorepository evaluated 35 collections consisting of 1.8 million biospecimens. A total of 23 collections were selected for consolidation, with a total of 1.2 million specimens located in 21,355 storage boxes. The consolidation resulted in a savings of 4055 boxes of various sizes and 10.2 mechanical freezers' (~275 cubic feet) worth of space. As storage costs in a biorepository increase over time, the development and use of information technology tools to assess the potential advantage and feasibility of vial consolidation can reduce maintenance expenses.
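The return-on-investment calculation described above can be illustrated with a back-of-the-envelope sketch; all cost figures below are hypothetical placeholders, not NHLBI's actual numbers:

```python
# Back-of-the-envelope sketch of a consolidation ROI check: compare one-time
# consolidation labor against ongoing storage savings. All figures are
# hypothetical placeholders, not the NHLBI programs' actual costs.
def payback_months(boxes_freed: int,
                   storage_cost_per_box_month: float,
                   consolidation_cost: float) -> float:
    """Months until storage savings repay the consolidation effort."""
    monthly_savings = boxes_freed * storage_cost_per_box_month
    return consolidation_cost / monthly_savings

# Example with placeholder numbers; only boxes_freed echoes the abstract.
print(round(payback_months(boxes_freed=4055,
                           storage_cost_per_box_month=0.50,
                           consolidation_cost=20000.0), 1), "months")
```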
DOE Office of Scientific and Technical Information (OSTI.GOV)
Francis, A.J.; Gillow, J.B.
1993-09-01
Microbial processes involved in gas generation from degradation of the organic constituents of transuranic waste under conditions expected at the Waste Isolation Pilot Plant (WIPP) repository are being investigated at Brookhaven National Laboratory. These laboratory studies are part of the Sandia National Laboratories -- WIPP Gas Generation Program. Gas generation due to microbial degradation of representative cellulosic waste was investigated in short-term (< 6 months) and long-term (> 6 months) experiments by incubating representative paper (filter paper, paper towels, and tissue) in WIPP brine under initially aerobic (air) and anaerobic (nitrogen) conditions. Samples from the WIPP surficial environment and underground workings harbor gas-producing halophilic microorganisms, the activities of which were studied in short-term experiments. The microorganisms metabolized a variety of organic compounds including cellulose under aerobic, anaerobic, and denitrifying conditions. In long-term experiments, the effects of added nutrients (trace amounts of ammonium nitrate, phosphate, and yeast extract), no nutrients, and nutrients plus excess nitrate on gas production from cellulose degradation were examined.
NELS 2.0 - A general system for enterprise wide information management
NASA Technical Reports Server (NTRS)
Smith, Stephanie L.
1993-01-01
NELS, the NASA Electronic Library System, is an information management tool for creating distributed repositories of documents, drawings, and code for use and reuse by the aerospace community. The NELS retrieval engine can load metadata and source files of full-text objects, perform natural language queries to retrieve ranked objects, and create links to connect user interfaces. For flexibility, the NELS architecture has layered interfaces between the application program and the stored library information. The session manager provides the interface functions for development of NELS applications. The data manager is an interface between the session manager and the structured data system. The center of the structured data system is the Wide Area Information Server. This system architecture provides access to information across heterogeneous platforms in a distributed environment. There are presently three user interfaces that connect to the NELS engine: an X-Windows interface, an ASCII interface, and the Spatial Data Management System. This paper describes the design and operation of NELS as an information management tool and repository.
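The layering described above (application, session manager, data manager, structured data store) can be pictured with a minimal sketch. The class and method names below are hypothetical illustrations of the pattern, not the actual NELS API.

```python
# Minimal sketch of a layered repository interface: applications talk only to
# a session manager, which delegates to a data manager, which fronts the store.

class StructuredDataStore:
    """Stands in for the WAIS-backed structured data system."""
    def __init__(self):
        self._objects = {}  # id -> (metadata, full text)

    def load(self, obj_id, metadata, text):
        self._objects[obj_id] = (metadata, text)

    def search(self, query):
        # Naive ranking: count how many query terms occur in each full text.
        terms = query.lower().split()
        scored = [(sum(t in text.lower() for t in terms), obj_id)
                  for obj_id, (_, text) in self._objects.items()]
        return [obj_id for score, obj_id in sorted(scored, reverse=True) if score]

class DataManager:
    """Insulates the session layer from the storage backend."""
    def __init__(self, store):
        self._store = store
    def query(self, query):
        return self._store.search(query)

class SessionManager:
    """The only interface an application (GUI, ASCII client, ...) sees."""
    def __init__(self, data_manager):
        self._dm = data_manager
    def natural_language_query(self, text):
        return self._dm.query(text)

store = StructuredDataStore()
store.load("doc-1", {"title": "Wing design note"},
           "Aerodynamic load analysis of wing spars")
session = SessionManager(DataManager(store))
print(session.natural_language_query("wing load"))  # -> ['doc-1']
```

The design point is that the storage backend can be swapped without changing application code, since only the session-manager interface is public.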
NASA Astrophysics Data System (ADS)
Acuña, M.
The International Solar Terrestrial Physics Program (ISTP) evolved from the individual plans of US, Japanese, and European agencies to develop space missions to expand our knowledge of the Sun-Earth connection as a "system". Previous experience with independent missions amply illustrated the critical need for coordinated and simultaneous observations in key regions of Sun-Earth space in order to resolve time-space ambiguities and cause-effect relationships. Mission studies such as the US Origins of Plasmas in the Earth's Neighborhood (OPEN), Geotail in Japan, the Solar and Heliospheric Observatory in Europe, and the Regatta and other magnetospheric missions in the former Soviet Union formed the early conceptual elements that eventually led to the ISTP program. The coordinating role developed by the Inter-Agency Consultative Group (IACG), comprising NASA, ESA, ISAS, and IKI, and demonstrated during the comet Halley apparition in 1986, was extended to include solar-terrestrial research and the mission elements described above. In addition to the space elements, a most important component of the coordination effort was the inclusion of data networks, analysis and planning tools, and data sets globally accessible to the scientific community at large. This approach enabled the active and direct participation of scientists in developing countries in one of the most comprehensive solar-terrestrial research programs implemented to date. The creation of multiple ISTP data repositories throughout the world has enabled a large number of scientists in developing countries to have direct access to the latest spacecraft observations and a most fruitful interaction with fellow researchers throughout the world. This paper will present a review of the evolution of the ISTP program, its products, analysis tools, data bases, infrastructure, and lessons learned applicable to future international collaborative programs.
Waste isolation safety assessment program. Task 4. Third contractor information meeting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-06-01
The Contractor Information Meeting (October 14 to 17, 1979) was part of the FY-1979 effort of Task 4 of the Waste Isolation Safety Assessment Program (WISAP): Sorption/Desorption Analysis. The objectives of this task are to: evaluate sorption/desorption measurement methods and develop a standardized measurement procedure; produce a generic data bank of nuclide-geologic interactions using a wide variety of geologic media and groundwaters; perform statistical analysis and synthesis of these data; perform validation studies to compare short-term laboratory studies to long-term in situ behavior; develop a fundamental understanding of sorption/desorption processes; produce x-ray and gamma-emitting isotopes suitable for the study of actinides at tracer concentrations; disseminate resulting information to the international technical community; and provide input data support for repository safety assessment. Conference participants included those subcontracted to WISAP Task 4, representatives and independent subcontractors to the Office of Nuclear Waste Isolation, representatives from other waste disposal programs, and experts in the area of waste/geologic media interaction. Since the meeting, WISAP has been divided into two programs: Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) (modeling efforts) and Waste/Rock Interactions Technology (WRIT) (experimental work). The WRIT program encompasses the work conducted under Task 4. This report contains the information presented at the Task 4, Third Contractor Information Meeting. Technical reports from the subcontractors, as well as Pacific Northwest Laboratory (PNL), are provided along with transcripts of the question-and-answer sessions. The agenda and abstracts of the presentations are also included. Appendix A is a list of the participants. Appendix B gives an overview of the WRIT program and details the WRIT work breakdown structure for 1980.
Kim, Brian J; Merchant, Madhur; Zheng, Chengyi; Thomas, Anil A; Contreras, Richard; Jacobsen, Steven J; Chien, Gary W
2014-12-01
Natural language processing (NLP) software programs have been widely developed to transform complex free text into simplified organized data. Potential applications in the field of medicine include automated report summaries, physician alerts, patient repositories, electronic medical record (EMR) billing, and quality metric reports. Despite these prospects and the recent widespread adoption of EMR, NLP has been relatively underutilized. The objective of this study was to evaluate the performance of an internally developed NLP program in extracting select pathologic findings from radical prostatectomy specimen reports in the EMR. An NLP program was generated by a software engineer to extract key variables from prostatectomy reports in the EMR within our healthcare system, which included the TNM stage, Gleason grade, presence of a tertiary Gleason pattern, histologic subtype, size of dominant tumor nodule, seminal vesicle invasion (SVI), perineural invasion (PNI), angiolymphatic invasion (ALI), extracapsular extension (ECE), and surgical margin status (SMS). The program was validated by comparing NLP results to a gold standard compiled by two blinded manual reviewers for 100 random pathology reports. NLP demonstrated 100% accuracy for identifying the Gleason grade, presence of a tertiary Gleason pattern, SVI, ALI, and ECE. It also demonstrated near-perfect accuracy for extracting histologic subtype (99.0%), PNI (98.9%), TNM stage (98.0%), SMS (97.0%), and dominant tumor size (95.7%). The overall accuracy of NLP was 98.7%. NLP generated a result in <1 second, whereas the manual reviewers averaged 3.2 minutes per report. This novel program demonstrated high accuracy and efficiency identifying key pathologic details from the prostatectomy report within an EMR system. NLP has the potential to assist urologists by summarizing and highlighting relevant information from verbose pathology reports. It may also facilitate future urologic research through the rapid and automated creation of large databases.
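Rule-based extraction of the kind evaluated above can be illustrated with a toy example. The regular expressions and sample report text below are simplified assumptions for illustration; the study's actual NLP program is not described beyond this abstract.

```python
import re

# Toy illustration of extracting two of the variables named above
# (Gleason grade, margin status) from free-text pathology reports.
GLEASON = re.compile(r"gleason\s+(?:score\s+)?(\d)\s*\+\s*(\d)", re.I)
MARGINS = re.compile(r"margins?\s+(?:are\s+)?(positive|negative|involved|free)", re.I)

def extract(report: str) -> dict:
    out = {}
    if (m := GLEASON.search(report)):
        primary, secondary = int(m.group(1)), int(m.group(2))
        out["gleason"] = f"{primary}+{secondary}={primary + secondary}"
    if (m := MARGINS.search(report)):
        out["margins"] = m.group(1).lower()
    return out

print(extract("Radical prostatectomy: Gleason score 3+4. Margins are negative."))
# -> {'gleason': '3+4=7', 'margins': 'negative'}
```

Production systems add negation handling, section awareness, and report-level validation, but the core pattern-matching step looks much like this.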
The impact of using mobile-enabled devices on patient engagement in remote monitoring programs.
Agboola, Stephen; Havasy, Rob; Myint-U, Khinlei; Kvedar, Joseph; Jethwani, Kamal
2013-05-01
Different types of data transmission technologies are used in remote monitoring (RM) programs. This study reports on a retrospective analysis of how participants engage, based on the type of data transfer technology used in a blood pressure (BP) RM program, and its potential impact on RM program design and outcomes. Thirty patients, aged 23-84 years (62 ± 14 years), who had completed at least 2 months in the program and were not participating in any other clinical trial were identified from the Remote Monitoring Data Repository. Half of these patients used wireless-based data transfer devices [wireless-based device (WBD)] while the other half used telephone modem-based data transfer devices [modem-based device (MBD)]. Participants were matched by practice and age. Engagement indices, which include frequency of BP measurements, frequency of data uploads, time to first BP measurement, and time to first data upload, were compared in both groups using the Wilcoxon-Mann-Whitney two-sample rank-sum test. Help desk call data were analyzed by chi-square test. The frequency of BP measurements and data uploads was significantly higher in the WBD group versus the MBD group [median = 0.66 versus 0.2 measurements/day (p = .01) and 0.46 versus 0.01 uploads/day (p < .001), respectively]. Time to first upload was significantly lower in the WBD group (median = 4 versus 7 days; p = .02), but time to first BP measurement did not differ between the two groups (median = 2 versus 1 day; p = .98). Wireless transmission ensures instantaneous transmission of readings, providing clinicians with timely data on which to intervene. Our findings suggest that mobile-enabled wireless technologies can positively impact patient engagement, outcomes, and operational workflow in RM programs. © 2013 Diabetes Technology Society.
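For readers unfamiliar with the test used above, the sketch below runs a Wilcoxon-Mann-Whitney comparison on synthetic engagement data; the group sizes and gamma-distributed measurement rates are invented stand-ins, not the study's data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Synthetic daily BP-measurement rates for a wireless (WBD) and a modem (MBD)
# group; distributions are invented to roughly echo the medians quoted above.
rng = np.random.default_rng(0)
wbd = rng.gamma(shape=2.0, scale=0.33, size=15)   # ~0.66 measurements/day mean
mbd = rng.gamma(shape=2.0, scale=0.10, size=15)   # ~0.20 measurements/day mean

stat, p = mannwhitneyu(wbd, mbd, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4f}")
print(f"medians: WBD {np.median(wbd):.2f}, MBD {np.median(mbd):.2f} per day")
```

The rank-based test is appropriate here because engagement rates are skewed and the samples are small, so a t-test's normality assumption would be hard to defend.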
PGOPHER in the Classroom and the Laboratory
NASA Astrophysics Data System (ADS)
Western, Colin
2015-06-01
PGOPHER is a general purpose program for simulating and fitting rotational, vibrational, and electronic spectra. As it uses a graphical user interface, the basic operation is sufficiently straightforward to make it suitable for use in undergraduate practicals and computer-based classes. This talk will present two experiments that have been in regular use by Bristol undergraduates for some years, based on the analysis of infrared spectra of cigarette smoke and, for more advanced students, visible and near-ultraviolet spectra of a nitrogen discharge and a hydrocarbon flame. For all of these the rotational structure is analysed and used to explore ideas of bonding. The talk will discuss the requirements for the apparatus and the support required. Ideas for other possible experiments and computer-based exercises will also be presented, including a group exercise. The PGOPHER program is open source and is available for Microsoft Windows, Apple Mac, and Linux. It can be freely downloaded from the supporting website http://pgopher.chm.bris.ac.uk. The program does not require any installation process, so it can be run on students' own machines or easily set up on classroom or laboratory computers. References: PGOPHER, a Program for Simulating Rotational, Vibrational and Electronic Structure, C. M. Western, University of Bristol, http://pgopher.chm.bris.ac.uk; PGOPHER version 8.0, C. M. Western, 2014, University of Bristol Research Data Repository, doi:10.5523/bris.huflggvpcuc1zvliqed497r2
Daneshian, Mardas; Akbarsha, Mohammad A; Blaauboer, Bas; Caloni, Francesca; Cosson, Pierre; Curren, Rodger; Goldberg, Alan; Gruber, Franz; Ohl, Frauke; Pfaller, Walter; van der Valk, Jan; Vinardell, Pilar; Zurlo, Joanne; Hartung, Thomas; Leist, Marcel
2011-01-01
Development of improved communication and education strategies is important to make alternatives to the use of animals, and the broad range of applications of the 3Rs concept, better known and understood by different audiences. For this purpose, the Center for Alternatives to Animal Testing in Europe (CAAT-Europe), together with the Transatlantic Think Tank for Toxicology (t(4)), hosted a three-day workshop on "Teaching Alternative Methods to Animal Experimentation". A compilation of the recommendations by a group of international specialists in the field is summarized in this report. Initially, the workshop participants identified the different audience groups to be addressed and also the communication media that may be used. The main outcome of the workshop was a framework for a comprehensive educational program. The modular structure of the teaching program presented here allows adaptation to different audiences with their specific needs; different time schedules can be easily accommodated on this basis. The topics cover the 3Rs principle, basic research, toxicological applications, method development and validation, regulatory aspects, case studies, and ethical aspects of 3Rs approaches. This expert consortium agreed to generate teaching materials covering all modules and to provide them in an open-access online repository.
Recent developments - US spent fuel disposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
One of a US utility's major risk factors in continuing to operate a nuclear plant is managing discharged spent fuel. The US Department of Energy (DOE) signed contracts with utilities guaranteeing government acceptance of spent fuel by 1988. However, on December 17, 1992, DOE Secretary Watkins wrote to Sen. J. Bennett Johnston (D-LA), Chairman of the Senate Energy Committee, indicating a reassessment of DOE's programs, the results of which will be presented to Congress in January 1993. He indicated the Department may not be able to meet the 1988 date, because of difficulty in finding a site for the Monitored Retrievable Storage facility. Watkins indicated that DOE has investigated an interim solution and decided to expedite a program to certify a multi-purpose standardized cask system for spent fuel receipt, storage, transport, and disposal. To meet the expectations of US utilities, DOE is considering a plan to use federal sites for interim storage of the casks. Secretary Watkins recommended the waste program be taken off-budget and put in a revolving fund established to ensure that money already collected from utilities will be available to meet the schedule for completion of the repository.
Issues to consider in the derivation of water quality benchmarks for the protection of aquatic life.
Schneider, Uwe
2014-01-01
While water quality benchmarks for the protection of aquatic life have been in use in some jurisdictions for several decades (USA, Canada, several European countries), more and more countries are now setting up their own national water quality benchmark development programs. In doing so, they either adopt an existing method from another jurisdiction, update an existing approach, or develop their own new derivation method. Each approach has its own advantages and disadvantages, and many issues have to be addressed when setting up a water quality benchmark development program or when deriving a water quality benchmark. Each of these tasks requires special expertise. They may seem simple, but are complex in their details. The intention of this paper is to provide some guidance for this process of water quality benchmark development at the program level, in the development of the derivation methodology, and in the actual benchmark derivation step, as well as to point out some issues (notably the inclusion of adapted populations and cryptic species, and points to consider in the use of the species sensitivity distribution approach) and future opportunities (an international data repository and international collaboration in water quality benchmark development).
Smoothing Data Friction through building Service Oriented Data Platforms
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Richards, C. J.; Evans, B. J. K.; Wang, J.; Druken, K. A.
2017-12-01
Data Friction has been commonly defined as the costs in time, energy, and attention required simply to collect, check, store, move, receive, and access data. On average, researchers spend a significant fraction of their time finding the data for their research project and then reformatting it so that it can be used by the software application of their choice. There is an increasing role for both data repositories and software to be modernised to help reduce data friction in ways that support the better use of the data. Many generic data repositories simply accept data in the format as supplied: the key check is that the data have sufficient metadata to enable discovery and download. Few generic repositories have both the expertise and infrastructure to support the multiple domain-specific requirements that facilitate the increasing need for integration and reusability. In contrast, major science domain-focused repositories are increasingly able to implement and enforce community-endorsed best practices and guidelines that ensure reusability and harmonization of data for use within the community by offering semi-automated QC workflows to improve the quality of submitted data. The most advanced of these science repositories now operate as service-oriented data platforms that extend the use of data across domain silos and increasingly provide server-side, programmatically-enabled access to data via network protocols and community-standard APIs. To provide this, more rigorous QA/QC procedures are needed to validate data against standards and community software and tools. This ensures that the data can be accessed in expected ways and also demonstrates that the data work across the different (non-domain-specific) packages, tools, and programming languages deployed by the various user communities. In Australia, the National Computational Infrastructure (NCI) has created such a service-oriented data platform which is demonstrating how this approach can reduce data friction, servicing both individual domains as well as facilitating cross-domain collaboration. The approach has required an increase in effort for the repository to provide the additional expertise, so as to enable a better capability and a more efficient system which ultimately saves time for the individual researcher.
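Server-side, API-mediated access of the kind described above can be sketched briefly. The endpoint URL and variable name below are placeholders, assuming an OPeNDAP service in front of a CF-compliant dataset; NCI's actual service catalog is not reproduced here.

```python
import xarray as xr

# Sketch of programmatic, server-side subsetting via OPeNDAP. The URL and
# variable name ("tas") are hypothetical; any THREDDS/OPeNDAP endpoint
# serving a CF-compliant dataset would behave the same way.
url = "https://example.org/thredds/dodsC/some/dataset.nc"  # placeholder

ds = xr.open_dataset(url)                  # lazy: only metadata is fetched
subset = ds["tas"].sel(time="2015-01",     # the server ships just this slice,
                       lat=slice(-44, -10),
                       lon=slice(112, 154))
print(subset.mean().values)                # reduce locally after the subset
```

The friction reduction comes from never downloading or reformatting the full file: the repository serves exactly the slice the analysis needs, in a form the user's tool already understands.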
The SeaView EarthCube project: Lessons Learned from Integrating Across Repositories
NASA Astrophysics Data System (ADS)
Diggs, S. C.; Stocks, K. I.; Arko, R. A.; Kinkade, D.; Shepherd, A.; Olson, C. J.; Pham, A.
2017-12-01
SeaView is an NSF-funded EarthCube Integrative Activity Project working with 5 existing data repositories* to provide oceanographers with highly integrated thematic data collections in user-requested formats. The project has three complementary goals: Supporting Scientists: SeaView targets scientists' need for easy access to data of interest that are ready to import into their preferred tool. Strengthening Repositories: By integrating data from multiple repositories for science use, SeaView is helping the ocean data repositories align their data and processes and make ocean data more accessible and easily integrated. Informing EarthCube (earthcube.org): SeaView's experience as an integration demonstration can inform the larger NSF EarthCube architecture and design effort. The challenges faced in this small-scale effort are informative to geosciences cyberinfrastructure more generally. Here we focus on the lessons learned that may inform other data facilities and integrative architecture projects. (The SeaView data collections will be presented at the Ocean Sciences 2018 meeting.) One example is the importance of shared semantics, with persistent identifiers, for key integration elements across the data sets (e.g., cruise, parameter, and project/program). These must allow for revision through time and should have an agreed authority or process for resolving conflicts: aligning identifiers and correcting errors were time consuming and often required both deep domain knowledge and "back end" knowledge of the data facilities. Another example is the need for robust provenance, and for tools that support automated or semi-automated data transform pipelines that capture provenance. Multiple copies and versions of data are now flowing into repositories, and onward to long-term archives such as NOAA NCEI and umbrella portals such as DataONE. Exact copies can be identified with hashes (for those that have the skills), but it can be painfully difficult to understand the processing or format changes that differentiate versions. As more sensors are deployed and data re-use increases, this will only become more challenging. We will discuss these and additional lessons learned, as well as invite discussion and solutions from others doing similar work. * BCO-DMO, CCHDO, OBIS, OOI, R2R
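The point above about identifying exact copies with hashes can be made concrete. The sketch below groups files by SHA-256 digest so byte-identical copies across repositories collapse together; the function names are illustrative, and note that this detects only exact duplicates, not transformed or reformatted versions.

```python
import hashlib
from pathlib import Path

# Group files by content digest: byte-identical copies share a digest,
# regardless of filename or which repository they came from.

def sha256_of(path: Path, chunk: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):   # stream in 1 MB chunks
            h.update(block)
    return h.hexdigest()

def group_exact_copies(paths):
    groups = {}
    for p in map(Path, paths):
        groups.setdefault(sha256_of(p), []).append(p)
    # Keep only digests with more than one file: the exact duplicates.
    return {d: ps for d, ps in groups.items() if len(ps) > 1}
```

Anything this check does not catch (same data, different format or processing) is exactly the harder provenance problem the paragraph above describes.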
NASA Astrophysics Data System (ADS)
Ismail, A. E.; Xiong, Y.; Nowak, E. J.; Brush, L. H.
2009-12-01
The Waste Isolation Pilot Plant (WIPP) is a U.S. Department of Energy (DOE) repository in southeast New Mexico for defense-related transuranic (TRU) waste. Every five years, the DOE is required to submit an application to the Environmental Protection Agency (EPA) demonstrating the WIPP's continuing compliance with the applicable EPA regulations governing the repository. Part of this recertification effort involves a performance assessment, a probabilistic evaluation of the repository performance with respect to regulatory limits on the amount of releases from the repository to the accessible environment. One of the models used as part of the performance assessment process is a geochemistry model, which predicts solubilities of the radionuclides in the brines that may enter the repository in the different scenarios considered by the performance assessment. The dissolved actinide source term comprises actinide solubilities, which are input parameters for modeling the transport of radionuclides as a result of brine flow through and from the repository. During a performance assessment, the solubilities are modeled as the product of a "base" solubility determined from calculations based on the chemical conditions expected in the repository, and an uncertainty factor that describes the potential deviations of the model from expected behavior. We will focus here on a discussion of the uncertainties. To compute a cumulative distribution function (CDF) for the uncertainties, we compare published, experimentally measured solubility data to predictions made using the established WIPP geochemistry model. The differences between the solubilities observed for a given experiment and the calculated solubilities from the model are used to form the overall CDF, which is then sampled as part of the performance assessment. We will discuss the methodology used to update the CDFs for the +III actinides, obtained from data for Nd, Am, and Cm, and the +IV actinides, obtained from data for Th, and present results for the calculations of the updated CDFs. We compare the CDFs to the distributions computed for the previous recertification, and discuss the potential impact of the changes on the geochemistry model. This research is funded by WIPP programs administered by the U.S. Department of Energy. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
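The uncertainty-factor construction described above can be sketched numerically: form an empirical CDF from observed-minus-predicted solubility differences and sample it by inverse transform. The residual values and the log10 convention below are illustrative assumptions, not WIPP data or the WIPP methodology in detail.

```python
import numpy as np

# Invented residuals: log10(observed solubility) - log10(model solubility).
residuals = np.array([-0.8, -0.3, -0.1, 0.0, 0.2, 0.4, 0.9])

def empirical_cdf_sampler(resid, rng=np.random.default_rng(1)):
    resid = np.sort(resid)
    def sample(n):
        # Inverse-CDF sampling: draw uniform quantiles, interpolate residuals.
        u = rng.uniform(0, 1, n)
        q = np.interp(u, np.linspace(0, 1, resid.size), resid)
        return 10.0 ** q   # back-transform to a multiplicative factor
    return sample

draw = empirical_cdf_sampler(residuals)
print(draw(5))  # multiplicative uncertainty factors to apply to a base solubility
```

Each sampled factor multiplies the base solubility in one realization of the probabilistic assessment, so the spread of the residuals propagates directly into the spread of predicted releases.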
ACToR: Aggregated Computational Toxicology Resource (T) ...
The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has 3 main goals: (1) To serve as a repository of public toxicology information on chemicals of interest to the EPA, and in particular to be a central source for the testing data on all chemicals regulated by all EPA programs; (2) To be a source of in vivo training data sets for building in vitro to in vivo computational models; (3) To serve as a central source of chemical structure and identity information for the ToxCastTM and Tox21 programs. There are 4 main databases, all linked through a common set of chemical information and a common structure linking chemicals to assay data: the public ACToR system (available at http://actor.epa.gov); the ToxMiner database holding ToxCast and Tox21 data, along with results from statistical analyses on these data; the Tox21 chemical repository, which is managing the ordering and sample tracking process for the larger Tox21 project; and the public version of ToxRefDB. The public ACToR system contains information on ~500K compounds with toxicology, exposure, and chemical property information from >400 public sources. The web site is visited by ~1,000 unique users per month and generates ~1,000 page requests per day on average. The databases are built on open source technology, which has allowed us to export them to a number of col
NASA Astrophysics Data System (ADS)
Stall, S.
2015-12-01
Much Earth and space science data and metadata are managed and supported by an infrastructure of repositories, ranging from large agency or instrument facilities, to institutions, to smaller repositories including labs. Scientists face many challenges in this ecosystem, both in storing their data and in accessing data from others for new research. Critical for all uses is ensuring the credibility and integrity of the data and conveying that, along with provenance information, now and in the future. Accurate information is essential for future researchers to find (or discover) the data, evaluate the data for use (content, temporal, geolocation, precision), and finally select (or discard) that data as meeting a "fit-for-purpose" criterion. We also need to optimize the effort it takes to describe the data for these determinations, which means making it efficient for the researchers who collect the data. At AGU we are developing a program aimed at helping repositories, and thereby researchers, improve data quality and data usability toward these goals. AGU has partnered with the CMMI Institute to develop their Data Management Maturity (DMM) framework within the Earth and space sciences. The CMMI DMM framework guides best practices in a range of data operations, and the application of the DMM, through an assessment, reveals how repositories and institutions can best optimize efforts to improve operations and functionality throughout the data lifecycle and elevate best practices across a variety of data management operations. Supporting processes like data operations, data governance, and data architecture are included. An assessment involves identifying accomplishments and weaknesses compared to leading practices for data management. Broad application of the DMM can help improve quality in data and operations, and consistency across the community that will facilitate interoperability, discovery, preservation, and reuse. Good data can be better data. Consistency results in sustainability.
Granite disposal of U.S. high-level radioactive waste.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeze, Geoffrey A.; Mariner, Paul E.; Lee, Joon H.
This report evaluates the feasibility of disposing of U.S. high-level radioactive waste in granite several hundred meters below the surface of the earth. The U.S. has many granite formations with positive attributes for permanent disposal. Similar crystalline formations have been extensively studied by international programs, two of which, in Sweden and Finland, are the host rocks of submitted or imminent repository license applications. This report is enabled by the advanced work of the international community to establish functional and operational requirements for disposal of a range of waste forms in granite media. In this report we develop scoping performance analyses, based on the applicable features, events, and processes (FEPs) identified by international investigators, to support generic conclusions regarding post-closure safety. Unlike the safety analyses for disposal in salt, shale/clay, or deep boreholes, the safety analysis for a mined granite repository depends largely on waste package preservation. In crystalline rock, waste packages are preserved by the high mechanical stability of the excavations, the diffusive barrier of the buffer, and favorable chemical conditions. The buffer is preserved by low groundwater fluxes, favorable chemical conditions, backfill, and the rigid confines of the host rock. An added advantage of a mined granite repository is that waste packages would be fairly easy to retrieve, should retrievability be an important objective. The results of the safety analyses performed in this study are consistent with the results of comprehensive safety assessments performed for sites in Sweden, Finland, and Canada. They indicate that a granite repository would satisfy established safety criteria and suggest that a small number of FEPs would largely control the release and transport of radionuclides. In the event the U.S. decides to pursue a potential repository in granite, a detailed evaluation of these FEPs would be needed to inform site selection and safety assessment.
Burchill, C; Roos, L L; Fergusson, P; Jebamani, L; Turner, K; Dueck, S
2000-01-01
Comprehensive data available in the Canadian province of Manitoba since 1970 have aided study of the interaction between population health, health care utilization, and structural features of the health care system. Given a complex linked database and many ongoing projects, better organization of available epidemiological, institutional, and technical information was needed. The Manitoba Centre for Health Policy and Evaluation wished to develop a knowledge repository to handle data, document research methods, and facilitate both internal communication and collaboration with other sites. This evolving knowledge repository consists of both public and internal (restricted access) pages on the World Wide Web (WWW). Information can be accessed using an indexed logical format or queried to allow entry at user-defined points. The main topics are: Concept Dictionary, Research Definitions, Meta-Index, and Glossary. The Concept Dictionary operationalizes concepts used in health research using administrative data, outlining the creation of complex variables. Research Definitions specify the codes for common surgical procedures, tests, and diagnoses. The Meta-Index organizes concepts and definitions according to the Medical Subject Headings (MeSH) system developed by the National Library of Medicine. The Glossary facilitates navigation through the research terms and abbreviations in the knowledge repository. An Education Resources heading presents a web-based graduate course using substantial amounts of material in the Concept Dictionary, a lecture in the Epidemiology Supercourse, and material for Manitoba's Regional Health Authorities. Confidential information (including Data Dictionaries) is available on the Centre's internal website. Use of the public pages has increased dramatically since January 1998, with almost 6,000 page hits from 250 different hosts in May 1999. More recently, the number of page hits has averaged around 4,000 per month, while the number of unique hosts has climbed to around 400. This knowledge repository promotes standardization and increases efficiency by placing concepts and associated programming in the Centre's collective memory. Collaboration and project management are facilitated.
NASA Technical Reports Server (NTRS)
Thomas, D.; Fitts, M.; Wear, M.; VanBaalen, M.
2011-01-01
As NASA transitions from the Space Shuttle era into the next phase of space exploration, the need to ensure the capture, analysis, and application of its research and medical data is of greater urgency than at any previous time. In this era of limited resources and challenging schedules, the Human Research Program (HRP) based at NASA's Johnson Space Center (JSC) recognizes the need to extract the greatest possible amount of information from the data already captured, as well as focus current and future research funding on addressing the HRP goal to provide human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. To this end, the Science Management Office and the Medical Informatics and Health Care Systems Branch within the HRP and the Space Medicine Division have been working to make both research data and clinical data more accessible to the user community. The Life Sciences Data Archive (LSDA), the research repository housing data and information regarding the physiologic effects of microgravity, and the Lifetime Surveillance of Astronaut Health Repository (LSAH-R), the clinical repository housing astronaut data, have joined forces to achieve this goal. The task of both repositories is to acquire, preserve, and distribute data and information both within the NASA community and to the science community at large. This is accomplished via the LSDA's public website (http://lsda.jsc.nasa.gov), which provides access to experiment descriptions, including hardware, datasets, key personnel, and mission descriptions, and offers a mechanism for researchers to request additional data, both research and clinical, that are not accessible from the public website. This will make the work of NASA and its partners available to the wider science community, both domestic and international. The desired outcome is the use of these data for knowledge discovery, retrospective analysis, and planning of future research studies.
NASA Astrophysics Data System (ADS)
Rack, F. R.
2005-12-01
The Integrated Ocean Drilling Program (IODP: 2003-2013 initial phase) is the successor to the Deep Sea Drilling Project (DSDP: 1968-1983) and the Ocean Drilling Program (ODP: 1985-2003). These earlier scientific drilling programs amassed collections of sediment and rock cores (over 300 kilometers stored in four repositories) and data organized in distributed databases and in print or electronic publications. International members of the IODP have established, through memoranda, the right to have access to: (1) all data, samples, scientific and technical results, and all engineering plans, data, or other information produced under contract to the program; and (2) all data from geophysical and other site surveys performed in support of the program which are used for drilling planning. The challenge that faces the individual platform operators and management of IODP is to find the right balance and appropriate synergies among the needs, expectations, and requirements of stakeholders. The evolving model for IODP database services consists of the management and integration of data collected onboard the various IODP platforms (including downhole logging and syn-cruise site survey information), legacy data from DSDP and ODP, data derived from post-cruise research and publications, and other IODP-relevant information types, to form a common, program-wide IODP information system (e.g., the IODP Portal) which will be accessible to both researchers and the public. The JANUS relational database of ODP was introduced in 1997 and the bulk of ODP shipboard data has been migrated into this system, which comprises a relational data model consisting of over 450 tables. The JANUS database includes paleontological, lithostratigraphic, chemical, physical, sedimentological, and geophysical data from a global distribution of sites. For ODP Legs 100 through 210, and including IODP Expeditions 301 through 308, JANUS has been used to store data from 233,835 meters of core recovered, comprising 38,039 cores, with 202,281 core sections stored in repositories, from which 2,299,180 samples have been taken for scientists and other users (http://iodp.tamu.edu/janusweb/general/dbtable.cgi). JANUS and other IODP databases are viewed as components of an evolving distributed network of databases, supported by metadata catalogs and middleware with XML workflows, that are intended to provide access to DSDP/ODP/IODP cores and sample-based data as well as other distributed geoscience data collections (e.g., CHRONOS, PetDB, SedDB). These data resources can be explored through the use of emerging data visualization environments, such as GeoWall; CoreWall (http://www.evl.uic.edu/cavern/corewall), a multi-screen display for viewing cores and related data; GeoWall-2; and LambdaVision, a very-high-resolution, networked environment for data exploration and visualization. The U.S. Implementing Organization (USIO) for the IODP, also known as the JOI Alliance, is a partnership between Joint Oceanographic Institutions (JOI), Texas A&M University, and Lamont-Doherty Earth Observatory of Columbia University. JOI is a consortium of 20 premier oceanographic research institutions that serves the U.S. scientific community by leading large-scale, global research programs in scientific ocean drilling and ocean observing. For more than 25 years, JOI has helped facilitate discovery and advance global understanding of the Earth and its oceans through excellence in program management.
NASA Astrophysics Data System (ADS)
Benedict, K. K.; Servilla, M. S.; Vanderbilt, K.; Wheeler, J.
2015-12-01
The growing volume, variety and velocity of production of Earth science data magnifies the impact of inefficiencies in data acquisition, processing, analysis, and sharing workflows, potentially to the point of impairing the ability of researchers to accomplish their desired scientific objectives. The adaptation of agile software development principles (http://agilemanifesto.org/principles.html) to data curation processes has significant potential to lower barriers to effective scientific data discovery and reuse - barriers that otherwise may force the development of new data to replace existing but unusable data, or require substantial effort to make data usable in new research contexts. This paper outlines a data curation process that was developed at the University of New Mexico that provides a cross-walk of data and associated documentation between the data archive developed by the Long Term Ecological Research (LTER) Network Office (PASTA - http://lno.lternet.edu/content/network-information-system) and UNM's institutional repository (LoboVault - http://repository.unm.edu). The developed automated workflow enables the replication of versioned data objects and their associated standards-based metadata between the LTER system and LoboVault - providing long-term preservation for those data/metadata packages within LoboVault while maintaining the value-added services that the PASTA platform provides. The relative ease with which this workflow was developed is a product of the capabilities independently developed on both platforms - including the simplicity of providing a well-documented application programming interface (API) for each platform enabling scripted interaction and the use of well-established documentation standards (EML in the case of PASTA, Dublin Core in the case of LoboVault) by both systems. These system characteristics, when combined with an iterative process of interaction between the Data Curation Librarian (on the LoboVault side of the process), the Sevilleta LTER Information Manager and the LTER Network Information System developer, yielded a rapid and relatively streamlined process for targeted replication of data and metadata between the two systems - increasing the discoverability and usability of the LTER data assets.
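The scripted cross-walk described above can be outlined in a few calls. The endpoint paths, deposit route, and package identifier scheme below are hypothetical stand-ins; the real PASTA and LoboVault APIs define their own routes and authentication, which this sketch does not reproduce.

```python
import requests

PASTA = "https://pasta.example.org"       # hypothetical PASTA base URL
VAULT = "https://repository.unm.edu/api"  # hypothetical LoboVault API root

def replicate_package(package_id: str):
    """Copy one versioned data package from the LTER archive to the IR."""
    # 1. Fetch the EML metadata document for the data package.
    eml = requests.get(f"{PASTA}/metadata/eml/{package_id}", timeout=60)
    eml.raise_for_status()
    # 2. Fetch the list of versioned data entities in the package.
    entities = requests.get(f"{PASTA}/data/list/{package_id}", timeout=60)
    entities.raise_for_status()
    # 3. Deposit metadata plus the entity manifest into the repository.
    deposit = requests.post(f"{VAULT}/deposit", timeout=60,
                            files={"metadata": ("eml.xml", eml.content)},
                            data={"package_id": package_id,
                                  "entity_list": entities.text})
    deposit.raise_for_status()
    return deposit.json()
```

The point the paragraph makes holds regardless of the exact routes: because both platforms expose documented APIs and standard metadata (EML, Dublin Core), the replication reduces to a small script rather than a manual curation project.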
DOE Office of Scientific and Technical Information (OSTI.GOV)
ECONOMY,KATHLEEN M.; HELTON,JON CRAIG; VAUGHN,PALMER
1999-10-01
The Waste Isolation Pilot Plant (WIPP), which is located in southeastern New Mexico, is being developed for the geologic disposal of transuranic (TRU) waste by the U.S. Department of Energy (DOE). Waste disposal will take place in panels excavated in a bedded salt formation approximately 2000 ft (610 m) below the land surface. The BRAGFLO computer program, which solves a system of nonlinear partial differential equations for two-phase flow, was used to investigate brine and gas flow patterns in the vicinity of the repository for the 1996 WIPP performance assessment (PA). The present study examines the implications of modeling assumptions used in conjunction with BRAGFLO in the 1996 WIPP PA that affect brine and gas flow patterns involving two waste regions in the repository (i.e., a single waste panel and the remaining nine waste panels), a disturbed rock zone (DRZ) that lies just above and below these two regions, and a borehole that penetrates the single waste panel and a brine pocket below this panel. The two waste regions are separated by a panel closure. The following insights were obtained from this study. First, the impediment to flow between the two waste regions provided by the panel closure model is reduced due to the permeable and areally extensive nature of the DRZ adopted in the 1996 WIPP PA, which results in the DRZ becoming an effective pathway for gas and brine movement around the panel closures and thus between the two waste regions. Brine and gas flow between the two waste regions via the DRZ causes pressures in the two to equilibrate rapidly, with the result that processes in the intruded waste panel are not isolated from the rest of the repository. Second, the connection between intruded and unintruded waste panels provided by the DRZ increases the time required for repository pressures to equilibrate with the overlying and/or underlying units subsequent to a drilling intrusion. Third, the large and areally extensive DRZ void volume is a significant source of brine to the repository; this brine is consumed in the corrosion of iron and thus contributes to increased repository pressures. Fourth, the DRZ itself lowers repository pressures by providing storage for gas and access to additional gas storage in areas of the repository. Fifth, given the pathway that the DRZ provides for gas and brine to flow around the panel closures, isolation of the waste panels by the panel closures was not essential to compliance with the U.S. Environmental Protection Agency's regulations in the 1996 WIPP PA.
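The first insight above, rapid pressure equilibration through the DRZ despite the panel closure, can be caricatured with a toy two-region exchange model. This is an invented illustration with made-up coefficients, not BRAGFLO, which solves the full nonlinear two-phase flow equations.

```python
import numpy as np

# Toy linear exchange between the intruded panel (p1) and the remaining
# panels (p2): flow is routed through two parallel pathways, a permeable
# DRZ and a tight panel closure. All numbers are invented for illustration.

dt, t_end = 0.1, 50.0                 # years
k_drz, k_closure = 0.5, 0.01          # exchange coefficients, DRZ >> closure
p1, p2 = 12.0, 6.0                    # MPa: intruded panel vs. other panels

for _ in np.arange(0.0, t_end, dt):
    q = (k_drz + k_closure) * (p1 - p2)   # dominated by the DRZ pathway
    p1 -= q * dt
    p2 += q * dt

print(f"after {t_end:.0f} yr: p1 = {p1:.2f} MPa, p2 = {p2:.2f} MPa")
```

With the DRZ coefficient fifty times the closure's, the two pressures converge within a few model years, mirroring the qualitative finding that the closure alone cannot isolate the intruded panel.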
GRAbB: Selective Assembly of Genomic Regions, a New Niche for Genomic Research.
Brankovics, Balázs; Zhang, Hao; van Diepeningen, Anne D; van der Lee, Theo A J; Waalwijk, Cees; de Hoog, G Sybren
2016-06-01
GRAbB (Genomic Region Assembly by Baiting) is a new program dedicated to assembling specific genomic regions from NGS data. This approach is especially useful when dealing with multi-copy regions, such as the mitochondrial genome and the rDNA repeat region, parts of the genome that are often neglected or poorly assembled even though they contain interesting information from phylogenetic or epidemiologic perspectives; single-copy regions can be assembled as well. The program is capable of targeting multiple regions within a single run. Furthermore, GRAbB can be used to extract specific loci from NGS data based on homology, such as sequences used for barcoding. To make the assembly specific, a known part of the region, such as the sequence of a PCR amplicon or a homologous sequence from a related species, must be specified. By assembling only the region of interest, the assembly process is computationally much less demanding and may lead to assemblies of better quality. In this study the different applications and functionalities of the program are demonstrated, such as exhaustive assembly (rDNA region and mitochondrial genome), extracting homologous regions or genes (IGS, RPB1, RPB2, and TEF1a), and extracting multiple regions within a single run. The program is also compared with MITObim, which is meant for the exhaustive assembly of a single target based on a similar query sequence. GRAbB is shown to be more efficient than MITObim in terms of speed, memory, and disk usage. The other functionalities (handling multiple targets simultaneously and extracting homologous regions) of the new program are not matched by other programs. The program is available with explanatory documentation at https://github.com/b-brankovics/grabb. GRAbB has been tested on Ubuntu (12.04 and 14.04), Fedora (23), CentOS (7.1.1503), and Mac OS X (10.7). Furthermore, GRAbB is available as a docker repository: brankovics/grabb (https://hub.docker.com/r/brankovics/grabb/).
Tran, Linh; Yiannoutsos, Constantin T.; Musick, Beverly S.; Wools-Kaloustian, Kara K.; Siika, Abraham; Kimaiyo, Sylvester; van der Laan, Mark J.; Petersen, Maya
2017-01-01
In conducting studies on an exposure of interest, a systematic roadmap should be applied for translating causal questions into statistical analyses and interpreting the results. In this paper we describe an application of one such roadmap to estimating the joint effect of both time to availability of a nurse-based triage system (low-risk express care (LREC)) and individual enrollment in the program among HIV patients in East Africa. Our study population comprises 16,513 subjects found eligible for this task-shifting program within 15 clinics in Kenya between 2006 and 2009, with each clinic starting the LREC program between 2007 and 2008. After discretizing follow-up into 90-day time intervals, we targeted the population mean counterfactual outcome (i.e., the counterfactual probability of either dying or being lost to follow-up) at up to 450 days after initial LREC eligibility under three fixed treatment interventions. These were (i) no program availability during the entire follow-up, (ii) immediate program availability at initial eligibility, but non-enrollment during the entire follow-up, and (iii) immediate program availability and enrollment at initial eligibility. We further estimated the controlled direct effect of immediate program availability compared to no program availability, under a hypothetical intervention to prevent individual enrollment in the program. Targeted minimum loss-based estimation was used to estimate the mean outcome, while Super Learning was implemented to estimate the required nuisance parameters. Analyses were conducted with the ltmle R package; analysis code is available at an online repository as an R package. Results showed that at 450 days, the probability of in-care survival for subjects with immediate availability and enrollment was 0.93 (95% CI: 0.91, 0.95) and 0.87 (95% CI: 0.86, 0.87) for subjects with immediate availability never enrolling. For subjects without LREC availability, it was 0.91 (95% CI: 0.90, 0.92). Immediate program availability without individual enrollment, compared to no program availability, was estimated to slightly albeit significantly decrease survival by 4% (95% CI: 0.03, 0.06, p<0.01). Immediate availability and enrollment resulted in a 7% higher in-care survival compared to immediate availability with non-enrollment after 450 days (95% CI: −0.08, −0.05, p<0.01). The results are consistent with a fairly small impact of both availability and enrollment in the LREC program on in-care survival. PMID:28736692
Preservation of Earth Science Data History with Digital Content Repository Technology
NASA Astrophysics Data System (ADS)
Wei, Y.; Pan, J.; Shrestha, B.; Cook, R. B.
2011-12-01
An increasing need for derived and on-demand data products in Earth science research makes digital content more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, this increasing need presents additional challenges in managing data processing history information and delivering such information to end users. For example, the North American Carbon Program (NACP) Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) chose a modified SYNMAP land cover dataset as one of the input driver data for participating terrestrial biospheric models. The global 1 km resolution SYNMAP data was created by harmonizing 3 remote sensing-based land cover products: GLCC, GLC2000, and the MODIS land cover product. The original SYNMAP land cover data was aggregated into half and quarter degree resolution. It was then enhanced with more detailed grassland and cropland types. Currently there is no effective mechanism for conveying this data processing information to the different modeling teams so that they can determine whether a data product meets their needs; the process still relies heavily on offline human interaction. The NASA-sponsored ORNL DAAC has leveraged contemporary digital object repository technology to promote the representation, management, and delivery of data processing history and provenance information. Within a digital object repository, different data products are managed as objects, with metadata as attributes and content delivery and management services as dissemination methods. Derivation relationships among data products can be semantically referenced between digital objects. Within the repository, data users can easily track a derived data product back to its origin, explore metadata and documents about each intermediate data product, and discover the processing details involved in each derivation step. Coupled with the Drupal Web Content Management System, the digital repository interface was enhanced to provide an intuitive graphic representation of the data processing history. Each data product is also associated with a formal metadata record in FGDC standards, and the main fields of the FGDC record are indexed for search and displayed as attributes of the data product. These features enable data users to better understand and consume a data product. The representation of data processing history in a digital repository can further promote long-term data preservation. Lineage information is a major factor in making digital data understandable and usable long into the future. Derivation references can be set up between digital objects not only within a single digital repository, but also across multiple distributed digital repositories. Along with emerging identification mechanisms, such as the Digital Object Identifier (DOI), a flexible distributed digital repository network can be set up to better preserve digital content. In this presentation, we describe how digital content repository technology can be used to manage, preserve, and deliver digital data processing history information in the Earth science research domain, with selected data archived at the ORNL DAAC and the Model and Synthesis Thematic Data Center (MAST-DC) as testing targets.
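The derivation chain described above can be sketched as linked records, using the SYNMAP example. The identifiers and field names are illustrative assumptions, not the DAAC's actual repository schema.

```python
# Toy provenance records: each object points at what it was derived from.
provenance = [
    {"id": "synmap-1km", "title": "SYNMAP global land cover, 1 km",
     "derived_from": ["GLCC", "GLC2000", "MODIS land cover"],
     "process": "harmonization of three remote-sensing products"},
    {"id": "synmap-halfdeg", "title": "SYNMAP aggregated to 0.5/0.25 degree",
     "derived_from": ["synmap-1km"],
     "process": "spatial aggregation"},
    {"id": "synmap-mstmip", "title": "MsTMIP driver land cover",
     "derived_from": ["synmap-halfdeg"],
     "process": "grassland/cropland type enhancement"},
]

def lineage(obj_id, records):
    """Walk derived_from links back to the original source products."""
    by_id = {r["id"]: r for r in records}
    chain, frontier = [], [obj_id]
    while frontier:
        cur = frontier.pop()
        rec = by_id.get(cur)
        if rec:
            chain.append(rec["id"])
            frontier.extend(rec["derived_from"])
        else:
            chain.append(cur)   # external source outside this repository
    return chain

print(lineage("synmap-mstmip", provenance))
# -> the MsTMIP driver traced back through aggregation to GLCC/GLC2000/MODIS
```

Because the links are just references between object identifiers, the same walk works across repositories once persistent identifiers such as DOIs are used in place of local ids.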
Hartman, Victoria; Castillo-Pelayo, Tania; Babinszky, Sindy; Dee, Simon; Leblanc, Jodi; Matzke, Lise; O'Donoghue, Sheila; Carpenter, Jane; Carter, Candace; Rush, Amanda; Byrne, Jennifer; Barnes, Rebecca; Mes-Messons, Anne-Marie; Watson, Peter
2018-02-01
Ongoing quality management is an essential part of biobank operations and the creation of high quality biospecimen resources. Adhering to the standards of a national biobanking network is a way to reduce variability between individual biobank processes, resulting in cross biobank compatibility and more consistent support for health researchers. The Canadian Tissue Repository Network (CTRNet) implemented a set of required operational practices (ROPs) in 2011 and these serve as the standards and basis for the CTRNet biobank certification program. A review of these 13 ROPs covering 314 directives was conducted after 5 years to identify areas for revision and update, leading to changes to 7/314 directives (2.3%). A review of all internal controlled documents (including policies, standard operating procedures and guides, and forms for actions and processes) used by the BC Cancer Agency's Tumor Tissue Repository (BCCA-TTR) to conform to these ROPs was then conducted. Changes were made to 20/106 (19%) of BCCA-TTR documents. We conclude that a substantial fraction of internal controlled documents require updates at regular intervals to accommodate changes in best practices. Reviewing documentation is an essential aspect of keeping up to date with best practices and ensuring the quality of biospecimens and data managed by biobanks.
Permanent Disposal of Nuclear Waste in Salt
NASA Astrophysics Data System (ADS)
Hansen, F. D.
2016-12-01
Salt formations hold promise for permanent isolation of nuclear waste from the biosphere. Germany and the United States have ample salt formations for this purpose, ranging from flat-bedded formations to geologically mature dome structures. Both nations are revisiting nuclear waste disposal options, accompanied by extensive collaboration on applied salt repository research, design, and operation. Salt formations provide isolation while geotechnical barriers reestablish impermeability after waste is placed in the geology. Between excavation and closure, physical, mechanical, thermal, chemical, and hydrological processes ensue. Salt response over a range of stress and temperature has been characterized for decades. Research practices employ refined test techniques and controls, which improve parameter assessment for features of the constitutive models. Extraordinary computational capabilities require exacting understanding of laboratory measurements and objective interpretation of modeling results. A repository for heat-generating nuclear waste provides an engineering challenge beyond common experience. Long-term evolution of the underground setting is precluded from direct observation or measurement; therefore, analogues and modeling predictions are necessary to establish enduring safety functions. A strong case for granular salt reconsolidation and a focused research agenda support salt repository concepts that include safety-by-design. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Case, J.B.; Buesch, D.C.
2004-01-01
Predictions of waste canister and repository driftwall temperatures as functions of space and time are important to evaluate pre-closure performance of the proposed repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain, Nevada. Variations in the lithostratigraphic features in densely welded and crystallized rocks of the 12.8-million-year-old Topopah Spring Tuff, especially the porosity resulting from lithophysal cavities, affect thermal properties. A simulated emplacement drift is based on projecting lithophysal cavity porosity values 50 to 800 m from the Enhanced Characterization of the Repository Block cross drift. Lithophysal cavity porosity varies from 0.00 to 0.05 cm3/cm3 in the middle nonlithophysal zone and from 0.03 to 0.28 cm3/cm3 in the lower lithophysal zone. A ventilation model and computer program titled "Monte Carlo Simulation of Ventilation" (MCSIMVENT), which is based on a composite thermal-pulse calculation, simulates statistical variability and uncertainty of rock-mass thermal properties and ventilation performance along a simulated emplacement drift for a pre-closure period of 50 years. Although ventilation efficiency is relatively insensitive to thermal properties, variations in lithophysal porosity along the drift can result in peak driftwall temperatures ranging from 40 to 85 °C for the pre-closure period. Copyright © 2004 by ASME.
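To illustrate the Monte Carlo idea behind such a ventilation calculation, here is a minimal sketch that samples lithophysal porosity from the reported lower-lithophysal range and maps it to a peak driftwall temperature; the linear porosity-temperature relation is an assumed toy model, not the composite thermal-pulse calculation itself.

```python
# Illustrative Monte Carlo over rock-mass properties along a drift.
# The porosity-to-temperature relation below is an assumed toy model,
# not the MCSIMVENT thermal-pulse calculation.
import random

def peak_driftwall_temp(porosity, t_min=40.0, t_max=85.0, p_max=0.28):
    """Toy linear map: higher lithophysal porosity -> lower conductivity
    -> hotter driftwall, bounded by the reported 40-85 degC range."""
    return t_min + (t_max - t_min) * min(porosity / p_max, 1.0)

random.seed(42)
samples = []
for _ in range(10_000):
    # Sample porosity from the reported lower-lithophysal range.
    phi = random.uniform(0.03, 0.28)
    samples.append(peak_driftwall_temp(phi))

samples.sort()
print(f"median peak T ~ {samples[len(samples) // 2]:.1f} degC")
print(f"95th percentile ~ {samples[int(0.95 * len(samples))]:.1f} degC")
```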
NASA Astrophysics Data System (ADS)
Sawada, Masataka; Nishimoto, Soshi; Okada, Tetsuji
2017-01-01
In high-level radioactive waste disposal repositories, there are long-term complex thermal, hydraulic, and mechanical (T-H-M) phenomena that involve the generation of heat from the waste, the infiltration of ground water, and swelling of the bentonite buffer. The ability to model such coupled phenomena is of particular importance to the repository design and assessments of its safety. We have developed a T-H-M-coupled analysis program that evaluates the long-term behavior around the repository (called "near-field"). We have also conducted centrifugal model tests that model the long-term T-H-M-coupled behavior in the near-field. In this study, we conduct H-M-coupled numerical simulations of the centrifugal near-field model tests. We compare numerical results with each other and with results obtained from the centrifugal model tests. From the comparison, we deduce that: (1) in the numerical simulation, water infiltration in the rock mass was in agreement with the experimental observation. (2) The constant-stress boundary condition in the centrifugal model tests may cause a larger expansion of the rock mass than in the in situ condition, but the mechanical boundary condition did not affect the buffer behavior in the deposition hole. (3) The numerical simulation broadly reproduced the measured bentonite pressure and the overpack displacement, but did not reproduce the decreasing trend of the bentonite pressure after 100 equivalent years. This indicates the effect of the time-dependent characteristics of the surrounding rock mass. Further investigations are needed to determine the effect of initial heterogeneity in the deposition hole and the time-dependent behavior of the surrounding rock mass.
Coupled Heat and Moisture Transport Simulation on the Re-saturation of Engineered Clay Barrier
NASA Astrophysics Data System (ADS)
Huang, W. H.; Chuang, Y. F.
2014-12-01
An engineered clay barrier plays a major role in the isolation of radioactive wastes in an underground repository. This paper investigates the resaturation processes of the clay barrier, with emphasis on the coupling effects of heat and moisture during the intrusion of groundwater into the repository. A reference bentonite and a locally available clay were adopted in the laboratory program. Soil suction of clay specimens was measured by psychrometers embedded in the specimens and by the vapor equilibrium technique conducted at varying temperatures, so as to determine the soil water characteristic curves of the two clays at different temperatures. Water uptake tests were also conducted on clay specimens compacted at various densities to simulate the intrusion of groundwater into the clay barrier. Using the soil water characteristic curve, an integration scheme was introduced to estimate the hydraulic conductivity of unsaturated clay. It was found that soil suction decreases as temperature increases, resulting in a reduction in water retention capability. The finite element method was then employed to carry out a numerical simulation of the saturation process in the near field of a repository. Results of the numerical simulation were validated using the degree-of-saturation profiles obtained from the water uptake tests on the clays. The numerical scheme was then extended to establish a model simulating the resaturation process after the closure of a repository. Finally, the model was used to evaluate the effect of clay barrier thickness on the time required for groundwater to penetrate the clay barrier and approach saturation. Because clay suction and thermal conductivity vary with the temperature of the barrier material, the calculated temperature field is reduced when these hydro-properties are incorporated in the calculations.
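The soil water characteristic curve central to such simulations is often parameterized with the van Genuchten model. A minimal sketch follows, with parameter values that are illustrative assumptions rather than the fitted values for the bentonite and local clay studied here.

```python
# van Genuchten soil water characteristic curve (SWCC) sketch.
# Parameter values are illustrative assumptions, not the paper's fits.
def vg_water_content(suction_kpa, alpha=0.01, n=1.6, theta_r=0.05, theta_s=0.45):
    """Volumetric water content as a function of suction (kPa).
    alpha [1/kPa] and n [-] control the curve shape; m = 1 - 1/n."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * suction_kpa) ** n) ** (-m)  # effective saturation
    return theta_r + (theta_s - theta_r) * se

for psi in (1.0, 10.0, 100.0, 1000.0, 10000.0):
    print(f"suction {psi:8.0f} kPa -> theta = {vg_water_content(psi):.3f}")
```

Temperature dependence of the curve, as measured in the paper, would enter through temperature-dependent alpha and n values.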
NASA Astrophysics Data System (ADS)
Arko, Robert; Chandler, Cynthia; Stocks, Karen; Smith, Shawn; Clark, Paul; Shepherd, Adam; Moore, Carla; Beaulieu, Stace
2013-04-01
The Rolling Deck to Repository (R2R) program is developing infrastructure to ensure the underway sensor data from U.S. academic oceanographic research vessels are routinely and consistently documented, preserved in long-term archives, and disseminated to the science community. The entire R2R Catalog is published online as a Linked Data collection, making it easily accessible to encourage discovery and integration with data at other repositories. We are developing the R2R Linked Data collection with specific goals in mind: 1.) We facilitate data access and reuse by publishing the richest possible collection of resources to describe vessels, cruises, instruments, and datasets from the U.S. academic fleet, including data quality assessment results and clean trackline navigation; 2.) We facilitate data citation through the entire lifecycle from field acquisition to shoreside archiving to journal articles and global syntheses, by publishing Digital Object Identifiers (DOIs) for datasets and encoding them directly into our Linked Data resources; and 3.) We facilitate federation with other repositories such as the Biological and Chemical Oceanography Data Management Office (BCO-DMO), InterRidge Vents Database, and Index to Marine and Lacustrine Geological Samples (IMLGS), by reciprocal linking between RDF resources and supporting the RDF Query Language. R2R participates in the Ocean Data Interoperability Platform (ODIP), a joint European-U.S.-Australian partnership to facilitate the sharing of data and documentation across international borders. We publish our controlled vocabularies as a Simple Knowledge Organization System (SKOS) concept collection, and are working toward alignment with SeaDataNet and other community-standard terms using the NERC Vocabulary Server (NVS). http://rvdata.us/
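As a flavor of how such a Linked Data catalog can be consumed, here is a minimal Python/rdflib sketch that loads an RDF dump and runs a SPARQL query; the dump URL and vocabulary terms are hypothetical stand-ins, not the vocabulary R2R actually publishes.

```python
# Sketch: query a Linked Data cruise catalog with SPARQL via rdflib.
# The graph URL and vocabulary terms are hypothetical stand-ins for
# whatever the R2R catalog actually publishes.
from rdflib import Graph

g = Graph()
g.parse("http://example.org/r2r-catalog.ttl", format="turtle")  # hypothetical dump

query = """
PREFIX dcterms: <http://purl.org/dc/terms/>
SELECT ?cruise ?doi WHERE {
    ?cruise a <http://example.org/vocab#Cruise> ;
            dcterms:identifier ?doi .
}
"""
for cruise, doi in g.query(query):
    print(cruise, doi)
```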
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andersson, J.
1993-12-31
The Swedish Nuclear Power Inspectorate's (SKI) regulatory research program has to prepare for the process of licensing a repository for spent nuclear fuel by building up the necessary knowledge and review capacity. SKI's main strategy for meeting this demand is to develop an independent performance assessment capability. SKI's first in-house performance assessment project, Project-90, was completed in 1991 and is now followed by a new project, SITE-94. SITE-94 is based on conclusions reached within Project-90. An independent review of Project-90, carried out by a NEA team of experts, has also contributed to the formation of the project. Another important reason for the project is that the implementing organization in Sweden, SKB, has proposed to submit an application to start detailed investigation of a repository candidate site around 1997. SITE-94 is a performance assessment of a hypothetical repository at a real site. The main objective of the project is to determine how site-specific data should be assimilated into the performance assessment process, and to evaluate how uncertainties inherent in site characterization will influence performance assessment results. This will be addressed by exploring multiple interpretations, conceptual models, and parameters consistent with the site data. The site evaluation will strive for consistency between geological, hydrological, rock mechanical, and geochemical descriptions. Other important elements of SITE-94 are the development of a practical and defensible methodology for defining, constructing, and analyzing scenarios; the development of approaches for the treatment of uncertainties; evaluation of canister integrity; and the development and application of an appropriate quality assurance plan for performance assessments.
EOS-AM precision pointing verification
NASA Technical Reports Server (NTRS)
Throckmorton, A.; Braknis, E.; Bolek, J.
1993-01-01
The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low-frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests, and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, and supported by a comprehensive database repository for validated program values.
Caliver: An R package for CALIbration and VERification of forest fire gridded model outputs.
Vitolo, Claudia; Di Giuseppe, Francesca; D'Andrea, Mirko
2018-01-01
The name caliver stands for CALIbration and VERification of forest fire gridded model outputs. This is a package developed for the R programming language and available under an APACHE-2 license from a public repository. In this paper we describe the functionalities of the package and give examples using publicly available datasets. Fire danger model outputs are taken from the modeling components of the European Forest Fire Information System (EFFIS) and observed burned areas from the Global Fire Emission Database (GFED). Complete documentation, including a vignette, is also available within the package.
Scanning and georeferencing historical USGS quadrangles
Davis, Larry R.; Allord, G.J.
2011-01-01
The USGS Historical Quadrangle Scanning Project (HQSP) is scanning all scales and all editions of approximately 250,000 topographic maps published by the U.S. Geological Survey (USGS) since the inception of the topographic mapping program in 1884. This scanning will provide a comprehensive digital repository of USGS topographic maps, available to the public at no cost. This project serves the dual purpose of creating a master catalog and digital archive copies of the irreplaceable collection of topographic maps in the USGS Reston Map Library as well as making the maps available for viewing and downloading from the USGS Store and The National Map Viewer.
Louisiana: a model for advancing regional e-Research through cyberinfrastructure
Katz, Daniel S.; Allen, Gabrielle; Cortez, Ricardo; Cruz-Neira, Carolina; Gottumukkala, Raju; Greenwood, Zeno D.; Guice, Les; Jha, Shantenu; Kolluru, Ramesh; Kosar, Tevfik; Leger, Lonnie; Liu, Honggao; McMahon, Charlie; Nabrzyski, Jarek; Rodriguez-Milla, Bety; Seidel, Ed; Speyrer, Greg; Stubblefield, Michael; Voss, Brian; Whittenburg, Scott
2009-01-01
Louisiana researchers and universities are leading a concentrated, collaborative effort to advance statewide e-Research through a new cyberinfrastructure: computing systems, data storage systems, advanced instruments and data repositories, visualization environments and people, all linked together by software programs and high-performance networks. This effort has led to a set of interlinked projects that have started making a significant difference in the state, and has created an environment that encourages increased collaboration, leading to new e-Research. This paper describes the overall effort, the new projects and environment and the results to date. PMID:19451102
Suran, Jiri; Kovar, Petr; Smoldasova, Jana; Solc, Jaroslav; Van Ammel, Raf; Garcia Miranda, Maria; Russell, Ben; Arnold, Dirk; Zapata-García, Daniel; Boden, Sven; Rogiers, Bart; Sand, Johan; Peräjärvi, Kari; Holm, Philip; Hay, Bruno; Failleau, Guillaume; Plumeri, Stephane; Laurent Beck, Yves; Grisa, Tomas
2018-04-01
Decommissioning of nuclear facilities incurs high costs for the accurate characterisation and correct disposal of the decommissioned materials. Therefore, there is a need for the implementation of new and traceable measurement technologies to select the appropriate release or disposal route of radioactive wastes. This paper addresses some of the innovative outcomes of the project "Metrology for Decommissioning Nuclear Facilities" related to mapping of contamination inside nuclear facilities, waste clearance measurement, Raman distributed temperature sensing for long-term repository integrity monitoring, and validation of radiochemical procedures. Copyright © 2017 Elsevier Ltd. All rights reserved.
Neurogenetics in Child Neurology: Redefining a Discipline in the Twenty-first Century.
Kaufmann, Walter E
2016-12-01
Increasing knowledge of the genetic etiology of pediatric neurologic disorders is affecting the practice of the specialty. Here I review the history of pediatric neurologic disorder classification and the role of genetics in the process. I also discuss the concept of clinical neurogenetics, with its role in clinical practice, education, and research. Finally, I propose a flexible model for clinical neurogenetics in child neurology in the twenty-first century. In combination with disorder-specific clinical programs, clinical neurogenetics can become a home for complex clinical issues, a repository of genetic diagnostic advances, an educational resource, and a research engine in child neurology.
Caliver: An R package for CALIbration and VERification of forest fire gridded model outputs
Di Giuseppe, Francesca; D’Andrea, Mirko
2018-01-01
The name caliver stands for CALIbration and VERification of forest fire gridded model outputs. This is a package developed for the R programming language and available under an APACHE-2 license from a public repository. In this paper we describe the functionalities of the package and give examples using publicly available datasets. Fire danger model outputs are taken from the modeling components of the European Forest Fire Information System (EFFIS) and observed burned areas from the Global Fire Emission Database (GFED). Complete documentation, including a vignette, is also available within the package. PMID:29293536
Iribarren, Sarah J; Brown, William; Giguere, Rebecca; Stone, Patricia; Schnall, Rebecca; Staggers, Nancy; Carballo-Diéguez, Alex
2017-05-01
Mobile technology supporting text messaging interventions (TMIs) continues to evolve, presenting challenges for researchers and healthcare professionals who need to choose software solutions to best meet their program needs. The objective of this review was to systematically identify and compare text messaging platforms and to summarize their advantages and disadvantages as described in peer-reviewed literature. A scoping review was conducted using four steps: 1) identify currently available platforms through online searches and in mHealth repositories; 2) expand evaluation criteria of an mHealth mobile messaging toolkit and integrate prior user experiences as researchers; 3) evaluate each platform's functions and features based on the expanded criteria and a vendor survey; and 4) assess the documentation of platform use in the peer-review literature. Platforms meeting inclusion criteria were assessed independently by three reviewers and discussed until consensus was reached. The PRISMA guidelines were followed to report findings. Of the 1041 potentially relevant search results, 27 platforms met inclusion criteria. Most were excluded because they were not platforms (e.g., guides, toolkits, reports, or SMS gateways). Of the 27 platforms, only 12 were identified in existing mHealth repositories, 10 from Google searches, while five were found in both. The expanded evaluation criteria included 22 items. Results indicate no uniform presentation of platform features and functions, often making these difficult to discern. Fourteen of the platforms were reported as open source, 10 focused on health care and 16 were tailored to meet needs of low resource settings (not mutually exclusive). Fifteen platforms had do-it-yourself setup (programming not required) while the remainder required coding/programming skills or setups could be built to specification by the vendor. Frequently described features included data security and access to the platform via cloud-based systems. Pay structures and reported targeted end-users varied. Peer-reviewed publications listed only 6 of the 27 platforms across 21 publications. The majority of these articles reported the name of the platform used but did not describe advantages or disadvantages. Searching for and comparing mHealth platforms for TMIs remains a challenge. The results of this review can serve as a resource for researchers and healthcare professionals wanting to integrate TMIs into health interventions. Steps to identify, compare and assess advantages and disadvantages are outlined for consideration. Expanded evaluation criteria can be used by future researchers. Continued and more comprehensive platform tools should be integrated into mHealth repositories. Detailed descriptions of platform advantages and disadvantages are needed when mHealth researchers publish findings to expand the body of research on TMI tools for healthcare. Standardized descriptions and features are recommended for vendor sites. Copyright © 2017 Elsevier B.V. All rights reserved.
Iribarren, Sarah; Brown, William; Giguere, Rebecca; Stone, Patricia; Schnall, Rebecca; Staggers, Nancy; Carballo-Diéguez, Alex
2017-01-01
Objectives Mobile technology supporting text messaging interventions (TMIs) continues to evolve, presenting challenges for researchers and healthcare professionals who need to choose software solutions to best meet their program needs. The objective of this review was to systematically identify and compare text messaging platforms and to summarize their advantages and disadvantages as described in peer-reviewed literature. Methods A scoping review was conducted using four steps: 1) identify currently available platforms through online searches and in mHealth repositories; 2) expand evaluation criteria of an mHealth mobile messaging toolkit and prior user experiences as researchers; 3) evaluate each platform’s functions and features based on the expanded criteria and a vendor survey; and 4) assess the documentation of platform use in the peer-review literature. Platforms meeting inclusion criteria were assessed independently by three reviewers and discussed until consensus was reached. The PRISMA guidelines were followed to report findings. Results Of the 1041 potentially relevant search results, 27 platforms met inclusion criteria. Most were excluded because they were not platforms (e.g., guides, toolkits, reports, or SMS gateways). Of the 27 platforms, only 12 were identified in existing mHealth repositories, 10 from Google searches, while five were found in both. The expanded evaluation criteria included 22 items. Results indicate no uniform presentation of platform features and functions, often making these difficult to discern. Fourteen of the platforms were reported as open source, 10 focused on health care and 16 were tailored to meet needs of low resource settings (not mutually exclusive). Fifteen platforms had do-it-yourself setup (programming not required) while the remainder required coding/programming skills or setups could be built to specification by the vendor. Frequently described features included data security and access to the platform via cloud-based systems. Pay structures and reported targeted end-users varied. Peer-reviewed publications listed only 6 of the 27 platforms across 21 publications. The majority of these articles reported the name of the platform used but did not describe advantages or disadvantages. Conclusions Searching for and comparing mHealth platforms for TMIs remains a challenge. The results of this review can serve as a resource for researchers and healthcare professionals wanting to integrate TMIs into health interventions. Steps to identify, compare and assess advantages and disadvantages are outlined for consideration. Expanded evaluation criteria can be used by future researchers. Continued and more comprehensive platform tools should be integrated into mHealth repositories. Detailed descriptions of platform advantages and disadvantages are needed when mHealth researchers publish findings to expand the body of research on texting-based tools for healthcare. Standardized descriptions and features are recommended for vendor sites. PMID:28347445
QCDNUM: Fast QCD evolution and convolution
NASA Astrophysics Data System (ADS)
Botje, M.
2011-02-01
The QCDNUM program numerically solves the evolution equations for parton densities and fragmentation functions in perturbative QCD. Unpolarised parton densities can be evolved up to next-to-next-to-leading order in powers of the strong coupling constant, while polarised densities or fragmentation functions can be evolved up to next-to-leading order. Other types of evolution can be accessed by feeding alternative sets of evolution kernels into the program. A versatile convolution engine provides tools to compute parton luminosities, cross-sections in hadron-hadron scattering, and deep inelastic structure functions in the zero-mass scheme or in generalised mass schemes. Input to these calculations are either the QCDNUM evolved densities, or those read in from an external parton density repository. Included in the software distribution are packages to calculate zero-mass structure functions in unpolarised deep inelastic scattering, and heavy flavour contributions to these structure functions in the fixed flavour number scheme.
Program summary
Program title: QCDNUM, version 17.00
Catalogue identifier: AEHV_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHV_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU Public Licence
No. of lines in distributed program, including test data, etc.: 45 736
No. of bytes in distributed program, including test data, etc.: 911 569
Distribution format: tar.gz
Programming language: Fortran-77
Computer: All
Operating system: All
RAM: typically 3 Mbytes
Classification: 11.5
Nature of problem: Evolution of the strong coupling constant and parton densities, up to next-to-next-to-leading order in perturbative QCD. Computation of observable quantities by Mellin convolution of the evolved densities with partonic cross-sections.
Solution method: Parametrisation of the parton densities as linear or quadratic splines on a discrete grid, and evolution of the spline coefficients by solving (coupled) triangular matrix equations with a forward substitution algorithm. Fast computation of convolution integrals as weighted sums of spline coefficients, with weights derived from user-given convolution kernels.
Restrictions: Accuracy and speed are determined by the density of the evolution grid.
Running time: Less than 10 ms on a 2 GHz Intel Core 2 Duo processor to evolve the gluon density and 12 quark densities at next-to-next-to-leading order over a large kinematic range.
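The forward substitution algorithm named in the solution method is a standard linear-algebra kernel; the generic sketch below (not QCDNUM code) solves a lower-triangular system L x = b row by row, the way the evolution of spline coefficients proceeds.

```python
# Forward substitution for a lower-triangular system L x = b.
# A generic sketch of the kernel named in the solution method,
# not QCDNUM code.
def forward_substitution(L, b):
    n = len(b)
    x = [0.0] * n
    for i in range(n):
        # Each x[i] depends only on the already-computed x[0..i-1].
        s = sum(L[i][j] * x[j] for j in range(i))
        x[i] = (b[i] - s) / L[i][i]
    return x

L = [[2.0, 0.0, 0.0],
     [1.0, 3.0, 0.0],
     [4.0, 1.0, 5.0]]
b = [2.0, 5.0, 14.0]
print(forward_substitution(L, b))  # [1.0, 1.333..., 1.733...]
```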
Feasibility study for a transportation operations system cask maintenance facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rennich, M.J.; Medley, L.G.; Attaway, C.R.
The US Department of Energy (DOE), Office of Civilian Radioactive Waste Management (OCRWM) is responsible for the development of a waste management program for the disposition of spent nuclear fuel (SNF) and high-level waste (HLW). The program will include a transportation system for moving the nuclear waste from the sources to a geologic repository for permanent disposal. Specially designed casks will be used to safely transport the waste. The cask systems must be operated within limits imposed by DOE, the Nuclear Regulatory Commission (NRC), and the Department of Transportation (DOT). A dedicated facility for inspecting, testing, and maintaining the cask systems was recommended by the General Accounting Office (in 1979) as the best means of assuring their operational effectiveness and safety, as well as regulatory compliance. In November of 1987, OCRWM requested a feasibility study be made of a Cask Maintenance Facility (CMF) that would perform the required functions. 46 refs., 16 figs., 13 tabs.
The Swedish nuclear waste program and the long-term corrosion behaviour of copper
NASA Astrophysics Data System (ADS)
Rosborg, B.; Werme, L.
2008-09-01
The principal strategy for high-level radioactive waste disposal in Sweden is to enclose the spent fuel in tightly sealed copper canisters that are embedded in bentonite clay about 500 m down in the Swedish bedrock. Besides rock movements, the biggest threat to the canister in the repository is corrosion. 'Nature' has proven that copper can last many millions of years under proper conditions, bentonite clay has existed for many millions of years, and the Fennoscandian bedrock shield is stable. The groundwater composition may not stay the same over very long periods, considering glaciations, but this will not have dramatic consequences for canister performance. While nature has shown the way, research refines and verifies. The most important task from a corrosion perspective is to ascertain a proper near-field environment. The background and status of the Swedish nuclear waste program are presented, together with information about the long-term corrosion behaviour of copper, with a focus on the oxic period.
Durand, C.T.; Edwards, L.E.; Malinconico, M.L.; Powars, D.S.
2009-01-01
During 2005-2006, the International Continental Scientific Drilling Program and the U.S. Geological Survey drilled three continuous core holes into the Chesapeake Bay impact structure to a total depth of 1766.3 m. A collection of supplemental materials that presents a record of the core recovery and measurement data for the Eyreville cores is available on CD-ROM at the end of this volume and in the GSA Data Repository. The supplemental materials on the CD-ROM include digital photographs of each core box from the three core holes, tables of the three coring-run logs, as recorded on site, and a set of depth-conversion programs. In this chapter, the contents, purposes, and basic applications of the supplemental materials are briefly described. With this information, users can quickly decide if the materials will apply to their specific research needs. © 2009 The Geological Society of America.
Roadmap to a Comprehensive Clinical Data Warehouse for Precision Medicine Applications in Oncology
Foran, David J; Chen, Wenjin; Chu, Huiqi; Sadimin, Evita; Loh, Doreen; Riedlinger, Gregory; Goodell, Lauri A; Ganesan, Shridar; Hirshfield, Kim; Rodriguez, Lorna; DiPaola, Robert S
2017-01-01
Leading institutions throughout the country have established Precision Medicine programs to support personalized treatment of patients. A cornerstone for these programs is the establishment of enterprise-wide Clinical Data Warehouses. Working shoulder-to-shoulder, a team of physicians, systems biologists, engineers, and scientists at Rutgers Cancer Institute of New Jersey have designed, developed, and implemented the Warehouse with information originating from data sources, including Electronic Medical Records, Clinical Trial Management Systems, Tumor Registries, Biospecimen Repositories, Radiology and Pathology archives, and Next Generation Sequencing services. Innovative solutions were implemented to detect and extract unstructured clinical information that was embedded in paper/text documents, including synoptic pathology reports. Supporting important precision medicine use cases, the growing Warehouse enables physicians to systematically mine and review the molecular, genomic, image-based, and correlated clinical information of patient tumors individually or as part of large cohorts to identify changes and patterns that may influence treatment decisions and potential outcomes. PMID:28469389
Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard
2018-06-01
Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks .
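To give a flavor of the literate-programming workflows the environment supports, here is a minimal SimpleITK snippet of the kind such a notebook might contain; the input and output file names are placeholders, not files from the repository.

```python
# Minimal SimpleITK workflow of the sort used in the notebooks:
# read an image, smooth it, and inspect it as a NumPy array.
# "input.nii.gz" and "smoothed.nii.gz" are placeholder paths.
import SimpleITK as sitk

image = sitk.ReadImage("input.nii.gz")
smoothed = sitk.SmoothingRecursiveGaussian(image, sigma=2.0)

array = sitk.GetArrayFromImage(smoothed)  # (z, y, x) ordering
print("size:", image.GetSize(), "spacing:", image.GetSpacing())
print("mean intensity after smoothing:", float(array.mean()))

sitk.WriteImage(smoothed, "smoothed.nii.gz")
```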
Development of a North American paleoclimate pollen-based reconstruction database application
NASA Astrophysics Data System (ADS)
Ladd, Matthew; Mosher, Steven; Viau, Andre
2013-04-01
Recent efforts in synthesizing paleoclimate records across the globe have warranted an effort to standardize the different paleoclimate archives currently available, in order to facilitate data-model comparisons and hence improve our estimates of future climate change. The methodology and programs often make it challenging for other researchers to reproduce the results of a reconstruction; there is therefore a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology using the open-source R language with North American pollen databases (e.g., NAPD, NEOTOMA), where this application can easily be used to perform new reconstructions and quickly analyze and output/plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect the reconstruction results. It allows users to spend more time analyzing and interpreting results instead of on data management and processing. Some of the unique features of this R program are its two modules, each with a menu making the user feel at ease with the program; the ability to use different pollen sums; selection from 70 available climate variables; substitution of an appropriate modern climate dataset; a user-friendly regional target domain; temporal resolution criteria; linear interpolation; and many other features for a thorough exploratory data analysis. The application will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.
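The temporal-resolution handling mentioned above can be illustrated generically: the sketch below interpolates an unevenly sampled reconstruction onto an even time grid with NumPy, using invented ages and values rather than NAPD/NEOTOMA records.

```python
# Toy example of interpolating an unevenly sampled reconstruction
# onto an even time grid before analysis. Ages and values are
# invented, not NAPD/NEOTOMA data.
import numpy as np

ages_ka = np.array([0.2, 0.9, 2.1, 3.8, 5.0])      # sample ages (ka BP)
temp_c = np.array([15.1, 14.7, 14.9, 13.8, 13.2])  # reconstructed temps

even_grid = np.arange(0.0, 5.01, 0.5)              # 500-year resolution
temp_interp = np.interp(even_grid, ages_ka, temp_c)

for t, v in zip(even_grid, temp_interp):
    print(f"{t:4.1f} ka BP: {v:5.2f} degC")
```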
Kleinman, Steven; King, Melissa R; Busch, Michael P; Murphy, Edward L; Glynn, Simone A.
2012-01-01
The Retrovirus Epidemiology Donor Study (REDS), conducted from 1989 to 2001, and the Retrovirus Epidemiology Donor Study-II (REDS-II), conducted from 2004 to 2012, were National Heart, Lung, and Blood Institute (NHLBI)-funded multicenter programs focused on improving blood safety and availability in the United States. REDS-II also included international study sites in Brazil and China. The three major research domains of REDS/REDS-II have been infectious disease risk evaluation, blood donation availability, and blood donor characterization. Both programs have made significant contributions to transfusion medicine research methodology through the use of mathematical modeling, large-scale donor surveys, innovative methods of repository sample storage, and the establishment of an infrastructure that responded to potential emerging blood safety threats such as XMRV. Blood safety studies have included protocols evaluating epidemiologic and/or laboratory aspects of HIV, HTLV-I/II, HCV, HBV, WNV, CMV, HHV-8, B19V, malaria, CJD, influenza, and T. cruzi infections. Other analyses have characterized blood donor demographics, motivations to donate, factors influencing donor return, behavioral risk factors, donors' perception of the blood donation screening process, and aspects of donor deferral. In REDS-II, two large-scale blood donor protocols examined iron deficiency in donors and the prevalence of leukocyte antibodies. This review describes the major study results from over 150 peer-reviewed articles published by these two REDS programs. In 2011, a new seven-year program, the Recipient Epidemiology and Donor Evaluation Study-III (REDS-III), was launched. REDS-III expands beyond donor-based research to include studies of blood transfusion recipients in the hospital setting, and adds a third country, South Africa, to the international program. PMID:22633182
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molecke, M.A.; Sorensen, N.R.; Wicks, G.G.
The three papers in this report were presented at the second international workshop to feature the Waste Isolation Pilot Plant (WIPP) Materials Interface Interactions Test (MIIT). This Workshop on In Situ Tests on Radioactive Waste Forms and Engineered Barriers was held in Corsendonk, Belgium, on October 13-16, 1992, and was sponsored by the Commission of the European Communities (CEC). The Studiecentrum voor Kernenergie/Centre d'Energie Nucleaire (SCK/CEN, Belgium) and the US Department of Energy (via Savannah River) also cosponsored this workshop. Workshop participants from Belgium, France, Germany, Sweden, and the United States gathered to discuss the status, results, and overviews of the MIIT program. Nine of the twenty-five total workshop papers were presented on the status and results from the WIPP MIIT program after the five-year in situ conclusion of the program. The total number of published MIIT papers is now almost forty. Posttest laboratory analyses are still in progress at multiple participating laboratories. The first MIIT paper in this document, by Wicks and Molecke, provides an overview of the entire test program and focuses on the waste form samples. The second paper, by Molecke and Wicks, concentrates on technical details and repository-relevant observations on the in situ conduct, sampling, and termination operations of the MIIT. The third paper, by Sorensen and Molecke, presents and summarizes the available laboratory posttest corrosion data and results for all of the candidate waste container or overpack metal specimens included in the MIIT program.
The Pisgah Astronomical Research Institute
NASA Astrophysics Data System (ADS)
Cline, J. Donald; Castelaz, M.
2009-01-01
Pisgah Astronomical Research Institute is a not-for-profit foundation located at a former NASA tracking station in the Pisgah National Forest in western North Carolina. PARI is celebrating its 10th year. During its ten years, PARI has developed and implemented innovative science education programs. The science education programs are hands-on and experimentally based, mixing disciplines in astronomy, computer science, earth and atmospheric science, engineering, and multimedia. The basic tools for the educational programs include a 4.6-m radio telescope accessible via the Internet, a StarLab planetarium, the Astronomical Photographic Data Archive (APDA), a distributed computing online environment to classify stars called SCOPE, and remotely accessible optical telescopes. The 200-acre PARI campus has a 4.6-m, a 12-m, and two 26-m radio telescopes, optical solar telescopes, a Polaris monitoring telescope, 0.4-m and 0.35-m optical research telescopes, and earth and atmospheric science instruments. PARI is also the home of APDA, a repository for astronomical photographic plate collections, which will eventually be digitized and made available online. PARI has collaborated with visiting scientists who have developed their research with PARI telescopes and lab facilities. Current experiments include: the Dedicated Interferometer for Rapid Variability (Dennison et al. 2007, Astronomical and Astrophysical Transactions, 26, 557); the Plate Boundary Observatory operated by UNAVCO; the Clemson University Fabry-Perot interferometers (Meriwether 2008, Journal of Geophysical Research, submitted) measuring high-velocity winds and temperatures in the thermosphere; and the Western Carolina University - PARI variable star program. The current status of the education and research programs and instruments will be presented, and development plans will be reviewed. Development plans include the greening of PARI with the installation of solar panels to power the optical telescopes, a new distance learning center, and enhancements to the atmospheric and earth science suite of instrumentation.
The NIH BD2K center for big data in translational genomics.
Paten, Benedict; Diekhans, Mark; Druker, Brian J; Friend, Stephen; Guinney, Justin; Gassner, Nadine; Guttman, Mitchell; Kent, W James; Mantey, Patrick; Margolin, Adam A; Massie, Matt; Novak, Adam M; Nothaft, Frank; Pachter, Lior; Patterson, David; Smuga-Otto, Maciej; Stuart, Joshua M; Van't Veer, Laura; Wold, Barbara; Haussler, David
2015-11-01
The world's genomics data will never be stored in a single repository - rather, it will be distributed among many sites in many countries. No one site will have enough data to explain genotype-to-phenotype relationships in rare diseases; therefore, sites must share data. To accomplish this, the genetics community must forge common standards and protocols to make sharing and computing data among many sites a seamless activity. Through the Global Alliance for Genomics and Health, we are pioneering the development of shared application programming interfaces (APIs) to connect the world's genome repositories. In parallel, we are developing an open source software stack (ADAM) that uses these APIs. This combination will create a cohesive genome informatics ecosystem. Using containers, we are facilitating the deployment of this software in a diverse array of environments. Through benchmarking efforts and big data driver projects, we are ensuring ADAM's performance and utility. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
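From a client's perspective, a shared genomics API is just an HTTP endpoint. The sketch below shows a generic variant-search call in Python; the endpoint URL and the request/response field names are invented placeholders, not the actual GA4GH schemas.

```python
# Sketch of a client call against a GA4GH-style variant-search API.
# The URL and payload/field names are invented placeholders; consult
# the actual GA4GH schemas for the real interface.
import json
import urllib.request

payload = {
    "referenceName": "1",
    "start": 10000,
    "end": 10500,
    "pageSize": 10,
}
req = urllib.request.Request(
    "https://genomics.example.org/variants/search",  # hypothetical endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    for variant in json.load(resp).get("variants", []):
        print(variant.get("id"), variant.get("start"))
```

Because every repository exposes the same interface, the same client code can query sites in different countries, which is the point of the shared-API approach.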
Big Data access and infrastructure for modern biology: case studies in data repository utility.
Boles, Nathan C; Stone, Tyler; Bergeron, Charles; Kiehl, Thomas R
2017-01-01
Big Data is no longer solely the purview of big organizations with big resources. Today's routine tools and experimental methods can generate large slices of data. For example, high-throughput sequencing can quickly interrogate biological systems for the expression levels of thousands of different RNAs, examine epigenetic marks throughout the genome, and detect differences in the genomes of individuals. Multichannel electrophysiology platforms produce gigabytes of data in just a few minutes of recording. Imaging systems generate videos capturing biological behaviors over the course of days. Thus, any researcher now has access to a veritable wealth of data. However, the ability of any given researcher to utilize that data is limited by her/his own resources and skills for downloading, storing, and analyzing the data. In this paper, we examine the necessary resources required to engage Big Data, survey the state of modern data analysis pipelines, present a few data repository case studies, and touch on current institutions and programs supporting the work that relies on Big Data. © 2016 New York Academy of Sciences.
Actinide Solubility and Speciation in the WIPP [PowerPoint]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, Donald T.
2015-11-02
The presentation begins with the role of and need for nuclear repositories (overall concept; international updates from Sweden, Finland, France, and China; US approach and current status), then moves on to the WIPP TRU repository concept (design; current status, including the safety incidents of February 5 and 14, 2014; path forward), and finally considers the WIPP safety case: dissolved actinide concentrations (overall approach, oxidation state distribution and redox control, solubility of actinides, colloidal contribution, and microbial effects). The following conclusions are set forth: (1) International programs are moving forward, but at a very slow and somewhat sporadic pace. (2) In the United States, the salt repository concept, from the perspective of the long-term safety case, remains a viable option for nuclear waste management despite the current operational issues/concerns. (3) Current model/PA predictions (WIPP example) are built on redundant conservatisms. These conservatisms are being addressed in ongoing and future research to fill existing data gaps: redox control of plutonium by Fe(0, II), thorium (analog) solubility studies in simulated brine, the contribution of intrinsic and biocolloids to the mobile concentration, and clarification of microbial ecology and effects.
UltraPse: A Universal and Extensible Software Platform for Representing Biological Sequences.
Du, Pu-Feng; Zhao, Wei; Miao, Yang-Yang; Wei, Le-Yi; Wang, Likun
2017-11-14
With the avalanche of biological sequences in public databases, one of the most challenging problems in computational biology is to predict their biological functions and cellular attributes. Most existing prediction algorithms can only handle fixed-length numerical vectors. Therefore, it is important to be able to represent biological sequences of various lengths as fixed-length numerical vectors. Although several algorithms, as well as software implementations, have been developed to address this problem, the existing programs can only provide a fixed number of representation modes. Every time a new sequence representation mode is developed, a new program is needed. In this paper, we propose UltraPse as a universal software platform for this problem. The function of UltraPse is not only to generate various existing sequence representation modes, but also to simplify all future programming work in developing novel representation modes. The extensibility of UltraPse is particularly enhanced. It allows users to define their own representation modes, their own physicochemical properties, or even their own types of biological sequences. Moreover, UltraPse is also the fastest software of its kind. The source code package, as well as the executables for both Linux and Windows platforms, can be downloaded from the GitHub repository.
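The core idea of mapping a variable-length sequence to a fixed-length vector can be shown with the simplest such representation mode, plain amino-acid composition; this generic sketch is not UltraPse's implementation or any of its specific modes.

```python
# Simplest fixed-length sequence representation: amino-acid composition.
# A generic illustration of the idea, not UltraPse's implementation.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition_vector(sequence):
    """Map a protein sequence of any length to a fixed 20-dim vector
    of residue frequencies."""
    seq = sequence.upper()
    n = len(seq) or 1
    return [seq.count(aa) / n for aa in AMINO_ACIDS]

vec = composition_vector("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
print(len(vec))          # always 20, regardless of sequence length
print([round(v, 3) for v in vec[:5]])
```

Richer modes (pseudo compositions, physicochemical encodings) extend this same mapping with additional fixed-length components, which is what makes a plug-in platform for user-defined modes attractive.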
CIRMIS Data system. Volume 2. Program listings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedrichs, D.R.
1980-01-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through those systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for utilization by the hydraulic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. The CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the second of four volumes of the description of the CIRMIS Data System.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deguchi, Akira; Tsuchi, Hiroyuki; Kitayama, Kazumi
2007-07-01
Available in abstract form only. Full text of publication follows: A stepwise site selection process has been adopted for geological disposal of HLW in Japan. Literature surveys (LS), followed by preliminary investigations (PI) and, finally, detailed investigations (DI) in underground facilities will be carried out in the successive selection stages. In the PI stage, surface-based investigations such as borehole surveys and geophysical prospecting will be implemented with two main objectives. The first is to obtain information relating to legal requirements on siting, such as the occurrence of igneous or fault activity, and to confirm the extremely low likelihood of adverse impacts on the candidate site resulting from such phenomena. The second is to obtain the information required for the design and performance assessment of the engineered barrier system and the repository. In order to implement these preliminary investigations rigorously and efficiently within the constraints of a limited time period, budget, and resources, PI planning before commencing investigations and on-site PI management during the investigation phase are very important issues. The planning and management of PI have to be performed by NUMO staff, but not all staff have sufficient experience in the range of disciplines involved. NUMO therefore decided to compile existing knowledge and experience in the planning and management of investigations in the form of manuals to be used to improve and maintain internal expertise. Experts with experience in overseas investigation programs were requested to prepare these manuals. This paper outlines the structure and scope of the upper level manual (road-map) and discusses NUMO's experience in applying it in 'dry-runs' to model sites. (authors)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-10-01
This EIS analyzes the significant environmental impacts that could occur if various technologies for management and disposal of high-level and transuranic wastes from commercial nuclear power reactors were to be developed and implemented. This EIS will serve as the environmental input for the decision on which technology, or technologies, will be emphasized in further research and development activities in the commercial waste management program. The action proposed in this EIS is to (1) adopt a national strategy to develop mined geologic repositories for disposal of commercially generated high-level and transuranic radioactive waste (while continuing to examine subseabed and very deep hole disposal as potential backup technologies) and (2) conduct an R and D program to develop such facilities and the necessary technology to ensure the safe long-term containment and isolation of these wastes. The Department has considered in this statement: development of conventionally mined deep geologic repositories for disposal of spent fuel from nuclear power reactors and/or radioactive fuel reprocessing wastes; balanced development of several alternative disposal methods; and no waste disposal action. This EIS reflects the public review of and comments offered on the draft statement. Included are descriptions of the characteristics of nuclear waste, the alternative disposal methods under consideration, and potential environmental impacts and costs of implementing these methods. Because of the programmatic nature of this document and the preliminary nature of certain design elements assumed in assessing the environmental consequences of the various alternatives, this study has been based on generic, rather than specific, systems. At such time as specific facilities are identified for particular sites, statements addressing site-specific aspects will be prepared for public review and comment.
NASA Astrophysics Data System (ADS)
Zeitler, T.; Kirchner, T. B.; Hammond, G. E.; Park, H.
2014-12-01
The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. In a broad modernization effort, the DOE has overseen the transfer of these codes to modern hardware and software platforms. Additionally, there is a current effort to establish new performance assessment capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Improvements to the current computational environment will result in greater detail in the final models due to the parallelization afforded by the modern code. Parallelization will allow for relatively faster calculations, as well as a move from a two-dimensional calculation grid to a three-dimensional grid. The result of the modernization effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy.
Overview of groundwater quality in the Piceance Basin, western Colorado, 1946--2009
Thomas, J.C.; McMahon, P.B.
2013-01-01
Groundwater-quality data from public and private sources for the period 1946 to 2009 were compiled and put into a common data repository for the Piceance Basin. The data repository is available on the web at http://rmgsc.cr.usgs.gov/cwqdr/Piceance/index.shtml. A subset of groundwater-quality data from the repository was compiled, reviewed, and checked for quality assurance for this report. The resulting dataset consists of the most recently collected sample from 1,545 wells, 1,007 (65 percent) of which were domestic wells. From those samples, the following constituents were selected for presentation in this report: dissolved oxygen, dissolved solids, pH, major ions (chloride, sulfate, fluoride), trace elements (arsenic, barium, iron, manganese, selenium), nitrate, benzene, toluene, ethylbenzene, xylene, methane, and the stable isotopic compositions of water and methane. Some portion of recharge to most of the wells for which data were available was derived from precipitation (most likely snowmelt), as indicated by δ2H [H2O] and δ18O[H2O] values that plot along the Global Meteoric Water Line and near the values for snow samples collected in the study area. Ninety-three percent of the samples were oxic, on the basis of concentrations of dissolved oxygen that were greater than or equal to 0.5 milligrams per liter. Concentration data were compared with primary and secondary drinking-water standards established by the U.S. Environmental Protection Agency. Constituents that exceeded the primary standards were arsenic (13 percent), selenium (9.2 percent), fluoride (8.4 percent), barium (4.1 percent), nitrate (1.6 percent), and benzene (0.6 percent). Concentrations of toluene, xylenes, and ethylbenzene did not exceed standards in any samples. Constituents that exceeded the secondary standard were dissolved solids (72 percent), sulfate (37 percent), manganese (21 percent), iron (16 percent), and chloride (10 percent). Drinking-water standards have not been established for methane, which was detected in 24 percent of samples. Methane concentrations were greater than or equal to 1 milligram per liter in 8.5 percent of samples. Methane isotopic data for samples collected primarily from domestic wells in Garfield County indicate that methane in samples with relative high methane concentrations were derived from both biogenic and thermogenic sources. Many of the constituents that exceeded standards, such as arsenic, fluoride, iron, and manganese, were derived from rock and sediment in aquifers. Elevated nitrate concentrations were most likely derived from human sources such as fertilizer and human or animal waste. Information about the geologic unit or aquifer in which a well was completed generally was not provided by data sources. However, limited data indicate that Quaternary deposits in Garfield and Mesa Counties, the Wasatch Formation in Garfield County, and the Green River Formation in Rio Blanco County had some of the highest median concentrations of selected constituents. Variations in concentration with depth could not be evaluated because of the general lack of well-depth and water-level data. Concentrations of several important constituents, such as arsenic, manganese, methane, and nitrate, were related to concentrations of dissolved oxygen. Concentrations of arsenic, manganese, and methane were significantly higher in groundwater with low dissolved-oxygen concentrations than in groundwater with high dissolved-oxygen concentrations. 
In contrast, concentrations of nitrate were significantly higher in groundwater with high dissolved-oxygen concentrations than in groundwater with low dissolved-oxygen concentrations. These results indicate that measurements of dissolved oxygen may be a useful indicator of groundwater vulnerability to some human-derived contaminants and to enrichment by some natural constituents. Assessing such a large and diverse dataset as the one available through the repository poses unique challenges for reporting on groundwater quality in the study area. The repository contains data from several studies that differed widely in purpose and scope. In addition to this variability in available data, gaps exist spatially, temporally, and analytically in the repository. For example, groundwater-quality data in the repository were not evenly distributed throughout the study area. Several key water-quality constituents or indicators, such as dissolved oxygen, were underrepresented in the repository. Ancillary information, such as well depth, depth to water, and the geologic unit or aquifer in which a well was completed, was missing for more than 50 percent of samples. Future monitoring could avoid several limitations of the repository by making relatively minor changes to sample-collection and data-reporting protocols. Field measurements of dissolved oxygen could be added to sampling protocols, for example. Information on well construction and the geologic unit or aquifer in which a well was completed should be part of the water-quality dataset. Such changes would increase the comparability of data from different monitoring programs and add value both to each program individually and to the regional dataset as a whole. Other changes to monitoring programs could require greater resources, such as sampling for a basic set of constituents relevant to the major water-quality issues in the regional study area. Creation of such a dataset for the regional study area would help provide the kinds of information needed to characterize background conditions and the spatial and temporal variability in constituent concentrations associated with those conditions. Without such information, it is difficult to identify departures from background that might be associated with human activities.
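The standards comparison described above reduces to a simple exceedance computation. A minimal sketch, assuming a hypothetical CSV of latest-sample concentrations in mg/L (the column names and file name are invented; the MCL values reflect published EPA primary standards but should be verified):

```python
# Compute, per constituent, the percentage of most-recent well samples
# whose concentration exceeds the EPA maximum contaminant level (MCL).
# Input file and column names are hypothetical.
import pandas as pd

MCL_MG_PER_L = {
    "arsenic": 0.010,
    "barium": 2.0,
    "fluoride": 4.0,
    "nitrate_as_n": 10.0,
    "selenium": 0.05,
}

samples = pd.read_csv("piceance_wells.csv")  # one row per well (latest sample)

for constituent, mcl in MCL_MG_PER_L.items():
    conc = samples[constituent].dropna()      # ignore wells lacking data
    pct = 100.0 * (conc > mcl).mean()         # share of samples above the MCL
    print(f"{constituent}: {pct:.1f}% of {len(conc)} samples exceed {mcl} mg/L")
```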
Ridge 2000 Data Management System
NASA Astrophysics Data System (ADS)
Goodwillie, A. M.; Carbotte, S. M.; Arko, R. A.; Haxby, W. F.; Ryan, W. B.; Chayes, D. N.; Lehnert, K. A.; Shank, T. M.
2005-12-01
Hosted at Lamont by the marine geoscience Data Management group, mgDMS, the NSF-funded Ridge 2000 electronic database, http://www.marine-geo.org/ridge2000/, is a key component of the Ridge 2000 multi-disciplinary program. The database covers each of the three Ridge 2000 Integrated Study Sites: Endeavour Segment, Lau Basin, and 8-11N Segment. It promotes the sharing of information with the broader community, facilitates integration of the suite of information collected at each study site, and enables comparisons between sites. The Ridge 2000 data system provides easy web access to a relational database that is built around a catalogue of cruise metadata. Any web browser can be used to perform a versatile text-based search which returns basic cruise and submersible dive information, sample and data inventories, navigation, and other relevant metadata such as shipboard personnel and links to NSF program awards. In addition, non-proprietary data files, images, and derived products which are hosted locally or in national repositories, as well as science and technical reports, can be freely downloaded. On the Ridge 2000 database page, our Data Link allows users to search the database using a broad range of parameters, including data type, cruise ID, chief scientist, and geographical location. The first Ridge 2000 field programs sailed in 2004 and, in addition to numerous data sets collected prior to the Ridge 2000 program, the database currently contains information on fifteen Ridge 2000-funded cruises and almost sixty Alvin dives. Track lines can be viewed using a recently implemented Web Map Service button labelled Map View. The Ridge 2000 database is fully integrated with databases hosted by the mgDMS group for MARGINS and the Antarctic multibeam and seismic reflection data initiatives. Links are provided to partner databases including PetDB, SIOExplorer, and the ODP Janus system. Interoperability with existing and new partner repositories continues to be strengthened. One major effort involves the gradual unification of metadata across these partner databases. Standardised electronic metadata forms that can be filled in at sea are available from our web site. Interactive map-based exploration and visualisation of the Ridge 2000 database is provided by GeoMapApp, a freely-available Java(tm) application being developed within the mgDMS group. GeoMapApp includes high-resolution bathymetric grids for the 8-11N EPR segment and allows customised maps and grids for any of the Ridge 2000 ISS to be created. Vent and instrument locations can be plotted and saved as images, and Alvin dive photos are also available.
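As an illustration of how a Web Map Service of the kind behind the Map View button can be queried programmatically, here is a hedged sketch using OWSLib; the endpoint URL, layer name, and bounding box are placeholders, not the actual Ridge 2000 service:

```python
# Fetch a track-line map image from a hypothetical WMS endpoint.
from owslib.wms import WebMapService

wms = WebMapService("http://www.marine-geo.org/services/wms", version="1.1.1")

img = wms.getmap(
    layers=["ridge2000_tracklines"],   # hypothetical layer name
    srs="EPSG:4326",
    bbox=(-106.0, 7.0, -103.0, 12.0),  # lon/lat box near the 8-11N EPR segment
    size=(800, 600),
    format="image/png",
)
with open("tracklines.png", "wb") as f:
    f.write(img.read())
```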
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pokhitonov, Y.; Kelley, D.
Large amounts of liquid radioactive waste have existed in the U.S. and Russia since the 1950s as a result of the Cold War. Comprehensive action to treat and dispose of waste products has been lacking due to insufficient funding, ineffective or unproven technologies, and low prioritization by governments, among other factors. Today the U.S. and Russian governments seek new, more reliable methods to treat liquid waste, in particular the legacy waste streams. A primary objective of waste generators and regulators is to find economical and proven technologies that can provide long-term stability for repository storage. In 2001, the V.G. Khlopin Radium Institute (Khlopin), St. Petersburg, Russia, and Pacific Nuclear Solutions (PNS), Indianapolis, Indiana, began extensive research and test programs to determine the validity of polymer technology for the absorption and immobilization of standard and complex waste streams. Over 60 liquid compositions have been tested, including extensive irradiation tests to verify polymer stability and possible degradation. With conclusive scientific evidence of the polymer's effectiveness in treating liquid waste, both parties have decided to enter the Russian market and offer the solidification technology to nuclear sites for waste treatment and disposal. In conjunction with these efforts, the U.S. Department of Energy (DOE) will join Khlopin and PNS to explore opportunities for direct application of the polymers at predetermined sites and to conduct research for new product development. Under DOE's 'Initiatives for Proliferation Prevention' (IPP) program, funding will be provided to the Russian participants over a three-year period to implement the program plan. This paper will present details of U.S. DOE's IPP program, the project structure and its objectives both short- and long-term, training programs for scientists, polymer tests and applications for LLW, ILW and HLW, and new product development initiatives. (authors)
NASA Astrophysics Data System (ADS)
Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.
2017-12-01
In the world of data repositories, there seems to be a never-ending struggle between the generation of high-quality data documentation and the ease of archiving a data product in a repository - the higher the documentation standards, the greater the effort required by the scientist, and the less likely the data will be archived. The Environmental Data Initiative (EDI) attempts to balance the rigor of data documentation against the amount of effort required by a scientist to upload and archive data. As an outgrowth of the LTER Network Information System, the EDI is funded by the US NSF Division of Environmental Biology to support the LTER, LTREB, OBFS, and MSB programs, in addition to providing an open data archive for environmental scientists without a viable archive. EDI uses the PASTA repository software, developed originally by the LTER. PASTA is metadata driven and documents data with the Ecological Metadata Language (EML), a high-fidelity standard that can describe all types of data in great detail. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML, in a process termed "metadata and data congruence", and incongruent data packages are forbidden in the repository. EDI reduces the burden of data documentation on scientists in two ways: first, EDI provides hands-on assistance with data documentation best practices, along with tools for generating EML written in R (with Python versions under development). These tools hide the details of EML generation and syntax by providing a more natural and contextual setting for describing data. Second, EDI works closely with community information managers in defining the rules used in PASTA quality tests. Rules deemed too strict can be turned off completely or set to issue only a warning while the community learns how best to handle the situation and improve its documentation practices. Rules can also be added or refined over time to improve the overall quality of archived data. The outcomes of quality tests are stored as part of the data archive in PASTA and are accessible to all users of the EDI data repository. In summary, EDI's metadata support to scientists and the comprehensive set of data quality tests for metadata and data congruency provide an ideal archive for environmental and ecological data.
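A simplified illustration of the kind of metadata-data congruence test described above, assuming the EML attribute list has already been parsed (the file name and attribute names are invented, and the real PASTA checks are far more extensive):

```python
# Check that the columns of a CSV data file match the attributes
# declared in its (already parsed) EML metadata.
import csv

declared_attributes = ["site_id", "sample_date", "water_temp_c"]  # from EML

def check_congruence(csv_path, declared):
    with open(csv_path, newline="") as f:
        header = next(csv.reader(f))          # first row = column names
    missing = [a for a in declared if a not in header]
    extra = [c for c in header if c not in declared]
    if missing or extra:
        return f"INCONGRUENT: missing={missing}, undeclared={extra}"
    return "PASS: data columns match the EML attribute list"

print(check_congruence("lake_temps.csv", declared_attributes))
```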
Using Controlled Vocabularies and Semantics to Improve Ocean Data Discovery (Invited)
NASA Astrophysics Data System (ADS)
Chandler, C. L.; Groman, R. C.; Shepherd, A.; Allison, M. D.; Kinkade, D.; Rauch, S.; Wiebe, P. H.; Glover, D. M.
2013-12-01
The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in late 2006 by combining the formerly independent data management offices for the U.S. GLOBal Ocean ECosystems Dynamics (GLOBEC) and U.S. Joint Global Ocean Flux Study (JGOFS) programs. BCO-DMO staff members work with investigators to publish data from research projects funded by the NSF Geosciences Directorate (GEO) Division of Ocean Sciences (OCE) Biological and Chemical Oceanography Sections and Polar Programs (PLR) Antarctic Sciences Organisms & Ecosystems Program (ANT). Since 2006, researchers have been contributing new data to the BCO-DMO data system. As the data from new research efforts have been added to the data previously shared by U.S. GLOBEC and U.S. JGOFS researchers, the BCO-DMO system has developed into a rich repository of data from ocean, coastal, and Great Lakes research programs. The metadata records for the original research program data (prior to 2006) were stored in human-readable flat files of text, translated on-demand to Web-retrievable files. Beginning in 2006, the metadata records from multiple data systems managed by BCO-DMO were ingested into a relational database (MySQL). Since that time, efforts have been made to incorporate lists of controlled vocabulary terms for key information concepts stored in the MySQL database (e.g. names of research programs, deployments, instruments and measurements). This presents a challenge for a data system that includes legacy data and is continually expanding with the addition of new contributions. Over the years, BCO-DMO has developed a series of data delivery systems driven by the supporting metadata. Improved access to research data, a primary goal of the BCO-DMO project, is achieved through geospatial and text-based data access systems that support data discovery, access, display, assessment, integration, and export of data resources. The addition of a semantically-enabled search capability improves data discovery options, particularly for those investigators whose research interests are cross-domain and multi-disciplinary. Current efforts by BCO-DMO staff members are focused on identifying globally unique, persistent identifiers to unambiguously identify resources of interest curated by and available from BCO-DMO. The process involves several essential components: (1) identifying a trusted authoritative source of complementary content and the appropriate contact; (2) determining the globally unique, persistent identifier system for resources of interest; and (3) negotiating the requisite syntactic and semantic exchange systems. A variety of technologies have been deployed, including: (1) controlled vocabulary term lists for some of the essential concepts/classes; (2) the Ocean Data Ontology; (3) publishing content as Linked Open Data; and (4) SPARQL queries and inference. The final results are emerging as a semantic layer comprising domain-specific controlled vocabularies typed to community standard definitions, an ontology with the concepts and relationships needed to describe ocean data, a semantically-enabled faceted search, and inferencing services. We are exploring use of these technologies to improve the accuracy of the BCO-DMO data collection and to facilitate exchange of information with complementary ocean data repositories. Integrating a semantic layer into the BCO-DMO data system architecture improves data and information resource discovery, access, and integration.
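A hedged sketch of what a semantically enabled query against such a system might look like, using SPARQLWrapper; the endpoint URL, namespace, and property names are assumptions for illustration, not BCO-DMO's actual ontology:

```python
# Query a hypothetical SPARQL endpoint for datasets collected with a CTD.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://lod.bco-dmo.org/sparql")  # hypothetical endpoint
sparql.setQuery("""
    PREFIX odo: <http://ocean-data.org/schema/>
    SELECT ?dataset ?title WHERE {
        ?dataset a odo:Dataset ;
                 odo:title ?title ;
                 odo:collectedWith ?instrument .
        ?instrument odo:instrumentType "CTD" .
    } LIMIT 25
""")
sparql.setReturnFormat(JSON)

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["dataset"]["value"], "-", row["title"]["value"])
```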
The visualization and availability of experimental research data at Elsevier
NASA Astrophysics Data System (ADS)
Keall, Bethan
2014-05-01
In the digital age, the visualization and availability of experimental research data is an increasingly prominent aspect of the research process and of the scientific output that researchers generate. We expect that the importance of data will continue to grow, driven by technological advancements, requirements from funding bodies to make research data available, and a developing research data infrastructure that is supported by data repositories, science publishers, and other stakeholders. Elsevier is actively contributing to these efforts, for example by setting up bidirectional links between online articles on ScienceDirect and relevant data sets on trusted data repositories. A key aspect of Elsevier's "Article of the Future" program, these links enrich the online article and make it easier for researchers to find relevant data and articles and help place data in the right context for re-use. Recently, we have set up such links with some of the leading data repositories in Earth Sciences, including the British Geological Survey, Integrated Earth Data Applications, the UK Natural Environment Research Council, and the Oak Ridge National Laboratory DAAC. Building on these links, Elsevier has also developed a number of data integration and visualization tools, such as an interactive map viewer that displays the locations of relevant data from PANGAEA next to articles on ScienceDirect. In this presentation we will give an overview of these and other capabilities of the Article of the Future, focusing on how they help advance communication of research in the digital age.
Krypton-81 in groundwater of the Culebra Dolomite near the Waste Isolation Pilot Plant, New Mexico
NASA Astrophysics Data System (ADS)
Sturchio, Neil C.; Kuhlman, Kristopher L.; Yokochi, Reika; Probst, Peter C.; Jiang, Wei; Lu, Zheng-Tian; Mueller, Peter; Yang, Guo-Min
2014-05-01
The Waste Isolation Pilot Plant (WIPP) in New Mexico is the first geologic repository for disposal of transuranic nuclear waste from defense-related programs of the US Department of Energy. It is constructed within halite beds of the Permian-age Salado Formation. The Culebra Dolomite, confined within Rustler Formation evaporites overlying the Salado Formation, is a potential pathway for radionuclide transport from the repository to the accessible environment in the human-disturbed repository scenario. Although extensive subsurface characterization and numerical flow modeling of groundwater has been done in the vicinity of the WIPP, few studies have used natural isotopic tracers to validate the flow models and to better understand solute transport at this site. The advent of Atom-Trap Trace Analysis (ATTA) has enabled routine measurement of cosmogenic 81Kr (half-life 229,000 yr), a near-ideal tracer for long-term groundwater transport. We measured 81Kr in saline groundwater sampled from two Culebra Dolomite monitoring wells near the WIPP site, and compared 81Kr model ages with reverse particle-tracking results of well-calibrated flow models. The 81Kr model ages are ~ 130,000 and ~ 330,000 yr for high-transmissivity and low-transmissivity portions of the formation, respectively. Compared with flow model results which indicate a relatively young mean hydraulic age (~ 32,000 yr), the 81Kr model ages imply substantial physical attenuation of conservative solutes in the Culebra Dolomite and provide limits on the effective diffusivity of contaminants into the confining aquitards.
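The model ages quoted above follow from the standard radioactive-decay relation t = (t_half / ln 2) · ln(R_atm / R_sample), where R is the 81Kr/Kr ratio normalized to the modern atmosphere. A minimal sketch with an illustrative (not measured) ratio:

```python
# Groundwater model age from a normalized 81Kr ratio; the sample ratio
# below is invented for illustration, not a WIPP measurement.
import math

T_HALF_YR = 229_000  # 81Kr half-life used in the abstract

def kr81_model_age(r_sample_over_atm):
    """Model age (years) from the sample/atmosphere 81Kr ratio."""
    return (T_HALF_YR / math.log(2)) * math.log(1.0 / r_sample_over_atm)

print(f"{kr81_model_age(0.67):,.0f} yr")  # ~132,000 yr, near the ~130,000 yr age
```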
Goss, Elizabeth; Link, Michael P; Bruinooge, Suanna S; Lawrence, Theodore S; Tepper, Joel E; Runowicz, Carolyn D; Schilsky, Richard L
2009-08-20
The American Society of Clinical Oncology (ASCO) Cancer Research Committee designed a qualitative research project to assess the attitudes of cancer researchers and compliance officials regarding compliance with the US Privacy Rule and to identify potential strategies for eliminating perceived or real barriers to achieving compliance. A team of three interviewers asked 27 individuals (13 investigators and 14 compliance officials) from 13 institutions to describe the anticipated approach of their institutions to Privacy Rule compliance in three hypothetical research studies. The interviews revealed that although researchers and compliance officials share the view that patients' cancer diagnoses should enjoy a high level of privacy protection, there are significant tensions between the two groups related to the proper standards for compliance necessary to protect patients. The disagreements are seen most clearly with regard to the appropriate definition of a "future research use" of protected health information in biospecimen and data repositories and the standards for a waiver of authorization for disclosure and use of such data. ASCO believes that disagreements related to compliance and the resulting delays in certain projects and abandonment of others might be eased by additional institutional training programs and consultation on Privacy Rule issues during study design. ASCO also proposes the development of best practices documents to guide 1) creation of data repositories, 2) disclosure and use of data from such repositories, and 3) the design of survivorship and genetics studies.
Performance Assessment of a Generic Repository in Bedded Salt for DOE-Managed Nuclear Waste
NASA Astrophysics Data System (ADS)
Stein, E. R.; Sevougian, S. D.; Hammond, G. E.; Frederick, J. M.; Mariner, P. E.
2016-12-01
A mined repository in salt is one of the concepts under consideration for disposal of DOE-managed defense-related spent nuclear fuel (SNF) and high level waste (HLW). Bedded salt is a favorable medium for disposal of nuclear waste due to its low permeability, high thermal conductivity, and ability to self-heal. Sandia's Generic Disposal System Analysis framework is used to assess the ability of a generic repository in bedded salt to isolate radionuclides from the biosphere. The performance assessment considers multiple waste types of varying thermal load and radionuclide inventory, the engineered barrier system comprising the waste packages, backfill, and emplacement drifts, and the natural barrier system formed by a bedded salt deposit and the overlying sedimentary sequence (including an aquifer). The model simulates disposal of nearly the entire inventory of DOE-managed, defense-related SNF (excluding Naval SNF) and HLW in a half-symmetry domain containing approximately 6 million grid cells. Grid refinement captures the detail of 25,200 individual waste packages in 180 disposal panels, associated access halls, and 4 shafts connecting the land surface to the repository. Equations describing coupled heat and fluid flow and reactive transport are solved numerically with PFLOTRAN, a massively parallel flow and transport code. Simulated processes include heat conduction and convection, waste package failure, waste form dissolution, radioactive decay and ingrowth, sorption, solubility limits, advection, dispersion, and diffusion. Simulations are run to 1 million years, and radionuclide concentrations are observed within an aquifer at a point approximately 4 kilometers downgradient of the repository. The software package DAKOTA is used to sample likely ranges of input parameters including waste form dissolution rates and properties of engineered and natural materials in order to quantify uncertainty in predicted concentrations and sensitivity to input parameters. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
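The abstract names DAKOTA for sampling input-parameter ranges; purely to illustrate the idea, here is a hedged sketch of a Latin hypercube draw over invented parameter ranges using scipy (the bounds and parameter choices are assumptions, not values from the study):

```python
# Draw 100 Latin hypercube samples over three uncertain inputs before
# running the flow/transport model once per realization.
from scipy.stats import qmc

# [log10 dissolution rate (1/yr), log10 permeability (m^2), porosity]
l_bounds = [-8.0, -22.0, 0.01]
u_bounds = [-4.0, -18.0, 0.20]

sampler = qmc.LatinHypercube(d=3, seed=42)
unit_sample = sampler.random(n=100)            # points in the unit cube
params = qmc.scale(unit_sample, l_bounds, u_bounds)

for run_id, (diss, perm, phi) in enumerate(params[:3]):
    print(f"run {run_id}: k_diss=1e{diss:.2f}/yr, k=1e{perm:.2f} m^2, phi={phi:.3f}")
```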
NASA Astrophysics Data System (ADS)
Stuckless, J. S.
2003-04-01
Natural analogues can contribute to understanding and predicting the performance of subsystems and processes affecting a mined geologic repository for high-level radioactive waste in several ways. Most importantly, analogues provide tests for various aspects of repository systems at dimensional scales and time spans that cannot be attained by experimental study. In addition, they provide a means for the general public to judge the predicted performance of a potential high-level nuclear waste repository in familiar terms, such that the average person can assess the anticipated long-term performance and other scientific conclusions. Hydrologists working on the Yucca Mountain Project (currently the U.S. Department of Energy's Office of Repository Development) have modeled the flow of water through the vadose zone at Yucca Mountain, Nevada, and particularly the interaction of vadose-zone water with mined openings. Analogues from both natural and anthropogenic examples confirm the prediction that most of the water moving through the vadose zone will move through the host rock and around tunnels. This can be seen both quantitatively, where direct comparison between seepage and net infiltration has been made, and qualitatively, in the excellent degree of preservation of archaeologic artifacts in underground openings. The latter include Paleolithic cave paintings in southwestern Europe, murals and artifacts in Egyptian tombs, painted subterranean Buddhist temples in India and China, and painted underground churches in Cappadocia, Turkey. Natural analogues also suggest that this diversion mechanism is more effective in porous media than in fractured media. Observations from natural analogues are also consistent with the modeled decrease in the percentage of infiltration that becomes seepage with a decrease in the amount of infiltration. Finally, analogues, such as tombs that have been partially filled by mud flows, suggest that the same capillary forces that keep water in the rock around underground openings will draw water toward buried waste packages if they are encased in backfill. Analogue work in support of the U.S. repository program continues in the U.S. Geological Survey, in cooperation with the U.S. Department of Energy.
Implementation of an OAIS Repository Using Free, Open Source Software
NASA Astrophysics Data System (ADS)
Flathers, E.; Gessler, P. E.; Seamon, E.
2015-12-01
The Northwest Knowledge Network (NKN) is a regional data repository located at the University of Idaho that focuses on the collection, curation, and distribution of research data. To support our home institution and others in the region, we offer services to researchers at all stages of the data lifecycle—from grant application and data management planning to data distribution and archiving. In this role, we recognize the need to work closely with other data management efforts at partner institutions and agencies, as well as with larger aggregation efforts such as our state geospatial data clearinghouses, data.gov, DataONE, and others. In the past, one of our challenges with monolithic, prepackaged data management solutions has been that customization can be difficult to implement and maintain, especially as new versions of the software are released that are incompatible with our local codebase. Our solution is to break the monolith up into its constituent parts, which offers us several advantages. First, our customizations tend to fall into areas that can be accessed through Application Program Interfaces (APIs) that are likely to remain stable over time, so our code stays compatible. Second, as components become obsolete or insufficient to meet new demands, we can replace individual components with minimal effect on the rest of the infrastructure, causing less disruption to operations. Other advantages include increased system reliability, staggered rollout of new features, enhanced compatibility with legacy systems, reduced dependence on a single software company as a point of failure, and the separation of development into manageable tasks. In this presentation, we describe our application of the Service Oriented Architecture (SOA) design paradigm to assemble a data repository that conforms to the Open Archival Information System (OAIS) Reference Model, primarily using a collection of free and open-source software. We detail the design of the repository, based upon open standards to support interoperability with other institutions' systems and with future versions of our own software components. We also describe the implementation process, including our use of GitHub as a collaboration tool and code repository.
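To make the SOA principle concrete, here is a hypothetical sketch (not NKN's actual code) of one repository function exposed behind a small, stable HTTP API, so the backend component behind it can be swapped without touching callers; the endpoint paths and payload fields are invented:

```python
# Minimal service facade: callers depend only on these URLs and schemas,
# not on whichever archival backend is currently plugged in.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/v1/ingest", methods=["POST"])
def ingest():
    # Accept a submission package and hand it to the current backend.
    package = request.get_json()
    return jsonify({"status": "accepted", "id": package.get("identifier")}), 202

@app.route("/api/v1/packages/<pkg_id>", methods=["GET"])
def dissemination(pkg_id):
    # Stubbed OAIS-style dissemination lookup for an archived package.
    return jsonify({"id": pkg_id, "state": "archived"})

if __name__ == "__main__":
    app.run(port=8080)
```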
Rolling Deck to Repository (R2R): Linking and Integrating Data for Oceanographic Research
NASA Astrophysics Data System (ADS)
Arko, R. A.; Chandler, C. L.; Clark, P. D.; Shepherd, A.; Moore, C.
2012-12-01
The Rolling Deck to Repository (R2R) program is developing infrastructure to ensure the underway sensor data from NSF-supported oceanographic research vessels are routinely and consistently documented, preserved in long-term archives, and disseminated to the science community. We have published the entire R2R Catalog as a Linked Data collection, making it easily accessible to encourage linking and integration with data at other repositories. We are developing the R2R Linked Data collection with specific goals in mind: 1.) We facilitate data access and reuse by providing the richest possible collection of resources to describe vessels, cruises, instruments, and datasets from the U.S. academic fleet, including data quality assessment results and clean trackline navigation. We are leveraging or adopting existing community-standard concepts and vocabularies, particularly concepts from the Biological and Chemical Oceanography Data Management Office (BCO-DMO) ontology and terms from the pan-European SeaDataNet vocabularies, and continually re-publish resources as new concepts and terms are mapped. 2.) We facilitate data citation through the entire data lifecycle from field acquisition to shoreside archiving to (ultimately) global syntheses and journal articles. We are implementing globally unique and persistent identifiers at the collection, dataset, and granule levels, and encoding these citable identifiers directly into the Linked Data resources. 3.) We facilitate linking and integration with other repositories that publish Linked Data collections for the U.S. academic fleet, such as BCO-DMO and the Index to Marine and Lacustrine Geological Samples (IMLGS). We are initially mapping datasets at the resource level, and plan to eventually implement rule-based mapping at the concept level. We work collaboratively with partner repositories to develop best practices for URI patterns and consensus on shared vocabularies. The R2R Linked Data collection is implemented as a lightweight "virtual RDF graph" generated on-the-fly from our SQL database using the D2RQ (http://d2rq.org) package. In addition to the default SPARQL endpoint for programmatic access, we are developing a Web-based interface from open-source software components that offers user-friendly browse and search.
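A hedged sketch of what publishing a cruise as Linked Data might look like with rdflib; the namespace, URIs, and property names are placeholders for illustration, not R2R's actual vocabulary:

```python
# Mint URIs for a cruise and a dataset, attach a citable identifier,
# and serialize the triples to Turtle.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

R2R = Namespace("http://example.org/r2r/vocab/")  # hypothetical namespace
g = Graph()
g.bind("dcterms", DCTERMS)
g.bind("r2r", R2R)

cruise = URIRef("http://example.org/r2r/cruise/AT26-05")
dataset = URIRef("http://example.org/r2r/dataset/12345")

g.add((cruise, RDF.type, R2R.Cruise))
g.add((cruise, DCTERMS.title, Literal("Atlantis cruise AT26-05")))
g.add((dataset, RDF.type, R2R.Dataset))
g.add((dataset, DCTERMS.isPartOf, cruise))
g.add((dataset, DCTERMS.identifier, Literal("doi:10.xxxx/example")))  # citable ID

print(g.serialize(format="turtle"))
```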
Rolling Deck to Repository (R2R): Products and Services for the U.S. Research Fleet Community
NASA Astrophysics Data System (ADS)
Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Smith, S. R.; Stocks, K. I.
2016-02-01
The Rolling Deck to Repository (R2R) program is working to ensure open access to environmental sensor data routinely acquired by the U.S. academic research fleet. Currently 25 vessels deliver 7 TB/year of data to R2R from a suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. R2R ensures these data are preserved in trusted repositories, discoverable via standard protocols, and adequately documented for reuse. R2R has recently expanded to include the vessels Sikuliaq, operated by the University of Alaska; Falkor, operated by the Schmidt Ocean Institute; and Ronald H. Brown and Okeanos Explorer, operated by NOAA. R2R maintains a master catalog of U.S. research cruises, currently holding over 4,670 expeditions including vessel and cruise identifiers, start/end dates and ports, project titles and funding awards, science parties, dataset inventories with instrument types and file formats, data quality assessments, and links to related content at other repositories. Standard post-field cruise products are published including shiptrack navigation, near-real-time MET/TSG data, underway geophysical profiles, and CTD profiles. Software tools available to users include the R2R Event Logger and the R2R Nav Manager. A Digital Object Identifier (DOI) is published for each cruise, original field sensor dataset, standard post-field product, and document (e.g. cruise report) submitted by the science party. Scientists are linked to personal identifiers such as ORCIDs where available. Using standard identifiers such as DOIs and ORCIDs facilitates linking with journal publications and generation of citation metrics. R2R collaborates in the Ocean Data Interoperability Platform (ODIP) to strengthen links among regional and national data systems, populates U.S. cruises in the POGO global catalog, and is working toward membership in the DataONE alliance. It is a lead partner in the EarthCube GeoLink project, developing Semantic Web technologies to share data and documentation between repositories, and in the newly-launched EarthCube SeaView project, delivering data from R2R and other ocean data facilities to scientists using the Ocean Data View (ODV) software tool.
Tracking Research Data Footprints via Integration with Research Graph
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Wang, J.; Aryani, A.; Conlon, M.; Wyborn, L. A.; Choudhury, S. A.
2017-12-01
The researcher of today is likely to be part of a team that will use subsets of data from at least one, if not more, external repositories, and that same data could be used by multiple researchers for many different purposes. At best, the repositories that host these data will know who is accessing them, but rarely what they are being used for, with the result that the funders of data-collection programs and the repositories that store the data are unlikely to know: 1) which research funding contributed to the collection and preservation of a dataset, and 2) which data contributed to high-impact research and publications. In times of funding shortages, there is a growing need to be able to trace the footprint of a data set from the originator that collected the data to the repository that stores the data and ultimately to any derived publications. The Research Data Alliance's Data Description Registry Interoperability Working Group (DDRIWG) has addressed this problem through the development of a distributed graph, called Research Graph, that can map each piece of the research-interaction puzzle by building aggregated graphs. It can connect datasets on the basis of co-authorship or other collaboration models such as joint funding and grants, and it can connect research datasets, publications, grants, and researcher profiles across research repositories and infrastructures such as DataCite and ORCID. National Computational Infrastructure (NCI) in Australia is one of the early adopters of Research Graph. The graphical view and quantitative analysis help NCI track the usage of their national reference data collections, thus quantifying the role that these NCI-hosted data assets play within the funding-researcher-data-publication cycle. The graph can unlock the complex interactions of research projects by tracking the contribution of datasets, the various funding bodies, and the downstream data users. The RMap Project is a similar initiative that aims to represent complex relationships among scholarly publications and their underlying data, including IEEE publications. It is hoped that RMap and Research Graph can be combined in the near future and that physical samples can be added to Research Graph.
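A toy sketch of the footprint-tracing idea, modeling grants, datasets, and publications as a directed graph with networkx; the node names are invented, and the real Research Graph schema is far richer:

```python
# Ask which publications a dataset ultimately contributed to by
# following directed edges (funding -> data -> papers -> citing papers).
import networkx as nx

g = nx.DiGraph()
g.add_edge("grant:NSF-123", "dataset:ocean-temps")       # funding -> data
g.add_edge("dataset:ocean-temps", "paper:doi-10.1/abc")  # data -> publication
g.add_edge("dataset:ocean-temps", "paper:doi-10.1/def")
g.add_edge("paper:doi-10.1/abc", "paper:doi-10.1/ghi")   # citation link

downstream = nx.descendants(g, "dataset:ocean-temps")
print("Footprint of dataset:ocean-temps:", sorted(downstream))
# -> all papers, direct or via citation, that trace back to the dataset
```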
DOE Office of Scientific and Technical Information (OSTI.GOV)
H. Marr
2006-10-25
The purpose of this calculation is to evaluate the thermal performance of the Naval Long and Naval Short spent nuclear fuel (SNF) waste packages (WP) in the repository emplacement drift. The scope of this calculation is limited to the determination of the temperature profiles upon the surfaces of the Naval Long and Short SNF waste package for up to 10,000 years of emplacement. The temperatures on the top of the outside surface of the naval canister are the thermal interfaces for the Naval Nuclear Propulsion Program (NNPP). The results of this calculation are intended to support Licensing Application design activities.
Identification of a selective small molecule inhibitor of breast cancer stem cells.
Germain, Andrew R; Carmody, Leigh C; Morgan, Barbara; Fernandez, Cristina; Forbeck, Erin; Lewis, Timothy A; Nag, Partha P; Ting, Amal; VerPlank, Lynn; Feng, Yuxiong; Perez, Jose R; Dandapani, Sivaraman; Palmer, Michelle; Lander, Eric S; Gupta, Piyush B; Schreiber, Stuart L; Munoz, Benito
2012-05-15
A high-throughput screen (HTS) with the National Institute of Health-Molecular Libraries Small Molecule Repository (NIH-MLSMR) compound collection identified a class of acyl hydrazones to be selectively lethal to breast cancer stem cell (CSC) enriched populations. Medicinal chemistry efforts were undertaken to optimize potency and selectivity of this class of compounds. The optimized compound was declared as a probe (ML239) with the NIH Molecular Libraries Program and displayed greater than 20-fold selective inhibition of the breast CSC-like cell line (HMLE_sh_Ecad) over the isogenic control line (HMLE_sh_GFP). Copyright © 2012 Elsevier Ltd. All rights reserved.
Wynden, Rob; Anderson, Nick; Casale, Marco; Lakshminarayanan, Prakash; Anderson, Kent; Prosser, Justin; Errecart, Larry; Livshits, Alice; Thimman, Tim; Weiner, Mark
2011-01-01
Within the CTSA (Clinical Translational Sciences Awards) program, academic medical centers are tasked with the storage of clinical formulary data within an Integrated Data Repository (IDR) and the subsequent exposure of that data over grid computing environments for hypothesis generation and cohort selection. Formulary data collected over long periods of time across multiple institutions requires normalization of terms before those data sets can be aggregated and compared. This paper sets forth a solution to the challenge of generating derived aggregated normalized views from large, distributed data sets of clinical formulary data intended for re-use within clinical translational research.
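One common approach to the term-normalization step described above is mapping free-text formulary entries to RxNorm concept identifiers. A hedged sketch using the public NLM RxNav REST service (the drug names are examples only, and this is not necessarily the method the authors used):

```python
# Map local formulary terms to RxNorm CUIs via the RxNav REST API.
import requests

def rxnorm_cui(drug_name):
    """Return the RxNorm CUI for a drug name, or None if not found."""
    resp = requests.get(
        "https://rxnav.nlm.nih.gov/REST/rxcui.json",
        params={"name": drug_name},
        timeout=10,
    )
    resp.raise_for_status()
    ids = resp.json().get("idGroup", {}).get("rxnormId", [])
    return ids[0] if ids else None

for local_term in ["lisinopril 10 mg oral tablet", "atorvastatin"]:
    print(local_term, "->", rxnorm_cui(local_term))
```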
Toward the Development of a Sustainable Scientific Research Culture in Azerbaijan (2011-2015).
Aliyeva, Saida; Flanagan, Peter; Johnson, April; Strelow, Lisa
2016-01-01
This review describes the especially dangerous pathogens research program in Azerbaijan (AJ) funded by the US Defense Threat Reduction Agency under the Cooperative Biological Engagement Program (CBEP) from 2011 through 2015. The objectives of the CBEP are to prevent the proliferation of biological weapons; to consolidate and secure collections of dangerous pathogens in central repositories; to strengthen biosafety and biosecurity of laboratory facilities; and to improve partner nations' ability to detect, diagnose, report, and respond to outbreaks of disease caused by especially dangerous pathogens. One of the missions of the CBEP is therefore to increase the research skills and proficiency of partner country scientists. The program aims to fulfill this mission by sponsoring scientific research projects that exercise the modern diagnostic techniques available in the CBEP-engaged laboratories and the enhanced disease surveillance/control programs. To strengthen the local scientists' ability to develop research ideas, write grant proposals, and conduct research independently, in-country CBEP integrating contractor personnel have mentored scientists across AJ and conducted workshops to address technical gaps. As a result of CBEP engagement, seven research projects developed and led by AJ scientists have been funded, and five projects are currently in various stages of implementation. The Defense Threat Reduction Agency has also sponsored AJ scientist participation at international scientific conferences to introduce and integrate them into the global scientific community. The efforts summarized in this review represent the first steps in an ongoing process that will ultimately provide AJ scientists with the skills and resources to plan and implement research projects of local and regional relevance.
Reuse: A knowledge-based approach
NASA Technical Reports Server (NTRS)
Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui
1992-01-01
This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper has described in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.
Walsh, Stephen Joseph; Meador, Michael R.
1998-01-01
Fish community structure is characterized by the U.S. Geological Survey's National Water-Quality Assessment (NAWQA) Program as part of a perennial, multidisciplinary approach to evaluating the physical, chemical, and biological conditions of the Nation's water resources. The objective of quality assurance and quality control of fish taxonomic data that are collected as part of the NAWQA Program is to establish uniform guidelines and protocols for the identification, processing, and archiving of fish specimens to ensure that accurate and reliable data are collected. Study unit biologists, collaborating with regional biologists and fish taxonomic specialists, prepare a pre-sampling study plan that includes a preliminary faunal list and identification of an ichthyological curation center for receiving preserved fish specimens. Problematic taxonomic issues and protected taxa also are identified in the study plan, and collecting permits are obtained in advance of sampling activities. Taxonomic specialists are selected to identify fish specimens in the field and to assist in determining what fish specimens should be sacrificed, fixed, and preserved for laboratory identification, independent taxonomic verification, and long-term storage in reference or voucher collections. Quantitative and qualitative sampling of fishes follows standard methods previously established for the NAWQA Program. Common ichthyological techniques are used to process samples in the field and prepare fish specimens to be returned to the laboratory or sent to an institutional repository. Taxonomic identifications are reported by using a standardized list of scientific names that provides nomenclatural consistency and uniformity across study units.
Oceanographic Research Capacity in the US Virgin Islands
NASA Astrophysics Data System (ADS)
Jobsis, P.; Habtes, S. Y.
2016-02-01
The University of the Virgin Islands (UVI), a small HBCU with campuses on both St Thomas and St Croix, has a growing marine science department that is quickly increasing its capacity for oceanographic monitoring and research due to VI-EPSCoR (the National Science Foundation's Experimental Program to Stimulate Competitive Research in the Virgin Islands) and associations with CariCOOS (the Caribbean Coastal Ocean Observing System). CariCOOS is managed through the University of Puerto Rico Mayaguez, with funding from NOAA's Integrated Ocean Observing System (IOOS). Over the past five years, two oceanographic data buoys have been deployed, increasing the real-time oceanographic data available for the northeastern Caribbean. In addition, researchers at UVI have deployed ADCPs and conducted CTD casts at relevant research sites as part of routine territorial monitoring programs. With VI-EPSCoR funding, UVI has developed an Institute for Geocomputational Analysis and Statistics (GeoCAS) to conduct geospatial analysis and to act as a data repository and hosting/serving center for research, environmental, and other relevant data. Much of the oceanographic data is available at www.caricoos.org and www.geocas.uvi.edu. As the marine research infrastructure at UVI continues to grow, the oceanographic and marine biology research program at the University's Center for Marine and Environmental Studies will continue to expand. This will benefit not only UVI researchers but also any researcher with interests in this region of the Caribbean.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-08-01
During the first half of fiscal year 1995, most activities at the Yucca Mountain Site Characterization Project were directed at implementing the Program Plan developed by the Office of Civilian Radioactive Waste Management. The Plan is designed to enable the Office to make measurable and significant progress toward key objectives over the next five years within the financial resources that can be realistically expected. Activities this period focused on the immediate goal of determining by 1998 whether Yucca Mountain, Nevada, is technically suitable as a possible site for a geologic repository for the permanent disposal of spent nuclear fuel and high-level radioactive waste. Work on the Project advanced in several critical areas, including programmatic activities such as issuing the Program Plan, completing the first technical basis report to support the assessment of three 10 CFR 960 guidelines, developing the Notice of Intent for the Environmental Impact Statement, submitting the License Application Annotated Outline, and beginning a rebaselining effort to conform with the goals of the Program Plan. Scientific investigation and analysis of the site and design and construction activities to support the evaluation of the technical suitability of the site also advanced. Specific details relating to all Project activities and reports generated are presented in this report.
ACToR Chemical Structure processing using Open Source ...
ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from over 1,950 public sources. ACToR contains chemical structure information and toxicological data for over 558,000 unique chemicals. The database primarily includes data from NCCT research programs: in vivo toxicity data from ToxRef, human exposure data from ExpoCast, high-throughput screening data from ToxCast, and high-quality chemical structure information from the EPA DSSTox program. The DSSTox database is a chemical structure inventory for the NCCT programs and currently has about 16,000 unique structures. Also included are data from PubChem, ChemSpider, USDA, FDA, NIH, and several other public data sources. ACToR has been a resource to various international and national research groups. Most of our recent efforts on ACToR are focused on improving the structural identifiers and physicochemical properties of the chemicals in the database. Organizing this huge collection of data and improving the chemical structure quality of the database has posed some major challenges. Workflows have been developed to process structures, calculate chemical properties, and identify relationships between CAS numbers. The structure-processing workflow integrates web services (PubChem and NIH NCI Cactus) to d
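As an illustration of the kind of structure-processing step such workflows perform, here is a hedged sketch that resolves a chemical name or CAS number to a canonical SMILES string via PubChem's public PUG REST service (the identifiers are examples; a production workflow would batch and cache requests):

```python
# Resolve a name or CAS RN to canonical SMILES using PubChem PUG REST.
import requests

def pubchem_smiles(identifier):
    """Look up canonical SMILES for a name or CAS RN via PubChem."""
    url = (
        "https://pubchem.ncbi.nlm.nih.gov/rest/pug/compound/name/"
        f"{identifier}/property/CanonicalSMILES/JSON"
    )
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    props = resp.json()["PropertyTable"]["Properties"]
    return props[0]["CanonicalSMILES"]

print(pubchem_smiles("50-00-0"))   # formaldehyde, by CAS RN
print(pubchem_smiles("benzene"))   # by common name
```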
Concentrations of indoor pollutants (CIP) database user's manual (Version 4. 0)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apte, M.G.; Brown, S.R.; Corradi, C.A.
1990-10-01
This is the latest release of the database and the user manual. The user manual is a tutorial and reference for utilizing the CIP Database system. An installation guide is included to cover various hardware configurations. Numerous examples and explanations of the dialogue between the user and the database program are provided. It is hoped that this resource will, along with on-line help and the menu-driven software, make for a quick and easy learning curve. For the purposes of this manual, it is assumed that the user is acquainted with the goals of the CIP Database, which are: (1) to collect existing measurements of concentrations of indoor air pollutants in a user-oriented database and (2) to provide a repository of references citing measured field results openly accessible to a wide audience of researchers, policy makers, and others interested in the issues of indoor air quality. The database software, as distinct from the data, is contained in two files, CIP.EXE and PFIL.COM. CIP.EXE is made up of a number of programs written in dBase III command code and compiled using Clipper into a single, executable file. PFIL.COM is a program written in Turbo Pascal that handles the output of summary text files and is called from CIP.EXE. Version 4.0 of the CIP Database is current through March 1990.
DNA linkage studies of degenerative retinal diseases.
Daiger, S P; Heckenlively, J R; Lewis, R A; Pelias, M Z
1987-01-01
DNA linkage studies of human genetic diseases have led to rapid characterization of a number of otherwise intractable disease loci. Detection of a linked DNA marker, the first step in "reverse genetics", has permitted cloning of the genes for Duchenne muscular dystrophy, retinoblastoma, and chronic granulomatous disease, among others. Thus, the case for applying these techniques to retinitis pigmentosa and related diseases, and the urgency in capitalizing on molecular developments, is justified and compelling. The first major success regarding RP was in demonstrating linkage of the DNA marker DXS7 (L1.28) to XRP. For autosomal forms of the disease, conventional linkage studies have provided tentative evidence for linkage of ADRP to the Rh blood group on chromosome 1p and for linkage of Usher's syndrome to Gc and 4q. These provisional assignments are, at least, an important starting point for DNA analysis. The Support Program for DNA Linkage Studies of Degenerative Retinal Diseases was established to provide access for the scientific community to appropriate families, using the resources of the Human Genetic Mutant Cell Repository to prepare, store, and distribute lymphoblast lines. To date, two extensive, well-characterized families are included in the program: the autosomal dominant RP family UCLA-RP01 and the Usher's syndrome family LSU-US01. It is highly likely that rapid progress will be made in mapping and characterizing the inherited retinal dystrophies. We believe the support program will facilitate this progress.
Thornburg, Christopher C; Britt, John R; Evans, Jason R; Akee, Rhone K; Whitt, James A; Trinh, Spencer K; Harris, Matthew J; Thompson, Jerell R; Ewing, Teresa L; Shipley, Suzanne M; Grothaus, Paul G; Newman, David J; Schneider, Joel P; Grkovic, Tanja; O'Keefe, Barry R
2018-06-13
The US National Cancer Institute's (NCI) Natural Product Repository is one of the world's largest and most diverse collections of natural products, containing over 230,000 unique extracts derived from plant, marine, and microbial organisms collected from biodiverse regions throughout the world. Importantly, this national resource is available to the research community for the screening of extracts and the isolation of bioactive natural products. However, despite the success of natural products in drug discovery, enthusiasm for the high-throughput screening (HTS) of crude natural product extract libraries in targeted assay systems has been dampened by compatibility issues that make extracts challenging for liquid-handling systems, extended timelines that complicate natural product-based drug discovery efforts, and the presence of pan-assay interfering compounds. To address these limitations, the NCI Program for Natural Product Discovery (NPNPD), a newly launched national program to advance natural product discovery technologies and facilitate the discovery of structurally defined, validated lead molecules ready for translation, will create a prefractionated library from over 125,000 natural product extracts, with the aim of producing a publicly accessible, HTS-amenable library of >1,000,000 fractions. This library, representing perhaps the largest accumulation of natural product-based fractions in the world, will be made available free of charge in 384-well plates for screening against all disease states in an effort to reinvigorate natural product-based drug discovery.
NASA Life Sciences Data Repositories: Tools for Retrospective Analysis and Future Planning
NASA Technical Reports Server (NTRS)
Thomas, D.; Wear, M.; VanBaalen, M.; Lee, L.; Fitts, M.
2011-01-01
As NASA transitions from the Space Shuttle era into the next phase of space exploration, the need to ensure the capture, analysis, and application of its research and medical data is of greater urgency than at any previous time. In this era of limited resources and challenging schedules, the Human Research Program (HRP) based at NASA's Johnson Space Center (JSC) recognizes the need to extract the greatest possible amount of information from the data already captured, as well as to focus current and future research funding on addressing the HRP goal to provide human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. To this end, the Science Management Office and the Medical Informatics and Health Care Systems Branch within the HRP and the Space Medicine Division have been working to make both research data and clinical data more accessible to the user community. The Life Sciences Data Archive (LSDA), the research repository housing data and information regarding the physiologic effects of microgravity, and the Lifetime Surveillance of Astronaut Health (LSAH-R), the clinical repository housing astronaut data, have joined forces to achieve this goal. The task of both repositories is to acquire, preserve, and distribute data and information both within the NASA community and to the science community at large. This is accomplished via the LSDA's public website (http://lsda.jsc.nasa.gov), which allows access to experiment descriptions, including hardware, datasets, key personnel, and mission descriptions, and provides a mechanism for researchers to request additional research and clinical data that are not accessible from the public website. This will result in making the work of NASA and its partners available to the wider science community, both domestic and international. The desired outcome is the use of these data for knowledge discovery, retrospective analysis, and planning of future research studies.
Stuckless, John S.; Levich, Robert A.
2012-01-01
This hydrology and geochemistry volume is a companion volume to the 2007 Geological Society of America Memoir 199, The Geology and Climatology of Yucca Mountain and Vicinity, Southern Nevada and California, edited by Stuckless and Levich. The work in both volumes was originally reported in the U.S. Department of Energy regulatory document Yucca Mountain Site Description, for the site characterization study of Yucca Mountain, Nevada, as the proposed U.S. geologic repository for high-level radioactive waste. The selection of Yucca Mountain resulted from a nationwide search and numerous committee studies during a period of more than 40 yr. The waste, largely from commercial nuclear power reactors and the government's nuclear weapons programs, is characterized by intense penetrating radiation and high heat production, and, therefore, it must be isolated from the biosphere for tens of thousands of years. The extensive, unique, and often innovative geoscience investigations conducted at Yucca Mountain for more than 20 yr make it one of the most thoroughly studied geologic features on Earth. The results of these investigations contribute extensive knowledge to the hydrologic and geochemical aspects of radioactive waste disposal in the unsaturated zone. The science, analyses, and interpretations are important not only to Yucca Mountain, but also to the assessment of other sites or alternative processes that may be considered for waste disposal in the future. Groundwater conditions, processes, and geochemistry, especially in combination with the heat from radionuclide decay, are integral to the ability of a repository to isolate waste. Hydrology and geochemistry are discussed here in chapters on unsaturated zone hydrology, saturated zone hydrology, paleohydrology, hydrochemistry, radionuclide transport, and thermally driven coupled processes affecting long-term waste isolation. This introductory chapter reviews some of the reasons for choosing to study Yucca Mountain as a repository site.
Bouhaddou, Omar; Lincoln, Michael J.; Maulden, Sarah; Murphy, Holli; Warnekar, Pradnya; Nguyen, Viet; Lam, Siew; Brown, Steven H; Frankson, Ferdinand J.; Crandall, Glen; Hughes, Carla; Sigley, Roger; Insley, Marcia; Graham, Gail
2006-01-01
The Veterans Administration (VA) has adopted an ambitious program to standardize its clinical terminology to comply with industry-wide standards. The VA is using commercially available tools and in-house software to create a high-quality reference terminology system. The terminology will be used by current and future applications with no planned disruption to operational systems. The first large customer of the group is the national VA Health Data Repository (HDR). Unique enterprise identifiers are assigned to each standard term, and a rich network of semantic relationships makes the resulting data not only recognizable, but highly computable and reusable in a variety of applications, including decision support and data sharing with partners such as the Department of Defense (DoD). This paper describes the specific methods and approaches that the VA has employed to develop and implement this innovative program in existing information systems. The goal is to share with others our experience with key issues that face our industry as we move toward an electronic health record for every individual. PMID:17238306
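To make the idea of a reference terminology concrete, here is a minimal sketch of term normalization against a concept table carrying unique identifiers and semantic relationships. The identifier format, concept names, and structures below are invented for illustration and are not the VA's actual system.

```python
# Illustrative sketch only: toy structures standing in for a reference
# terminology system. All identifiers (the "VUID:" strings) and concepts
# here are hypothetical.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Concept:
    uid: str                                    # unique enterprise identifier
    preferred_name: str
    synonyms: list = field(default_factory=list)
    relationships: dict = field(default_factory=dict)  # e.g. {"is_a": [...]}

terminology = {
    "VUID:4007": Concept("VUID:4007", "Hypertension",
                         ["high blood pressure"], {"is_a": ["VUID:4000"]}),
    "VUID:4000": Concept("VUID:4000", "Cardiovascular disorder"),
}

def normalize(local_term: str) -> Optional[Concept]:
    """Map a free-text local term to a standard concept, if one is known."""
    needle = local_term.lower()
    for concept in terminology.values():
        if needle == concept.preferred_name.lower() or \
           needle in (s.lower() for s in concept.synonyms):
            return concept
    return None

print(normalize("high blood pressure").uid)     # -> VUID:4007
```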
Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William
2009-01-01
This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
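As a flavor of the data-assessment methods such a guideline covers, the sketch below performs a conjugate beta-binomial update of a component failure probability from demand data; this is a standard textbook technique, and the prior parameters and counts are invented for illustration.

```python
# Conjugate beta-binomial update of a failure probability -- one of the
# standard Bayesian data-assessment methods in reliability analysis.
# Prior parameters and observed counts below are hypothetical.
from scipy import stats

alpha0, beta0 = 1.0, 19.0          # Beta prior (prior mean p = 0.05)
failures, demands = 2, 40          # hypothetical observed data

alpha_post = alpha0 + failures
beta_post = beta0 + (demands - failures)

posterior = stats.beta(alpha_post, beta_post)
print(f"posterior mean p = {posterior.mean():.4f}")
print(f"90% credible interval = "
      f"({posterior.ppf(0.05):.4f}, {posterior.ppf(0.95):.4f})")
```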
VisRseq: R-based visual framework for analysis of sequencing data
2015-01-01
Background Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. Results We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. Conclusions To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights. PMID:26328469
VisRseq: R-based visual framework for analysis of sequencing data.
Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M
2015-01-01
Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.
Source and long-term behavior of transuranic aerosols in the WIPP environment.
Thakur, P; Lemons, B G
2016-10-01
The source and long-term behavior of transuranic aerosols ((239+240)Pu, (238)Pu, and (241)Am) in ambient air samples collected at and near the Waste Isolation Pilot Plant (WIPP) deep geologic repository site were investigated using historical data from an independent monitoring program conducted by the Carlsbad Environmental Monitoring and Research Center and an oversight monitoring program conducted by the management and operating contractor for WIPP at and near the facility. An analysis of historical data indicates frequent detections of (239+240)Pu and (241)Am, whereas (238)Pu is detected infrequently. Peaks in (239+240)Pu and (241)Am concentrations in ambient air generally occur in the March-to-June timeframe, when strong and gusty winds in the area frequently give rise to blowing dust. Long-term measurements of plutonium isotopes (1985-2015) in the WIPP environment suggest that the resuspension of previously contaminated soils is likely the primary source of plutonium in the ambient air samples from WIPP and its vicinity. There is no evidence that WIPP is a source of environmental contamination that can be considered significant by any health-based standard.
Promoting Academic Physicists, Their Students, and Their Research through Library Partnerships
NASA Astrophysics Data System (ADS)
Rozum, B.; Wesolek, A.
2012-12-01
At many institutions, attracting and mentoring quality students is of key importance. Through their developing careers, typically under the tutelage of one primary faculty member, students build portfolios, prepare for graduate school, and apply to post-doc programs or faculty positions. Often, though, the corpus of that primary faculty member's work is not available in a single location. This is a disadvantage both for current students, who wish to highlight the importance of their work within the context of a research group, and for the department, which can miss opportunities to attract high-quality future students. Utah State University Libraries hosts a thriving institutional repository, DigitalCommons@USU, which provides open access to scholarly works, research, reports, publications, and journals produced by Utah State University faculty, staff, and students. The Library and the Physics Department developed a partnership to transcend traditional library repository architecture and emphasize faculty research groups within the department. Previously, only student theses and dissertations were collected, and they were not associated with the department in any way. Now student presentations, papers, and posters appear with other faculty works all in the same research work space. This poster session highlights the features of the University's repository and describes what is required to establish a similar structure at other academic institutions. We anticipate several long-term benefits of this new structure. Students are pleased with the increased visibility of their research and with having an online presence through their "Selected Works" personal author site. Faculty are pleased with the opportunity to highlight their research and the potential to attract new students to their research groups. This new repository model also allows the library to amplify the existing scientific outreach initiatives of the physics department. One example of this is a recent exhibit created in the Library showcasing a student research group's 30-year history of sending payloads into space. The exhibit was a direct result of archiving the work of student researchers in the institutional repository. From the perspective of the Library, the benefits are also impressive. The Library is able to build its institutional repository, develop strong relations with faculty in the Physics Department, and have access to unpublished reports that otherwise might be lost. Establishing research groups' presence in DigitalCommons@USU provided an opportunity to meet with the Physics graduate students to discuss setting up online web portfolios, archiving their publications, and understanding publisher contracts. Developing partnerships between academic units and libraries is one more method to reach out to potential students, promote research, and showcase the talents of faculty and students. Using the Library's institutional repository to do this is beneficial for everyone.
Solutions for research data from a publisher's perspective
NASA Astrophysics Data System (ADS)
Cotroneo, P.
2015-12-01
Sharing research data has the potential to make research more efficient and reproducible. Elsevier has developed several initiatives to address the different needs of research data users. These include PANGAEA Linked Data, which provides geo-referenced, citable datasets from the earth and life sciences, archived as supplementary data from publications by the PANGAEA data repository; Mendeley Data, which allows users to freely upload and share their data; a database linking program that creates links between articles on ScienceDirect and datasets held in external data repositories such as EarthRef and EarthChem; a pilot for searching for research data through a map interface; an open data pilot that allows authors publishing in Elsevier journals to store and share research data and make this publicly available as a supplementary file alongside their article; and data journals, including Data in Brief, which allow researchers to share their data open access. Through these initiatives, researchers are not only encouraged to share their research data, but also supported in optimizing their research data management. By making data more readily citable and visible, and hence generating citations for authors, these initiatives also aim to ensure that researchers get the recognition they deserve for publishing their data.
Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A
2008-02-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).
Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.
2008-01-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service Oriented Architecture (SSOA) for cancer research by the National Cancer Institute’s cancer Biomedical Informatics Grid (caBIG™). PMID:17512259
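A toy illustration of the two facets of interoperability described above (this is not the caCORE API; all identifiers and names are hypothetical): a data element registered in a metadata repository carries both a machine-readable structure and a binding to a controlled-vocabulary concept, so a receiving system can both parse and interpret a record.

```python
# Toy illustration (not the caCORE API; identifiers are hypothetical).
# A registered data element carries a machine-readable structure
# (syntactic interoperability) and a binding to a controlled-vocabulary
# concept (semantic interoperability).
import json

data_element = {
    "public_id": "DE-0001",            # metadata-repository identifier
    "name": "Patient Diagnosis",
    "value_domain": {"datatype": "string"},
    "concept_code": "C2991",           # controlled-terminology concept
}

record = {"patient_id": "P-42", "DE-0001": "Neoplasm"}

# Any system sharing the registry and terminology can both parse the
# record (same structure) and interpret it (same concept binding).
print(json.dumps({"element": data_element, "record": record}, indent=2))
```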
Masseroli, M; Bonacina, S; Pinciroli, F
2004-01-01
Current developments in distributed information technologies and Java programming make it possible to employ them in the medical arena as well, to support the retrieval, integration and evaluation of heterogeneous data and multimodal images in a web browser environment. With this aim, we used them to implement a client-server architecture based on software agents. The client side is a Java applet running in a web browser and providing a friendly medical user interface to browse and visualize different patient and medical test data, integrating them properly. The server side manages secure connections and queries to heterogeneous remote databases and file systems containing patient personal and clinical data. Based on the Java Advanced Imaging API, processing and analysis tools were developed to support the evaluation of remotely retrieved bioimages through the quantification of their features in different regions of interest. The Java platform independence allows the centralized management of the implemented prototype and its deployment to each site where an intranet or internet connection is available. By giving healthcare providers effective support for comprehensively browsing, visualizing and evaluating medical images and records located in different remote repositories, the developed prototype can represent an important aid in providing more efficient diagnoses and medical treatments.
Master Metadata Repository and Metadata-Management System
NASA Technical Reports Server (NTRS)
Armstrong, Edward; Reed, Nate; Zhang, Wen
2007-01-01
A master metadata repository (MMR) software system manages the storage and searching of metadata pertaining to data from national and international satellite sources of the Global Ocean Data Assimilation Experiment (GODAE) High Resolution Sea Surface Temperature Pilot Project (GHRSST-PP). These sources produce a total of hundreds of data files daily, each file classified as one of more than ten data products representing global sea-surface temperatures. The MMR is a relational database wherein the metadata are divided into granule-level records [denoted file records (FRs)] for individual satellite files and collection-level records [denoted data set descriptions (DSDs)] that describe metadata common to all the files from a specific data product. FRs and DSDs adhere to the NASA Directory Interchange Format (DIF). The FRs and DSDs are contained in separate subdatabases linked by a common field. The MMR is configured in MySQL database software with custom Practical Extraction and Reporting Language (PERL) programs to validate and ingest the metadata records. The database contents are converted into the Federal Geographic Data Committee (FGDC) standard format by use of the Extensible Markup Language (XML). A Web interface enables users to search for availability of data from all sources.
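A minimal sketch of the two-level layout described above, using SQLite in place of MySQL: collection-level DSD records and granule-level FR records linked by a common field. Table and column names are illustrative, not the actual MMR schema.

```python
# Two-level metadata layout: collection-level (DSD) and granule-level (FR)
# records linked by a common field. SQLite stands in for MySQL; all names
# and values are illustrative.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dsd (entry_id TEXT PRIMARY KEY,              -- collection level
                  product_name TEXT);
CREATE TABLE fr  (file_name TEXT PRIMARY KEY,             -- granule level
                  entry_id  TEXT REFERENCES dsd(entry_id),
                  start_time TEXT);
""")
db.execute("INSERT INTO dsd VALUES ('SST-PROD-1', 'Global SST product')")
db.execute("INSERT INTO fr VALUES "
           "('sst_20070101.nc', 'SST-PROD-1', '2007-01-01T00:00Z')")

# A user search for available files joins granule and collection records.
for row in db.execute("""SELECT fr.file_name, dsd.product_name
                         FROM fr JOIN dsd USING (entry_id)"""):
    print(row)
```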
ProtaBank: A repository for protein design and engineering data.
Wang, Connie Y; Chang, Paul M; Ary, Marie L; Allen, Benjamin D; Chica, Roberto A; Mayo, Stephen L; Olafson, Barry D
2018-03-25
We present ProtaBank, a repository for storing, querying, analyzing, and sharing protein design and engineering data in an actively maintained and updated database. ProtaBank provides a format to describe and compare all types of protein mutational data, spanning a wide range of properties and techniques. It features a user-friendly web interface and programming layer that streamlines data deposition and allows for batch input and queries. The database schema design incorporates a standard format for reporting protein sequences and experimental data that facilitates comparison of results across different data sets. A suite of analysis and visualization tools is provided to facilitate discovery, to guide future designs, and to benchmark and train new predictive tools and algorithms. ProtaBank will provide a valuable resource to the protein engineering community by storing and safeguarding newly generated data, allowing for fast searching and identification of relevant data from the existing literature, and exploring correlations between disparate data sets. ProtaBank invites researchers to contribute data to the database to make it accessible for search and analysis. ProtaBank is available at https://protabank.org. © 2018 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.
Ethics, Human Use, and the Department of Defense Serum Repository.
Pavlin, Julie A; Welch, Robert A
2015-10-01
The Department of Defense Serum Repository (DoDSR) contains a growing archive of sera from service members collected to perform medical surveillance, clinical diagnosis, and epidemiologic studies to identify, prevent, and control diseases associated with military service. The specimens are a mandatory collection under DoD and U.S. regulations and do not include informed consent for uses beyond force health protection. Any use of the specimens for research requires deidentification of the samples and must be approved by Institutional Review Boards. However, as expansion of the DoDSR is contemplated, ethical considerations of sample collection, storage, and use must be carefully reconsidered. Other similar programs for research use of specimens collected for public health purposes are also undergoing similar reviews. It is recommended that, at a minimum, service members be informed of the potential storage and use of their specimens and be allowed to opt out of additional use, or that a broad informed consent be provided. The DoDSR provides a tremendous resource to the DoD and global health community, and to ensure its continued existence and improvement, the DoD must stay consistent with all principles of research ethics. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
Software for Sharing and Management of Information
NASA Technical Reports Server (NTRS)
Chen, James R.; Wolfe, Shawn R.; Wragg, Stephen D.
2003-01-01
DIAMS is a set of computer programs that implements a system of collaborative agents that serve multiple, geographically distributed users communicating via the Internet. DIAMS provides a user interface as a Java applet that runs on each user's computer and that works within the context of the user's Internet-browser software. DIAMS helps all its users to manage, gain access to, share, and exchange information in databases that they maintain on their computers. One of the DIAMS agents is a personal agent that helps its owner find information most relevant to current needs. It provides software tools and utilities for users to manage their information repositories with dynamic organization and virtual views. Capabilities for generating flexible hierarchical displays are integrated with capabilities for indexed-query searching to support effective access to information. Automatic indexing methods are employed to support users' queries and communication between agents. The catalog of a repository is kept in object-oriented storage to facilitate sharing of information. Collaboration between users is aided by matchmaker agents and by automated exchange of information. The matchmaker agents are designed to establish connections between users who have similar interests and expertise.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weiner, Ruth F.; Blink, James A.; Rechard, Robert Paul
This report examines the current policy, legal, and regulatory framework pertaining to used nuclear fuel and high level waste management in the United States. The goal is to identify potential changes that, if made, could add flexibility and possibly improve the chances of successfully implementing technical aspects of a nuclear waste policy. Experience suggests that the regulatory framework should be established prior to initiating future repository development. Concerning specifics of the regulatory framework, reasonable expectation as the standard of proof was successfully implemented and could be retained in the future; yet, the current classification system for radioactive waste, including hazardous constituents, warrants reexamination. Whether or not multiple sites are considered simultaneously in the future, inclusion of mechanisms such as deliberate use of performance assessment to manage site characterization would be wise. Because of experience gained here and abroad, diversity of geologic media is not particularly necessary as a criterion in site selection guidelines for multiple sites. Stepwise development of the repository program that includes flexibility also warrants serious consideration. Furthermore, integration of the waste management system, from storage and transportation through disposition, should be examined and would be facilitated by integration of the legal and regulatory framework. Finally, in order to enhance acceptability of future repository development, the national policy should be cognizant of those policy and technical attributes that enhance initial acceptance, and those policy and technical attributes that maintain and broaden credibility.
Goodfellow, L M
2009-06-01
A worldwide repository of electronic theses and dissertations (ETDs) could provide worldwide access to the most up-to-date research generated by masters and doctoral students. Until that international repository is established, it is possible to access some of these valuable knowledge resources. ETDs provide a technologically advanced medium with endless multimedia capabilities that far exceed the print and bound copies of theses and dissertations housed traditionally in individual university libraries. CURRENT USE: A growing trend exists for universities worldwide to require graduate students to submit theses or dissertations as electronic documents. However, nurse scholars underutilize ETDs, as evidenced by perusing bibliographic citation lists in many of the research journals. ETDs can be searched for and retrieved through several digital resources such as the Networked Digital Library of Theses and Dissertations (http://www.ndltd.org), ProQuest Dissertations and Theses (http://www.umi.com), the Australasian Digital Theses Program (http://adt.caul.edu.au/) and through individual university web sites and online catalogues. An international repository of ETDs benefits the community of nurse scholars in many ways. The ability to access recent graduate students' research electronically from anywhere in the world is advantageous. For scholars residing in developing countries, access to these ETDs may prove to be even more valuable. In some cases, ETDs are not available for worldwide access and can only be accessed through the university library from which the student graduated. Public access to university library ETD collections is not always permitted. Nurse scholars from both developing and developed countries could benefit from ETDs.
Krypton-81 in groundwater of the Culebra Dolomite near the Waste Isolation Pilot Plant, New Mexico.
Sturchio, Neil C; Kuhlman, Kristopher L; Yokochi, Reika; Probst, Peter C; Jiang, Wei; Lu, Zheng-Tian; Mueller, Peter; Yang, Guo-Min
2014-05-01
The Waste Isolation Pilot Plant (WIPP) in New Mexico is the first geologic repository for disposal of transuranic nuclear waste from defense-related programs of the US Department of Energy. It is constructed within halite beds of the Permian-age Salado Formation. The Culebra Dolomite, confined within Rustler Formation evaporites overlying the Salado Formation, is a potential pathway for radionuclide transport from the repository to the accessible environment in the human-disturbed repository scenario. Although extensive subsurface characterization and numerical flow modeling of groundwater has been done in the vicinity of the WIPP, few studies have used natural isotopic tracers to validate the flow models and to better understand solute transport at this site. The advent of Atom-Trap Trace Analysis (ATTA) has enabled routine measurement of cosmogenic (81)Kr (half-life 229,000 yr), a near-ideal tracer for long-term groundwater transport. We measured (81)Kr in saline groundwater sampled from two Culebra Dolomite monitoring wells near the WIPP site, and compared (81)Kr model ages with reverse particle-tracking results of well-calibrated flow models. The (81)Kr model ages are ~130,000 and ~330,000 yr for high-transmissivity and low-transmissivity portions of the formation, respectively. Compared with flow model results which indicate a relatively young mean hydraulic age (~32,000 yr), the (81)Kr model ages imply substantial physical attenuation of conservative solutes in the Culebra Dolomite and provide limits on the effective diffusivity of contaminants into the confining aquitards. Copyright © 2014 Elsevier B.V. All rights reserved.
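For readers unfamiliar with tracer model ages, the sketch below shows the closed-system decay arithmetic behind an (81)Kr model age; the sample ratio used is invented for illustration and is not the paper's data.

```python
# Closed-system decay arithmetic behind an 81Kr model age. The sample
# ratio below is a hypothetical value chosen for illustration.
import math

T_HALF = 229_000.0     # 81Kr half-life in years, as cited above

def kr81_model_age(ratio_sample_over_modern: float) -> float:
    """Model age from the sample 81Kr/Kr ratio relative to modern air."""
    return -T_HALF / math.log(2) * math.log(ratio_sample_over_modern)

# A sample retaining ~67% of the modern ratio dates to roughly 130,000 yr:
print(f"{kr81_model_age(0.67):,.0f} yr")
```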
Software support for Huntington's disease research.
Conneally, P M; Gersting, J M; Gray, J M; Beidleman, K; Wexler, N S; Smith, C L
1991-01-01
Huntington's disease (HD) is a hereditary disorder involving the central nervous system. Its effects are devastating, to the affected person as well as his family. The Department of Medical and Molecular Genetics at Indiana University (IU) plays an integral part in Huntington's research by providing computerized repositories of HD family information for researchers and families. The National Huntington's Disease Research Roster, founded in 1979 at IU, and the Huntington's Disease in Venezuela Project database contain information that has proven to be invaluable in the worldwide field of HD research. This paper addresses the types of information stored in each database, the pedigree database program (MEGADATS) used to manage the data, and significant findings that have resulted from access to the data.
NASA Technical Reports Server (NTRS)
Krempl, Erhard; Hong, Bor Zen
1989-01-01
A macromechanics analysis is presented for the in-plane, anisotropic time-dependent behavior of metal matrix laminates. The small deformation, orthotropic viscoplasticity theory based on overstress represents lamina behavior in a modified simple laminate theory. Material functions and constants can be identified in principle from experiments with laminae. Orthotropic invariants can be repositories for tension-compression asymmetry and for linear elasticity in one direction while the other directions behave in a viscoplastic manner. Computer programs are generated and tested for either unidirectional or symmetric laminates under in-plane loading. Correlations with the experimental results on metal matrix composites are presented.
2010-04-01
Children with chronic medical conditions rely on complex management plans for problems that cause them to be at increased risk for suboptimal outcomes in emergency situations. The emergency information form (EIF) is a medical summary that describes medical condition(s), medications, and special health care needs to inform health care providers of a child's special health conditions and needs so that optimal emergency medical care can be provided. This statement describes updates to EIFs, including computerization of the EIF, expanding the potential benefits of the EIF, quality-improvement programs using the EIF, the EIF as a central repository, and facilitating emergency preparedness in disaster management and drills by using the EIF.
1993-07-27
The Food and Drug Administration (FDA) is announcing that it is establishing a public docket for policy speeches, policy statements, and standard operating procedure guides pertaining to product evaluation and regulatory enforcement for its medical device and radiological health programs. The docket will operate on a 1-year trial basis and will serve both as a repository for critical policy documents generated by the Center for Devices and Radiological Health (CDRH) and as a public display mechanism for access by representatives of the industry and other interested persons. This action is one element of an overall communications initiative to ensure uniform and timely access to important information.
Corradi, Luca; Porro, Ivan; Schenone, Andrea; Momeni, Parastoo; Ferrari, Raffaele; Nobili, Flavio; Ferrara, Michela; Arnulfo, Gabriele; Fato, Marco M
2012-10-08
Robust, extensible and distributed databases integrating clinical, imaging and molecular data represent a substantial challenge for modern neuroscience. It is even more difficult to provide extensible software environments able to effectively target the rapidly changing data requirements and structures of research experiments. There is an increasing demand from the neuroscience community for software tools addressing technical challenges about: (i) supporting researchers in the medical field to carry out data analysis using integrated bioinformatics services and tools; (ii) handling multimodal/multiscale data and metadata, enabling the injection of several different data types according to structured schemas; (iii) providing high extensibility, in order to address different requirements deriving from a large variety of applications simply through a user runtime configuration. A dynamically extensible data structure supporting collaborative multidisciplinary research projects in neuroscience has been defined and implemented. We have considered extensibility issues from two different points of view. First, the improvement of data flexibility has been taken into account. This has been done through the development of a methodology for the dynamic creation and use of data types and related metadata, based on the definition of a "meta" data model. This way, users are not constrained to a set of predefined data, and the model can be easily extended and applied to different contexts. Second, users have been enabled to easily customize and extend the experimental procedures in order to track each step of acquisition or analysis. This has been achieved through a process-event data structure, a multipurpose taxonomic schema composed of two generic main objects: events and processes. Then, a repository has been built based on such data model and structure, and deployed on distributed resources thanks to a Grid-based approach. Finally, data integration aspects have been addressed by providing the repository application with an efficient dynamic interface designed to enable the user to both easily query the data depending on defined datatypes and view all the data of every patient in an integrated and simple way. The results of our work have been twofold. First, a dynamically extensible data model has been implemented and tested based on a "meta" data-model enabling users to define their own data types independently from the application context. This data model has allowed users to dynamically include additional data types without the need to rebuild the underlying database. Then a complex process-event data structure has been built, based on this data model, describing patient-centered diagnostic processes and merging information from data and metadata. Second, a repository implementing such a data structure has been deployed on a distributed Data Grid in order to provide scalability both in terms of data input and data storage and to exploit distributed data and computational approaches in order to share resources more efficiently. Moreover, data managing has been made possible through a friendly web interface. The driving principle of not being forced to use preconfigured data types has been satisfied. It is up to users to dynamically configure the data model for the given experiment or data acquisition program, thus making it potentially suitable for customized applications.
2012-01-01
Background Robust, extensible and distributed databases integrating clinical, imaging and molecular data represent a substantial challenge for modern neuroscience. It is even more difficult to provide extensible software environments able to effectively target the rapidly changing data requirements and structures of research experiments. There is an increasing demand from the neuroscience community for software tools addressing technical challenges about: (i) supporting researchers in the medical field to carry out data analysis using integrated bioinformatics services and tools; (ii) handling multimodal/multiscale data and metadata, enabling the injection of several different data types according to structured schemas; (iii) providing high extensibility, in order to address different requirements deriving from a large variety of applications simply through a user runtime configuration. Methods A dynamically extensible data structure supporting collaborative multidisciplinary research projects in neuroscience has been defined and implemented. We have considered extensibility issues from two different points of view. First, the improvement of data flexibility has been taken into account. This has been done through the development of a methodology for the dynamic creation and use of data types and related metadata, based on the definition of a “meta” data model. This way, users are not constrained to a set of predefined data, and the model can be easily extended and applied to different contexts. Second, users have been enabled to easily customize and extend the experimental procedures in order to track each step of acquisition or analysis. This has been achieved through a process-event data structure, a multipurpose taxonomic schema composed of two generic main objects: events and processes. Then, a repository has been built based on such data model and structure, and deployed on distributed resources thanks to a Grid-based approach. Finally, data integration aspects have been addressed by providing the repository application with an efficient dynamic interface designed to enable the user to both easily query the data depending on defined datatypes and view all the data of every patient in an integrated and simple way. Results The results of our work have been twofold. First, a dynamically extensible data model has been implemented and tested based on a “meta” data-model enabling users to define their own data types independently from the application context. This data model has allowed users to dynamically include additional data types without the need to rebuild the underlying database. Then a complex process-event data structure has been built, based on this data model, describing patient-centered diagnostic processes and merging information from data and metadata. Second, a repository implementing such a data structure has been deployed on a distributed Data Grid in order to provide scalability both in terms of data input and data storage and to exploit distributed data and computational approaches in order to share resources more efficiently. Moreover, data managing has been made possible through a friendly web interface. The driving principle of not being forced to use preconfigured data types has been satisfied. It is up to users to dynamically configure the data model for the given experiment or data acquisition program, thus making it potentially suitable for customized applications.
Conclusions Based on such a repository, data managing has been made possible through a friendly web interface. The driving principle of not being forced to use preconfigured data types has been satisfied. It is up to users to dynamically configure the data model for the given experiment or data acquisition program, thus making it potentially suitable for customized applications. PMID:23043673
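A minimal sketch of the "meta" data-model and process-event ideas described above, under invented names: data types are declared at runtime as metadata rather than as fixed tables, and each acquisition or analysis step is recorded as an event within a process.

```python
# Sketch only: runtime-defined data types ("meta" data model) plus a
# process-event structure. All class and field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class DataType:                      # defined by users at runtime
    name: str
    fields: dict                     # field name -> expected Python type

@dataclass
class Event:
    datatype: DataType
    values: dict

@dataclass
class Process:                       # e.g. one diagnostic workflow
    name: str
    events: list = field(default_factory=list)

    def record(self, datatype: DataType, **values):
        missing = set(datatype.fields) - set(values)
        if missing:
            raise ValueError(f"missing fields: {missing}")
        self.events.append(Event(datatype, values))

# A new data type is added without rebuilding any underlying schema:
eeg = DataType("EEG", {"channels": int, "sampling_hz": float})
visit = Process("baseline-visit")
visit.record(eeg, channels=64, sampling_hz=512.0)
print(len(visit.events))             # -> 1
```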
Yucca Mountain: How Do Global and Federal Initiatives Impact Clark County's Nuclear Waste Program?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Navis, I.; McGehee, B.
2008-07-01
Since 1987, Clark County has been designated by the U.S. Department of Energy (DOE) as an 'Affected Unit of Local Government' (AULG). The AULG designation is an acknowledgement by the federal government that activities associated with the Yucca Mountain proposal could result in considerable impacts on Clark County residents and the community as a whole. As an AULG, Clark County is authorized to identify 'any potential economic, social, public health and safety, and environmental impacts of a repository', 42 U.S.C. Section 10135(c)(1)(B)(i) under provisions of the Nuclear Waste Policy Act Amendments (NWPAA). Clark County's oversight program contains key elements of (1) technical and scientific analysis (2) transportation analysis (3) impact assessment and monitoring (4) policy and legislative analysis and monitoring, and (5) public outreach. Clark County has conducted numerous studies of potential impacts, many of which are summarized in Clark County's Impact Assessment Report that was submitted to DOE and the President of the United States in February 2002. Given the unprecedented magnitude and duration of DOE's proposal, as well as the many unanswered questions about the transportation routes, number of shipments, and the modal mix that will ultimately be used, impacts to public health and safety and security, as well as socioeconomic impacts, can only be estimated. In order to refine these estimates, Clark County Comprehensive Planning Department's Nuclear Waste Division updates, assesses, and monitors impacts on a regular basis. Clark County's Impact Assessment program covers not only unincorporated Clark County but all five jurisdictions of Las Vegas, North Las Vegas, Henderson, Mesquite, and Boulder City as well as tribal jurisdictions that fall within Clark County's geographic boundary. National and global focus on nuclear power and nuclear waste could have significant impact on the Yucca Mountain Program, and therefore, Clark County's oversight of that program. (authors)
Kleinman, Steven; King, Melissa R; Busch, Michael P; Murphy, Edward L; Glynn, Simone A
2012-10-01
The Retrovirus Epidemiology Donor Study (REDS), conducted from 1989 to 2001, and the REDS-II, conducted from 2004 to 2012, were National Heart, Lung, and Blood Institute-funded, multicenter programs focused on improving blood safety and availability in the United States. The REDS-II also included international study sites in Brazil and China. The 3 major research domains of REDS/REDS-II have been infectious disease risk evaluation, blood donation availability, and blood donor characterization. Both programs have made significant contributions to transfusion medicine research methodology by the use of mathematical modeling, large-scale donor surveys, innovative methods of repository sample storage, and establishing an infrastructure that responded to potential emerging blood safety threats such as xenotropic murine leukemia virus-related virus. Blood safety studies have included protocols evaluating epidemiologic and/or laboratory aspects of human immunodeficiency virus, human T-lymphotropic virus 1/2, hepatitis C virus, hepatitis B virus, West Nile virus, cytomegalovirus, human herpesvirus 8, parvovirus B19, malaria, Creutzfeldt-Jakob disease, influenza, and Trypanosoma cruzi infections. Other analyses have characterized blood donor demographics, motivations to donate, factors influencing donor return, behavioral risk factors, donors' perception of the blood donation screening process, and aspects of donor deferral. In REDS-II, 2 large-scale blood donor protocols examined iron deficiency in donors and the prevalence of leukocyte antibodies. This review describes the major study results from over 150 peer-reviewed articles published by these 2 REDS programs. In 2011, a new 7-year program, the Recipient Epidemiology and Donor Evaluation Study-III, was launched. The Recipient Epidemiology and Donor Evaluation Study-III expands beyond donor-based research to include studies of blood transfusion recipients in the hospital setting and adds a third country, South Africa, to the international program. Copyright © 2012 Elsevier Inc. All rights reserved.
Klann, Jeffrey G; McCoy, Allison B; Wright, Adam; Wattanasin, Nich; Sittig, Dean F; Murphy, Shawn N
2013-05-30
The Strategic Health IT Advanced Research Projects (SHARP) program seeks to conquer well-understood challenges in medical informatics through breakthrough research. Two SHARP centers have found alignment in their methodological needs: (1) members of the National Center for Cognitive Informatics and Decision-making (NCCD) have developed knowledge bases to support problem-oriented summarizations of patient data, and (2) Substitutable Medical Apps, Reusable Technologies (SMART), which is a platform for reusable medical apps that can run on participating platforms connected to various electronic health records (EHR). Combining the work of these two centers will ensure wide dissemination of new methods for synthesized views of patient data. Informatics for Integrating Biology and the Bedside (i2b2) is an NIH-funded clinical research data repository platform in use at over 100 sites worldwide. By also working with a co-occurring initiative to SMART-enabling i2b2, we can confidently write one app that can be used extremely broadly. Our goal was to facilitate development of intuitive, problem-oriented views of the patient record using NCCD knowledge bases that would run in any EHR. To do this, we developed a collaboration between the two SHARPs and an NIH center, i2b2. First, we implemented collaborative tools to connect researchers at three institutions. Next, we developed a patient summarization app using the SMART platform and a previously validated NCCD problem-medication linkage knowledge base derived from the National Drug File-Reference Terminology (NDF-RT). Finally, to SMART-enable i2b2, we implemented two new Web service "cells" that expose the SMART application programming interface (API), and we made changes to the Web interface of i2b2 to host a "carousel" of SMART apps. We deployed our SMART-based, NDF-RT-derived patient summarization app in this SMART-i2b2 container. It displays a problem-oriented view of medications and presents a line-graph display of laboratory results. This summarization app can be run in any EHR environment that either supports SMART or runs SMART-enabled i2b2. This i2b2 "clinical bridge" demonstrates a pathway for reusable app development that does not require EHR vendors to immediately adopt the SMART API. Apps can be developed in SMART and run by clinicians in the i2b2 repository, reusing clinical data extracted from EHRs. This may encourage the adoption of SMART by supporting SMART app development until EHRs adopt the platform. It also allows a new variety of clinical SMART apps, fueled by the broad aggregation of data types available in research repositories. The app (including its knowledge base) and SMART-i2b2 are open-source and freely available for download.
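The sketch below illustrates the kind of problem-oriented medication view described above, using a toy problem-medication linkage table as a stand-in for the NDF-RT-derived knowledge base; all mappings, drug names, and problems are invented.

```python
# Toy problem-oriented summarization: group a patient's medications under
# problems via a linkage table. The table is a stand-in for an
# NDF-RT-derived knowledge base; every entry here is hypothetical.
from collections import defaultdict

problem_med_links = {                # medication -> linked problems
    "lisinopril": ["hypertension"],
    "metformin": ["type 2 diabetes"],
    "insulin glargine": ["type 2 diabetes"],
}

patient_meds = ["lisinopril", "metformin", "insulin glargine", "ibuprofen"]

view = defaultdict(list)
for med in patient_meds:
    for problem in problem_med_links.get(med, ["(unlinked)"]):
        view[problem].append(med)

for problem, meds in view.items():   # the problem-oriented summary
    print(f"{problem}: {', '.join(meds)}")
```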
Community participation in superfund practice and policy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gossett, L.B.
1995-12-01
Superfund has several statutory and regulatory provisions that provide vehicles for community involvement at Superfund sites including community relations plans, information repositories, public comment periods, and technical assistance grants to community organizations. There has been considerable debate about the effectiveness of these programs. The community participation requirements of the Superfund process are in a state of transition. The proposed Superfund Reform Act of 1994 contained additional community participation provisions. EPA appears to be incorporating some of these proposed changes and improvements learned from prior experiences into its current community relations practices. This study examines the status of community relations in Superfund and the effectiveness of the community information and public participation programs in meeting legislative objectives. In addition to addressing current requirements and practices, the study looks at proposals to amend the community participation provisions as well as alternative approaches used by the EPA, potentially responsible parties, and citizens to address or resolve community concerns. While the focus will be on the overall program, a few brief selected case studies, representing a diversity of experiences, will be included. The resulting paper will discuss successes and shortcomings of community involvement in Superfund. It will address the sometimes competing goals of the various players in the Superfund process, bringing in not only the community perspective, but also concerns for decreased complexity and cost and increased efficiency. The conclusion will evaluate alternatives to improve procedures for community involvement in the Superfund program. Superfund reform, public and stakeholder involvement, and dispute resolution are addressed in this study. These are prominent, contemporary issues as the nation seeks to constructively solve its environmental problems.
Englebright, Jane; Westcott, Ruth; McManus, Kathryn; Kleja, Kacie; Helm, Colleen; Korwek, Kimberly M; Perlin, Jonathan B
2018-03-01
The prevention of hospital-acquired pressure ulcers (PrUs) has significant consequences for patient outcomes and the cost of care. Providers are challenged with evaluating available evidence and best practices, then implementing programs and motivating change in various facility environments. In a large system of community hospitals, the Reducing Hospital Acquired-PrUs Program was developed to provide a toolkit of best practices, timely and appropriate data for focusing efforts, and continuous implementation support. Baseline data on PrU rates helped focus efforts on the most vulnerable patients and care situations. Facilities were empowered to use and adapt available resources to meet local needs and to share best practices for implementation across the system. Outcomes were measured by the rate of hospital-acquired PrUs, as gathered from patient discharge records. The rate of hospital-acquired stage III and IV PrUs decreased 66.3% between 2011 and 2013. Of the 149 participating facilities, 40 (27%) had zero hospital-acquired stage III and IV PrUs and 77 (52%) had a reduction in their PrU rate. Rates of all PrUs documented as present on admission did not change during this period. A comparison of different strategies used by the most successful facilities illustrated the necessity of facility-level flexibility and recognition of local workflows and patient demographics. Driven by the combination of a repository of evidence-based tools and best practices, readily available data on PrU rates, and local flexibility with processes, the Reducing Hospital Acquired-PrUs Program represents the successful operationalization of improvement in a wide variety of facilities.
A Guide to Axial-Flow Turbine Off-Design Computer Program AXOD2
NASA Technical Reports Server (NTRS)
Chen, Shu-Cheng S.
2014-01-01
A User's Guide for the axial-flow turbine off-design computer program AXOD2 is presented in this paper. This User's Guide is supplementary to the original User's Manual of AXOD. Three notable contributions of AXOD2 to its predecessor AXOD, both in the content of the Guide and in the functionality of the code, are described and discussed at length. These are: 1) a rational representation of the mathematical principles applied, with concise descriptions of the formulas implemented in the actual coding and a discussion of their physical implications; 2) the creation and documentation of an Addendum Listing of input namelist-parameters unique to AXOD2, which differ from or are in addition to the original input namelists given in the Manual of AXOD, together with a discussion of their usage; and 3) the institution of proper stoppages of the code execution, adding termination messages and execution error messages to AXOD2. These measures safeguard the integrity of the code execution, so that a failure mode encountered during a case study does not plunge the code execution into an indefinite loop or cause a blow-out of the program execution. Details on these are discussed and illustrated in this paper. Moreover, this computer program has since been reconstructed substantially. Standard FORTRAN language was instituted, and the code was formatted in double precision (REAL*8). As a result, the code is now suited for use in a local desktop computer environment, is perfectly portable to any operating system, and can be executed by any compiler equivalent to a FORTRAN 90/95 compiler. AXOD2 will be available through the NASA Glenn Research Center (GRC) Software Repository.
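As an illustration of the execution safeguards described above (written in Python rather than AXOD2's FORTRAN), an iteration cap turns a potential indefinite loop into a clean termination with a diagnostic message. The update rule and tolerance below are placeholders, not AXOD2's actual solver.

```python
# Sketch of a graceful-termination guard around an iterative solve:
# cap the iterations and raise a diagnostic instead of looping forever.
MAX_ITER = 200

def converge(update, x0, tol=1e-8):
    x = x0
    for _ in range(MAX_ITER):
        x_new = update(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError(f"no convergence after {MAX_ITER} iterations; "
                       "check case inputs")   # clean stop, not a hang

# Example: Heron's update converging to sqrt(2)
print(converge(lambda x: 0.5 * (x + 2.0 / x), 1.0))
```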
A new version of a computer program for dynamical calculations of RHEED intensity oscillations
NASA Astrophysics Data System (ADS)
Daniluk, Andrzej; Skrobas, Kazimierz
2006-01-01
We present a new version of the RHEED program which contains a graphical user interface enabling the use of the program in the graphical environment. The presented program also contains a graphical component which enables displaying program data at run-time through an easy-to-use graphical interface.
New version program summary:
Title of program: RHEEDGr
Catalogue identifier: ADWV
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWV
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Catalogue identifier of previous version: ADUY
Authors of the original program: A. Daniluk
Does the new version supersede the original program: no
Computer for which the new version is designed and others on which it has been tested: Pentium-based PC
Operating systems or monitors under which the new version has been tested: Windows 9x, XP, NT
Programming language used: Borland C++ Builder
Memory required to execute with typical data: more than 1 MB
Number of bits in a word: 64
Number of processors used: 1
Number of lines in distributed program, including test data, etc.: 5797
Number of bytes in distributed program, including test data, etc.: 588,121
Distribution format: tar.gz
Nature of physical problem: Reflection high-energy electron diffraction (RHEED) is a very useful technique for studying growth and surface analysis of thin epitaxial structures prepared by molecular beam epitaxy (MBE). The RHEED technique can reveal, almost instantaneously, changes either in the coverage of the sample surface by adsorbates or in the surface structure of a thin film.
Method of solution: RHEED intensities are calculated within the framework of the general matrix formulation of Peng and Whelan [1] under the one-beam condition.
Reasons for the new version: Responding to user feedback, we designed a graphical package that enables displaying program data at run-time through an easy-to-use graphical interface.
Summary of revisions: In its present form the code is an object-oriented extension of the previous version [2]. Fig. 1 shows the static structure of classes and their possible relationships (i.e., inheritance, association, aggregation and dependency) in the code. The code has been modified and optimized to compile under the C++ Builder integrated development environment (IDE). A graphical user interface (GUI) for the program has been created. The application is a standard multiple document interface (MDI) project from Builder's object repository; the MDI application spawns a child window that resides within the client window, and the main form contains the child object. We have added an original graphical component [3] which has been tested successfully in the C++ Builder programming environment under the Microsoft Windows platform. Fig. 2 shows the internal structure of the component. This diagram is a graphic presentation of the static view, which shows a collection of declarative model elements, such as classes, types, and their relationships. Each of the model elements shown in Fig. 2 is manifested by one header file, Graph2D.h, and one code file, Graph2D.cpp. Fig. 3 sets the stage by showing the package which supplies the C++ Builder elements used in the component. Installation instructions for the TGraph2D.bpk package can be found in the new distribution. The program has been constructed according to the systems development life cycle (SDLC) methodology [4].
Typical running time: The typical running time is machine and user-parameter dependent.
Unusual features of the program: The program is distributed in the form of a main project, RHEEDGr.bpr, with associated files, and should be compiled using Borland C++ Builder compilers version 5 or later.
Site Selection for the Disposal of LLW in Taiwan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chuang, W.S.; Chi, L.M.; Tien, N.C.
2006-07-01
This paper presents the implementation status of the low-level radioactive waste (LLW) disposal program in Taiwan, including the disposal facility regulations, status of waste management, final disposal program, licensing procedures, waste acceptance criteria, site selection criteria and processes, and preliminary disposal concepts. The first phase of site selection for low-level radioactive waste final disposal in Taiwan was implemented between 1992 and 2002. The site selection process adopted a Geographic Information System (GIS), Hierarchical Analysis System, Expert Evaluation System, and site reconnaissance. An incentive program for voluntary sites was also initiated. After a series of evaluations and discussion of 30 potential candidate sites, including 8 recommended sites, 5 qualified voluntary townships, and several remote uninhabited small islets, Hsiao-chiou islet was selected as the first priority candidate site in February 1998. The geological investigation work in Hsiao-chiou was conducted from March 1999 through October 2000. An Environmental Impact Statement Report (EIS) and the Investment Feasibility Study Report (IFS) were submitted to the Environmental Protection Agency (EPA) in November 2000 and to the Ministry of Economic Affairs (MOEA) in June 2001, respectively. Unfortunately, the site investigation was discontinued in 2002 due to political and public acceptance considerations. After years of planning, the second phase of the site selection process was launched in August 2004 and will be conducted through 2008. It is planned that a repository will be constructed in early 2009 and start to operate in 2014. The site selection process for the second phase is based on the earlier work, and four potential candidate sites were selected for evaluation until 2005. A near-surface disposal concept is proposed for a site located in the Taiwan Strait, and cavern disposal concepts are proposed for three other sites located on the main island. 'NIMBY' (not in my backyard) is a critical problem for implementation of the final disposal project. Resistance from local communities has been continuously received during site characterization. To overcome this, an incentive program to encourage community acceptance has been approved by the Government. Programs for community promotion are being proposed and negotiations are also underway. (authors)
NASA Astrophysics Data System (ADS)
Klump, J. F.; Ulbricht, D.; Conze, R.
2014-12-01
The Continental Deep Drilling Programme (KTB) was a scientific drilling project from 1987 to 1995 near Windischeschenbach, Bavaria. The main super-deep borehole reached a depth of 9,101 meters into the Earth's continental crust. The project used the most current equipment for data capture and processing. After the end of the project, key data were disseminated through the web portal of the International Continental Scientific Drilling Program (ICDP), and the scientific reports were published as printed volumes. As similar projects have also found, it becomes increasingly difficult to maintain a data portal over a long time: changes in software and underlying hardware make a migration of the entire system inevitable. Around 2009 the data presented on the ICDP web portal were migrated to the Scientific Drilling Database (SDDB) and published through DataCite using Digital Object Identifiers (DOIs) as persistent identifiers. The SDDB portal used a relational database with a complex data model to store data and metadata. A PHP-based content management system with custom modifications made it possible to navigate and browse datasets using the metadata and then download them. The data repository software eSciDoc allows self-contained packages, consistent with the OAIS reference model, to be stored. Each package consists of binary data files and XML metadata. Using a REST API, the packages can be stored in the eSciDoc repository and searched using the XML metadata. During the last maintenance cycle of the SDDB, the data and metadata were migrated into the eSciDoc repository. Discovery metadata were generated following the GCMD-DIF, ISO 19115 and DataCite schemas. The eSciDoc repository can store an arbitrary number of XML metadata records with each data object. In addition to descriptive metadata, each data object may contain pointers to related materials, such as IGSN metadata to link datasets to physical specimens, or identifiers of literature interpreting the data. Datasets are presented by XSLT-stylesheet transformation using the stored metadata. The presentation shows several migration cycles of data and metadata, which were driven by aging software systems. Currently the datasets reside as self-contained entities in a repository system that is ready for digital preservation.
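The ingestion workflow described above (a self-contained package of binary data plus XML metadata pushed over a REST API) lends itself to a short illustration. The Python sketch below is hypothetical: the base URL, resource paths, and response shape are invented for illustration and are not the actual eSciDoc interface.

    import requests

    # Hypothetical sketch of depositing a self-contained OAIS-style package
    # (binary data file plus XML metadata record) into an eSciDoc-like
    # repository via REST. All endpoints and the response layout are
    # illustrative assumptions, not the real eSciDoc API.
    BASE = "https://repository.example.org/escidoc"

    def deposit_package(item_xml_path, data_file_path, session):
        # Create the item from its XML metadata record
        with open(item_xml_path, "rb") as f:
            resp = session.put(f"{BASE}/ir/item", data=f.read(),
                               headers={"Content-Type": "text/xml"})
        resp.raise_for_status()
        item_id = resp.json()["id"]  # assumed response shape

        # Attach the binary content as a component of the item
        with open(data_file_path, "rb") as f:
            resp = session.post(f"{BASE}/ir/item/{item_id}/components",
                                data=f.read(),
                                headers={"Content-Type": "application/octet-stream"})
        resp.raise_for_status()
        return item_id

The design point is that the package, not the portal, is the unit of preservation: because each item carries its own metadata, the repository contents survive the next portal migration unchanged.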
A Python library for FAIRer access and deposition to the Metabolomics Workbench Data Repository.
Smelter, Andrey; Moseley, Hunter N B
2018-01-01
The Metabolomics Workbench Data Repository is a public repository of mass spectrometry and nuclear magnetic resonance data and metadata derived from a wide variety of metabolomics studies. The data and metadata for each study are deposited, stored, and accessed via files in the domain-specific 'mwTab' flat file format. In order to improve the accessibility, reusability, and interoperability of the data and metadata stored in 'mwTab' formatted files, we implemented a Python library and package. This Python package, named 'mwtab', is a parser for the domain-specific 'mwTab' flat file format, which provides facilities for reading, accessing, and writing 'mwTab' formatted files. Furthermore, the package provides facilities to validate both the format and required metadata elements of a given 'mwTab' formatted file. In order to develop the 'mwtab' package we used the official 'mwTab' format specification. We used Git version control along with the Python unit-testing framework, as well as a continuous integration service to run those tests on multiple versions of Python. Package documentation was developed using the Sphinx documentation generator. The 'mwtab' package provides both Python programmatic library interfaces and command-line interfaces for reading, writing, and validating 'mwTab' formatted files. Data and associated metadata are stored within Python dictionary- and list-based data structures, enabling straightforward, 'pythonic' access and manipulation of data and metadata. Also, the package provides facilities to convert 'mwTab' files into a JSON formatted equivalent, enabling easy reusability of the data by all modern programming languages that implement JSON parsers. The 'mwtab' package implements its metadata validation functionality based on a pre-defined JSON schema that can be easily specialized for specific types of metabolomics studies. The library also provides a command-line interface for interconversion between 'mwTab' and JSONized formats in raw text and a variety of compressed binary file formats. The 'mwtab' package is an easy-to-use Python package that provides FAIRer utilization of the Metabolomics Workbench Data Repository. The source code is freely available on GitHub and via the Python Package Index. Documentation includes a 'User Guide', 'Tutorial', and 'API Reference'. The GitHub repository also provides 'mwtab' package unit-tests via a continuous integration service.
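Since the abstract describes a programmatic interface, a short usage sketch may help. It assumes the 'mwtab' package's documented read_files() generator and the dictionary-like behavior of parsed files; the input file name and the section/key names used below are illustrative, not guaranteed parts of any particular study file.

    import json
    import mwtab  # the 'mwtab' package from PyPI

    # Minimal sketch: read a mwTab formatted study file, access a metadata
    # field, and write a JSON equivalent. The file name is illustrative.
    for mwfile in mwtab.read_files("ST000001.txt"):
        # A parsed MWTabFile behaves like a nested dict keyed by mwTab
        # section names; the key layout shown here is an assumption.
        print(mwfile["METABOLOMICS WORKBENCH"]["STUDY_ID"])

        # Convert the parsed file into its JSON-formatted equivalent
        with open("ST000001.json", "w") as outfile:
            json.dump(mwfile, outfile, indent=4)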
Thakur, P; Lemons, B G; White, C R
2016-09-15
After almost fifteen years of successful waste disposal operations, the first unambiguous airborne radiation release from the Waste Isolation Pilot Plant (WIPP) was detected beyond the site boundary on February 14, 2014, the first accident of its kind in the facility's operating history. The accident released moderate levels of radioactivity into the underground air. A small but measurable amount of radioactivity also escaped to the surface through the ventilation system and was detected above ground. The dominant radionuclides released were americium and plutonium, in a ratio consistent with the known content of a breached drum. The radiation release was caused by a runaway chemical reaction inside a transuranic (TRU) waste drum, which experienced a seal and lid failure, spewing radioactive materials into the repository. According to source-term estimation, approximately 2 to 10 Ci of radioactivity was released from the breached drum into the underground, and an undetermined fraction of that source term became airborne, setting off an alarm and triggering the closure of seals designed to force exhaust air through a system of filters, including high-efficiency particulate air (HEPA) filters. Air monitoring across the WIPP site intensified following the first reports of radiation detection underground to determine the extent of impact, if any, to WIPP personnel, the public, and the environment. This article compiles and interprets analytical data collected by an independent monitoring program conducted by the Carlsbad Environmental Monitoring & Research Center (CEMRC) and by a compliance-monitoring program conducted by the WIPP's management and operating contractor, Nuclear Waste Partnership (NWP) LLC, in response to the accident. Both the independent and the WIPP monitoring efforts concluded that the levels detected were very low and localized, and no radiation-related health effects among local workers or the public would be expected. Published by Elsevier B.V.
Benefits of International Collaboration on the International Space Station
NASA Technical Reports Server (NTRS)
Hasbrook, Pete; Robinson, Julie A.; Cohen, Luchino; Marcil, Isabelle; De Parolis, Lina; Hatton, Jason; Shirakawa, Masaki; Karabadzhak, Georgy; Sorokin, Igor V.; Valentini, Giovanni
2017-01-01
The International Space Station is a valuable platform for research in space, but the benefits are limited if research is only conducted by individual countries. Through the efforts of the ISS Program Science Forum, international science working groups, and interagency cooperation, international collaboration on the ISS has expanded as ISS utilization has matured. Members of science teams benefit from working with counterparts in other countries. Scientists and institutions bring years of experience and specialized expertise to collaborative investigations, leading to new perspectives and approaches to scientific challenges. Combining new ideas and historical results brings synergy and improved peer-reviewed scientific methods and results. World-class research facilities can be expensive and logistically complicated, jeopardizing their full utilization. Experiments that would be prohibitively expensive for a single country can be achieved through contributions of resources from two or more countries, such as crew time, up- and down mass, and experiment hardware. Cooperation also avoids duplication of experiments and hardware among agencies. Biomedical experiments can be completed earlier if astronauts or cosmonauts from multiple agencies participate. Countries responding to natural disasters benefit from ISS imagery assets, even if the country has no space agency of its own. Students around the world participate in ISS educational opportunities, and work with students in other countries, through open curriculum packages and through international competitions. Even experiments conducted by a single country can benefit scientists around the world, through specimen sharing programs and publicly accessible "open data" repositories. For ISS data, these repositories include GeneLab, the Physical Science Informatics System, and different Earth data systems. Scientists can conduct new research using ISS data without having to launch and execute their own experiments. Multilateral collections of research results publications, maintained by the ISS international partnership and accessible via nasa.gov, make ISS results available worldwide, and encourage new users, ideas and research.
NASA Astrophysics Data System (ADS)
Thomas, M. A.
2016-12-01
The Waste Isolation Pilot Plant (WIPP) is the only deep geological repository for transuranic waste in the United States. As the Science Advisor for the WIPP, Sandia National Laboratories annually evaluates site data against trigger values (TVs), metrics whose violation is indicative of conditions that may impact long-term repository performance. This study focuses on a groundwater-quality dataset used to redesign a TV for the Culebra Dolomite Member (Culebra) of the Permian-age Rustler Formation. Prior to this study, a TV violation occurred if the concentration of a major ion fell outside a range defined as the mean +/- two standard deviations. The ranges were thought to denote conditions within which 95% of future values would fall. Groundwater-quality data used in evaluating compliance, however, are rarely normally distributed. To create a more robust Culebra groundwater-quality TV, this study employed the randomization test, a non-parametric permutation method. Recent groundwater compositions that were considered TV violations under the original ion concentration ranges are now interpreted as false positives in light of the insignificant p-values calculated with the randomization test. This work highlights that the normality assumption can weaken as the size of a groundwater-quality dataset grows over time. Non-parametric permutation methods are an attractive option because no assumption about the statistical distribution is required, and calculating all combinations of the data is an increasingly tractable problem on modern workstations. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy. SAND2016-7306A
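The randomization test at the heart of the redesigned TV can be sketched in a few lines. The following is a generic two-sample permutation test on the difference in means with made-up concentrations; it is not the study's actual statistic, weighting, or data.

    import numpy as np

    def randomization_test(baseline, recent, n_perm=10_000, seed=42):
        """Two-sample permutation test on the difference in means.

        No distributional (e.g., normality) assumption is made: under the
        null hypothesis the group labels are exchangeable, so we repeatedly
        shuffle them and recompute the test statistic.
        """
        rng = np.random.default_rng(seed)
        pooled = np.concatenate([baseline, recent])
        observed = abs(np.mean(recent) - np.mean(baseline))
        n_recent = len(recent)
        count = 0
        for _ in range(n_perm):
            rng.shuffle(pooled)
            stat = abs(np.mean(pooled[:n_recent]) - np.mean(pooled[n_recent:]))
            if stat >= observed:
                count += 1
        return count / n_perm  # permutation p-value

    # Illustrative ion concentrations only (mg/L); not WIPP data.
    baseline = np.array([210.0, 195.0, 201.0, 220.0, 207.0, 199.0, 215.0])
    recent = np.array([223.0, 230.0])
    print(randomization_test(baseline, recent))

A large p-value here means the "recent" values are unremarkable relative to the baseline, which is exactly how apparent violations of the old mean-plus-two-sigma ranges can be reinterpreted as false positives.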
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thrower, A.; Best, R.; Finewood, L.
2008-07-01
The Department of Energy's (DOE's) Office of Civilian Radioactive Waste Management (OCRWM) is responsible for developing and implementing a safe, secure and efficient transportation system to ship spent nuclear fuel (SNF) and high-level radioactive waste (HLW) from commercial and DOE sites to the proposed Yucca Mountain repository. The Office of Logistics Management (OLM) within OCRWM has begun to work with stakeholders to identify preliminary national suites of highway and rail routes that could be used for future shipments. OLM is striving to develop a planning-basis set of routes that will support long-lead-time logistical analyses (i.e., five or more years before shipment). The results will represent a starting point for discussions between DOE and corridor jurisdictions, and for shipping arrangements between DOE and carriers. This fulfills a recommendation of the National Academy of Sciences report on SNF and HLW transportation that 'DOE should identify and make public its suite of preferred highway and rail routes for transporting spent fuel and high level waste to a federal repository as soon as practicable to support State, Tribal and local planning, especially for emergency responder preparedness'. OLM encourages and supports participation of program stakeholders in a process to identify suites of national routes. The principal objective is to identify preliminary suites of national routes that reflect responsible consideration of the interests of a broad cross-section of stakeholders. This will facilitate transportation planning activities to help meet program goals, including providing an advanced planning framework for State and Tribal authorities; supporting a pilot program for providing funding under Section 180(c) of the Nuclear Waste Policy Act; allowing sufficient time for security and operational reviews in advance of shipments to Yucca Mountain; and supporting utility planning and readiness for transportation operations. Concepts for routing and routing criteria have been considered by several state regional groups supported by cooperative agreements with OLM. OCRWM is also working with other Federal agencies, transportation service providers and others involved in the transportation industry to ensure the criteria are consistent with operating practices and regulations. These coordination efforts will ensure the experience, knowledge, and expertise of those involved are considered in the process to identify the preliminary national suites of routes. This paper describes the current process and timeline for preliminary identification and analyses of routes. In conclusion: the path toward developing a safe, secure, and efficient transportation system for shipments of SNF and HLW to Yucca Mountain will require the participation of many interested parties. Real cooperative planning is sometimes challenging, and requires a commitment from all involved parties to act in good faith and to employ their best efforts in developing mutually beneficial solutions. Identifying routes to the proposed repository at Yucca Mountain, and engaging in planning and preparedness activities with affected jurisdictions and other stakeholders, will take time. OCRWM is committed to a cooperative approach that will ultimately enhance safety, security, efficiency and public confidence. (authors)
Laboratory Testing of Waste Isolation Pilot Plant Surrogate Waste Materials
NASA Astrophysics Data System (ADS)
Broome, S.; Bronowski, D.; Pfeifle, T.; Herrick, C. G.
2011-12-01
The Waste Isolation Pilot Plant (WIPP) is a U.S. Department of Energy geological repository for the permanent disposal of defense-related transuranic (TRU) waste. The waste is emplaced in rooms excavated in the bedded Salado salt formation at a depth of 655 m below the ground surface. After emplacement of the waste, the repository will be sealed and decommissioned. WIPP Performance Assessment modeling of the underground material response requires a full and accurate understanding of coupled mechanical, hydrological, and geochemical processes and how they evolve with time. This study was part of a broader test program focused on room closure, specifically the compaction behavior of waste and the constitutive relations to model this behavior. The goal of this study was to develop an improved waste constitutive model whose parameters are derived from a well-designed set of test data. The constitutive model will then be used to realistically model evolution of the underground and to better understand the impacts on repository performance. The present results are focused on laboratory testing of surrogate waste materials. The surrogate wastes correspond to a conservative estimate of the degraded containers and TRU waste materials after the 10,000-year regulatory period. Testing consists of hydrostatic, uniaxial, and triaxial tests performed on surrogate waste recipes that were previously developed by Hansen et al. (1997). These recipes can be divided into materials that simulate 50% and 100% degraded waste by weight. The percent degradation indicates the anticipated amount of iron corrosion, as well as the decomposition of cellulosics, plastics, and rubbers. Axial, lateral, and volumetric strain and axial and lateral stress measurements were made. Two unique testing techniques were developed during the course of the experimental program. The first involves the use of dilatometry to measure sample volumetric strain under a hydrostatic condition. Bulk moduli of the samples measured using this technique were consistent with those measured using more conventional methods. The second technique involved performing triaxial tests under lateral strain control. By holding the lateral strain at zero through control of the applied confining pressure while loading the specimen axially in compression, one can maintain a right-circular cylindrical geometry even under large deformations. This technique is preferred over standard triaxial testing methods, which result in inhomogeneous deformation or "barreling". Manifestations of the inhomogeneous deformation included non-uniform stress states, as well as unrealistic Poisson's ratios (> 0.5) or ratios that vary significantly along the length of the specimen. Zero-lateral-strain controlled tests yield a more uniform stress state, and admissible and uniform values of Poisson's ratio. Hansen, F.D., Knowles, M.K., et al. 1997. Description and Evaluation of a Mechanistically Based Conceptual Model for Spall. SAND97-1369. Sandia National Laboratories, Albuquerque. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
The Site-Scale Saturated Zone Flow Model for Yucca Mountain
NASA Astrophysics Data System (ADS)
Al-Aziz, E.; James, S. C.; Arnold, B. W.; Zyvoloski, G. A.
2006-12-01
This presentation provides a reinterpreted conceptual model of the Yucca Mountain site-scale flow system, subject to all quality assurance procedures. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain, which is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. This effort started from the ground up with a revised and updated hydrogeologic framework model, which incorporates the latest lithology data, and increased grid resolution that better resolves the hydrogeologic framework throughout the model domain. In addition, faults are much better represented using 250 × 250-m grid spacing (compared to the previous model's 500 × 500-m spacing). Data collected since the previous model calibration effort have been included; they comprise all Nye County water-level data through Phase IV of their Early Warning Drilling Program. Target boundary fluxes are derived from the newest (2004) Death Valley Regional Flow System model from the U.S. Geological Survey. A consistent weighting scheme assigns importance to each measured water-level datum and to each boundary flux extracted from the regional model. The numerical model is calibrated by matching these weighted water-level measurements and boundary fluxes using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (the hydrologic simulation code FEHM v2.24 and the parameter estimation software PEST v5.5) and model setup facilitate efficient calibration of multiple conceptual models. Analyses evaluate the impact of these updates and additional data on the modeled potentiometric surface and the flowpaths emanating from below the repository. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the proposed repository and compare them to those from the previous model calibration. Specific discharge at a point 5 km from the repository is also examined and found to be within acceptable uncertainty. The results show that the updated model yields a calibration with smaller residuals than the previous model revision while ensuring that flowpaths follow measured gradients and paths derived from hydrochemical analyses. This work was supported by the Yucca Mountain Site Characterization Office as part of the Civilian Radioactive Waste Management Program, which is managed by the U.S. Department of Energy, Yucca Mountain Site Characterization Project. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
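As a hedged illustration of the calibration target: PEST-style parameter estimation minimizes a weighted sum of squared residuals over all observation groups (here, water levels and boundary fluxes). The sketch below shows that objective in generic form; the numbers and weights are placeholders, not the project's actual weighting scheme.

    import numpy as np

    def weighted_objective(observed, simulated, weights):
        """PEST-style weighted least-squares objective:
        phi = sum_i [w_i * (obs_i - sim_i)]**2
        Heads and boundary fluxes each contribute their own weighted
        residual group; the weights here are generic placeholders.
        """
        residuals = np.asarray(weights) * (np.asarray(observed) - np.asarray(simulated))
        return float(np.sum(residuals ** 2))

    # Toy example: three water-level targets (m) and one boundary flux target.
    obs = [730.2, 728.9, 731.5, 1.2e4]
    sim = [730.0, 729.4, 731.1, 1.1e4]
    w = [1.0, 1.0, 1.0, 1e-4]  # flux down-weighted to balance units/magnitudes
    print(weighted_objective(obs, sim, w))

The weighting is what lets dissimilar targets (heads in meters, fluxes in very different units) share one objective function, which is why a consistent weighting scheme matters when comparing calibrations across model revisions.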
Optimizing Crawler4j using MapReduce Programming Model
NASA Astrophysics Data System (ADS)
Siddesh, G. M.; Suresh, Kavya; Madhuri, K. Y.; Nijagal, Madhushree; Rakshitha, B. R.; Srinivasa, K. G.
2017-06-01
The World Wide Web is a decentralized system that consists of a repository of information in the form of web pages. These web pages act as a source of information or data in the present analytics world. Web crawlers are used for extracting useful information from web pages for different purposes. Firstly, they are used in web search engines, where web pages are indexed to form a corpus of information that users can query. Secondly, they are used for web archiving, where web pages are stored for later analysis phases. Thirdly, they can be used for web mining, where web pages are monitored for copyright purposes. The amount of information processed by a web crawler needs to be improved by using the capabilities of modern parallel processing technologies. In order to address the parallelism and throughput of crawling, this work proposes to optimize Crawler4j using the Hadoop MapReduce programming model by parallelizing the processing of large input data. Crawler4j is a web crawler that retrieves useful information about the pages that it visits. Crawler4j coupled with the data and computational parallelism of the Hadoop MapReduce programming model improves the throughput and accuracy of web crawling. The experimental results demonstrate that the proposed solution achieves significant improvements in performance and throughput. Hence the proposed approach carves out a new methodology for optimizing web crawling by achieving significant performance gains.
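The map/reduce decomposition of crawling can be illustrated without Hadoop. In the toy Python sketch below, the map phase "fetches" a page and emits (outlink, 1) pairs, and the reduce phase aggregates them; a small in-memory dictionary stands in for the web, and the paper's actual system would distribute both phases across Hadoop workers rather than run them in one process.

    from collections import defaultdict
    from itertools import chain

    # A tiny in-memory "web": page -> list of outlinks.
    WEB = {
        "a.html": ["b.html", "c.html"],
        "b.html": ["c.html"],
        "c.html": ["a.html", "b.html"],
    }

    def map_phase(url):
        # Mapper: "fetch" a page and emit (outlink, 1) pairs
        return [(link, 1) for link in WEB.get(url, [])]

    def reduce_phase(pairs):
        # Reducer: aggregate counts per link (e.g., in-link frequency)
        counts = defaultdict(int)
        for link, n in pairs:
            counts[link] += n
        return dict(counts)

    pairs = chain.from_iterable(map_phase(u) for u in WEB)
    print(reduce_phase(pairs))  # {'b.html': 2, 'c.html': 2, 'a.html': 1}

Because each mapper touches an independent page, fetch latency (the dominant cost in crawling) is hidden by running many mappers concurrently, which is the throughput gain the abstract refers to.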
Integrity Constraint Monitoring in Software Development: Proposed Architectures
NASA Technical Reports Server (NTRS)
Fernandez, Francisco G.
1997-01-01
In the development of complex software systems, designers are required to obtain from many sources and manage vast amounts of knowledge of the system being built, and to communicate this information to personnel with a variety of backgrounds. Knowledge concerning the properties of the system, including the structure of, relationships between, and limitations of the data objects in the system, becomes increasingly more vital as the complexity of the system and the number of knowledge sources increase. Ensuring that violations of these properties do not occur becomes steadily more challenging. One approach toward managing the enforcement of system properties, called context monitoring, uses a centralized repository of integrity constraints and a constraint satisfiability mechanism for dynamic verification of property enforcement during program execution. The focus of this paper is to describe possible software architectures that define a mechanism for dynamically checking the satisfiability of a set of constraints on a program. The next section describes the context monitoring approach in general. Section 3 gives an overview of the work currently being done toward the addition of an integrity constraint satisfiability mechanism to a high-level programming language, SequenceL, and demonstrates how this model is being examined to develop a general software architecture. Section 4 describes possible architectures for a general constraint satisfiability mechanism, as well as an alternative approach that uses embedded database queries in lieu of an external monitor. The paper concludes with a brief summary outlining the current state of the research and future work.
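A minimal sketch of the context-monitoring idea follows, assuming nothing about the paper's SequenceL mechanism: a central repository of integrity constraints (predicates) that is consulted at run time to verify the program state. The class and constraint names are invented for illustration.

    # Illustrative sketch of context monitoring: a centralized repository of
    # integrity constraints checked dynamically during execution.
    class ConstraintMonitor:
        def __init__(self):
            self._constraints = []  # central repository of (description, predicate)

        def register(self, description, predicate):
            self._constraints.append((description, predicate))

        def check(self, state):
            # Verify every registered constraint against the current state
            violations = [d for d, p in self._constraints if not p(state)]
            if violations:
                raise AssertionError(f"Integrity violations: {violations}")

    monitor = ConstraintMonitor()
    monitor.register("altitude is non-negative", lambda s: s["altitude"] >= 0)
    monitor.register("fuel within tank capacity", lambda s: 0 <= s["fuel"] <= 100)

    state = {"altitude": 1200.0, "fuel": 62.5}
    monitor.check(state)  # passes silently; raises on any violation

Centralizing the constraints is the architectural point: the properties live in one repository that all components share, rather than being scattered through the code as ad hoc assertions.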
RGG: A general GUI Framework for R scripts
Visne, Ilhami; Dilaveroglu, Erkan; Vierlinger, Klemens; Lauss, Martin; Yildiz, Ahmet; Weinhaeusel, Andreas; Noehammer, Christa; Leisch, Friedrich; Kriegner, Albert
2009-01-01
Background: R is the leading open source statistics software, with a vast number of biostatistical and bioinformatical analysis packages. To exploit the advantages of R, extensive scripting/programming skills are required. Results: We have developed a software tool called R GUI Generator (RGG) which enables the easy generation of Graphical User Interfaces (GUIs) for the programming language R by adding a few Extensible Markup Language (XML) tags. RGG consists of an XML-based GUI definition language and a Java-based GUI engine. GUIs are generated at runtime from GUI tags that are embedded into the R script. User-GUI input is returned to the R code and replaces the XML tags. RGG files can be developed using any text editor. The current version of RGG is available as a stand-alone software (RGGRunner) and as a plug-in for JGR. Conclusion: RGG is a general GUI framework for R that has the potential to introduce R statistics (R packages, built-in functions and scripts) to users with limited programming skills and helps to bridge the gap between R developers and GUI-dependent users. RGG aims to abstract GUI development from individual GUI toolkits by using an XML-based GUI definition language. Thus RGG can be easily integrated in any software. The RGG project further includes the development of a web-based repository for RGG-GUIs. RGG is an open source project licensed under the Lesser General Public License (LGPL) and can be downloaded freely. PMID:19254356
OWL-based reasoning methods for validating archetypes.
Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2013-04-01
Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual-model architecture, which defines two conceptual levels: the reference model and the archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in the achievement of semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual-model architecture in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, which are the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we implemented a software tool called Archeck. Our results show that around one fifth of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in the two repositories. This result reinforces the need for serious efforts to improve archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.
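For flavor, the general approach can be approximated with an off-the-shelf OWL toolkit. The sketch below uses the owlready2 Python package as a stand-in (it is not the paper's Archeck tool): load an OWL representation of an archetype plus its reference model, run a description-logic reasoner, and treat inconsistent classes as modeling errors. The ontology file name is illustrative.

    from owlready2 import get_ontology, default_world, sync_reasoner

    # Hedged sketch of OWL-based archetype validation (not Archeck itself):
    # unsatisfiable/inconsistent classes would flag errors such as archetype
    # restrictions that violate the reference model.
    onto = get_ontology("file://archetype_validation.owl").load()

    with onto:
        sync_reasoner()  # runs the bundled HermiT reasoner (requires Java)

    bad = list(default_world.inconsistent_classes())
    if bad:
        print("Modeling errors detected in:", bad)
    else:
        print("Archetype ontology is consistent with the reference model.")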
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Luik, Abraham; Patterson, Russell; Nelson, Roger
2013-07-01
The Waste Isolation Pilot Plant (WIPP) is a geologic repository 2150 feet (650 m) below the surface of the Chihuahuan desert near Carlsbad, New Mexico. WIPP permanently disposes of transuranic waste from national defense programs. Every five years, the U.S. Department of Energy (DOE) submits an application to the U.S. Environmental Protection Agency (EPA) to request regulatory-compliance re-certification of the facility for another five years. Every ten years, DOE submits an application to the New Mexico Environment Department (NMED) for the renewal of its hazardous waste disposal permit. The content of the applications made by DOE to the EPA for re-certification, and to the NMED for permit renewal, reflects any optimization changes made to the facility, with regulatory concurrence if warranted by the nature of the change. DOE points to such changes as evidence for its having taken seriously its 'continuous improvement' operations and management philosophy. Another opportunity for continuous improvement is to look at any delta that may exist between the re-certification and re-permitting cases for system safety and the consensus advice on the nature and content of a safety case being developed and published by the Nuclear Energy Agency's Integration Group for the Safety Case (IGSC) expert group. DOE at WIPP, with the aid of its Science Advisor and teammate, Sandia National Laboratories, is in the process of discerning what can be done, in a reasonably paced and cost-conscious manner, to continually improve the case for repository safety that is being made to the two primary regulators on a recurring basis. This paper will discuss some aspects of that delta and potential paths forward to addressing them. (authors)
DOE Office of Scientific and Technical Information (OSTI.GOV)
George, James T.; Sobolik, Steven R.; Lee, Moo Y.
The study described in this report involves heated and unheated pressurized slot testing to determine thermo-mechanical properties of the Tptpll (Tertiary, Paintbrush, Topopah Spring Tuff Formation, crystal-poor, lower lithophysal) and Tptpul (upper lithophysal) lithostratigraphic units at Yucca Mountain, Nevada. A large volume fraction of the proposed repository at Yucca Mountain may reside in the Tptpll lithostratigraphic unit. This unit is characterized by voids, or lithophysae, which range in size from centimeters to meters, making a field program an effective method of measuring bulk thermal-mechanical rock properties (thermal expansion, rock mass modulus, compressive strength, time-dependent deformation) over a range of temperature and rock conditions. The field tests outlined in this report provide data for the determination of thermo-mechanical properties of this unit. Rock-mass response data collected during this field test will reduce the uncertainty in key thermal-mechanical modeling parameters (rock-mass modulus, strength and thermal expansion) for the Tptpll lithostratigraphic unit, and provide a basis for understanding the thermal-mechanical behavior of this unit. The measurements will be used to evaluate numerical models of the thermal-mechanical response of the repository. These numerical models are then used to predict pre- and post-closure repository response. Acknowledgements: The authors would like to thank David Bronowski, Ronnie Taylor, Ray E. Finley, Cliff Howard, Michael Schuhen (all SNL) and Fred Homuth (LANL) for their work in the planning and implementation of the tests described in this report. This is a reprint of SAND2004-2703, which was originally printed in July 2004. At that time, it was printed for a restricted audience. It has now been approved for unlimited release.
The Open Data Repository's Data Publisher
NASA Technical Reports Server (NTRS)
Stone, N.; Lafuente, B.; Downs, R. T.; Blake, D.; Bristow, T.; Fonda, M.; Pires, A.
2015-01-01
Data management and data publication are becoming increasingly important components of researchers' workflows. The complexity of managing data, publishing data online, and archiving data has not decreased significantly even as computing access and power have greatly increased. The Open Data Repository's Data Publisher software strives to make data archiving, management, and publication a standard part of a researcher's workflow using simple, web-based tools and commodity server hardware. The publication engine allows for uploading, searching, and display of data, with graphing capabilities and downloadable files. Access is controlled through a robust permissions system that can control publication at the field level and can be granted to the general public or protected so that only registered users at various permission levels receive access. Data Publisher also allows researchers to subscribe to metadata standards through a plugin system, embargo data publication at their discretion, and collaborate with other researchers through various levels of data sharing. As the software matures, semantic data standards will be implemented to facilitate machine reading of data, and each database will provide a REST application programming interface for programmatic access. Additionally, a citation system will allow snapshots of any data set to be archived and cited for publication while the data itself can remain living and continuously evolve beyond the snapshot date. The software runs on a traditional LAMP (Linux, Apache, MySQL, PHP) server and is available on GitHub (http://github.com/opendatarepository) under a GPLv2 open source license. The goal of the Open Data Repository is to lower the cost and training barrier to entry so that any researcher can easily publish their data and ensure it is archived for posterity.
NATIONAL GEOSCIENCE DATA REPOSITORY SYSTEM PHASE III: IMPLEMENTATION AND OPERATION OF THE REPOSITORY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marcus Milling
2003-04-01
The NGDRS has facilitated the transfer of 85% of the cores, cuttings, and other data identified as available for transfer to the public sector. Over 12 million linear feet of cores and cuttings, in addition to large numbers of paleontological samples, are now available for public use. To date, with industry contributions for program operations and data transfers, the NGDRS project has realized a 6.5 to 1 return on investment of Department of Energy funds. Large-scale transfers of seismic data have been evaluated, but based on the recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of the seismic data problem relative to the available funding. The rapidly changing industry conditions have required that the primary core and cuttings preservation strategy evolve as well. Additionally, the NGDRS clearinghouse is evaluating the viability of transferring seismic data covering the western shelf of the Florida Gulf Coast. AGI remains actively involved in working to realize the vision of the National Research Council's report on geoscience data preservation. GeoTrek has been ported to Linux and MySQL, ensuring a purely open-source version of the software. This effort is key in ensuring the long-term viability of the software so that it can continue basic operation regardless of specific funding levels. Work has commenced on a major revision of GeoTrek, using the open-source MapServer project and its related MapScript language. This effort will address a number of key technology issues arising for 2002, including the discontinuation of the use of Java in future Microsoft operating systems. Discussions have been held regarding establishing potential new public data repositories, with hope for a final determination in 2002.
Cimino, James J.; Ayres, Elaine J.; Remennik, Lyubov; Rath, Sachi; Freedman, Robert; Beri, Andrea; Chen, Yang; Huser, Vojtech
2013-01-01
The US National Institutes of Health (NIH) has developed the Biomedical Translational Research Information System (BTRIS) to support researchers’ access to translational and clinical data. BTRIS includes a data repository, a set of programs for loading data from NIH electronic health records and research data management systems, an ontology for coding the disparate data with a single terminology, and a set of user interface tools that provide access to identified data from individual research studies and data across all studies from which individually identifiable data have been removed. This paper reports on unique design elements of the system, progress to date and user experience after five years of development and operation. PMID:24262893
Software support for Huntington's disease research.
Conneally, P. M.; Gersting, J. M.; Gray, J. M.; Beidleman, K.; Wexler, N. S.; Smith, C. L.
1991-01-01
Huntington's disease (HD) is a hereditary disorder involving the central nervous system. Its effects are devastating to the affected person as well as his family. The Department of Medical and Molecular Genetics at Indiana University (IU) plays an integral part in Huntington's research by providing computerized repositories of HD family information for researchers and families. The National Huntington's Disease Research Roster, founded in 1979 at IU, and the Huntington's Disease in Venezuela Project database contain information that has proven to be invaluable in the worldwide field of HD research. This paper addresses the types of information stored in each database, the pedigree database program (MEGADATS) used to manage the data, and significant findings that have resulted from access to the data. PMID:1839672
WastePD, an innovative center on materials degradation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frankel, Gerald S.; Vienna, John; Lian, Jie
The US Department of Energy recently awarded funds to create the Center for Performance and Design of Nuclear Waste Forms and Containers (WastePD) as part of the Energy Frontier Research Center (EFRC) program. EFRCs are multi-investigator collaborations of universities, national labs and companies that "conduct fundamental research focusing on one or more 'grand challenges' and use-inspired 'basic research needs' identified in major strategic planning efforts by the scientific community." The major performance parameter of nuclear waste forms is their ability to isolate the radionuclides by withstanding degradation in a repository environment over very long periods of time. So WastePD is at heart a center focused on materials degradation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1982-09-01
The U.S. Department of Energy (DOE) is considering the selection of a strategy for the long-term management of the defense high-level wastes at the Idaho Chemical Processing Plant (ICPP). This report describes the environmental impacts of alternative strategies. These alternative strategies include leaving the calcine in its present form at the Idaho National Engineering Laboratory (INEL), or retrieving and modifying the calcine to a more durable waste form and disposing of it either at the INEL or in an offsite repository. This report addresses only the alternatives for a program to manage the high-level waste generated at the ICPP. 24 figures, 60 tables.
Java Web Simulation (JWS): a web-based database of kinetic models.
Snoep, J L; Olivier, B G
2002-01-01
Software to make a database of kinetic models accessible via the internet has been developed, and a core database has been set up at http://jjj.biochem.sun.ac.za/. This repository of models, available to everyone with internet access, opens a whole new way in which we can make our models public. Via the database, a user can change enzyme parameters and run time simulations or steady-state analyses. The interface is user friendly and no additional software is necessary. The database currently contains 10 models, but since the generation of the program code to include new models has largely been automated, the addition of new models is straightforward, and people are invited to submit their models to be included in the database.
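The kind of time simulation such a server runs for a stored kinetic model can be illustrated with a generic two-step Michaelis-Menten pathway integrated with SciPy. The rate laws, parameter values, and initial concentrations below are arbitrary illustrations, not taken from any JWS model.

    import numpy as np
    from scipy.integrate import odeint

    # Illustrative two-step pathway, S -> P -> (sink), with Michaelis-Menten
    # kinetics at each step. Parameters are arbitrary placeholders.
    def pathway(y, t, vmax1, km1, vmax2, km2):
        s, p = y  # substrate and intermediate concentrations
        v1 = vmax1 * s / (km1 + s)   # step 1: S -> P
        v2 = vmax2 * p / (km2 + p)   # step 2: P -> sink
        return [-v1, v1 - v2]

    t = np.linspace(0, 50, 200)
    y0 = [5.0, 0.0]  # initial [S], [P] in mM
    sol = odeint(pathway, y0, t, args=(1.0, 0.5, 0.7, 0.3))
    print("final [S], [P]:", sol[-1])

Changing an enzyme parameter in the web interface amounts to changing one of the args above and re-integrating, which is why no additional software is needed on the user's side.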
Strategies from a nationwide health information technology implementation: the VA CART story.
Box, Tamára L; McDonell, Mary; Helfrich, Christian D; Jesse, Robert L; Fihn, Stephan D; Rumsfeld, John S
2010-01-01
The VA Cardiovascular Assessment, Reporting, and Tracking (CART) system is a customized electronic medical record system which provides standardized report generation for cardiac catheterization procedures, serves as a national data repository, and is the centerpiece of a national quality improvement program. Like many health information technology projects, CART implementation did not proceed without some barriers and resistance. We describe the nationwide implementation of CART at the 77 VA hospitals which perform cardiac catheterizations in three phases: (1) strategic collaborations; (2) installation; and (3) adoption. Throughout implementation, success required a careful balance of technical, clinical, and organizational factors. We offer strategies developed through CART implementation which are broadly applicable to technology projects aimed at improving the quality, reliability, and efficiency of health care.
EPA Facility Registry Service (FRS): ICIS
This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Integrated Compliance Information System (ICIS). When complete, ICIS will provide a database that will contain integrated enforcement and compliance information across most of EPA's programs. The vision for ICIS is to replace EPA's independent databases that contain enforcement data with a single repository for that information. Currently, ICIS contains all Federal Administrative and Judicial enforcement actions and a subset of the Permit Compliance System (PCS), which supports the National Pollutant Discharge Elimination System (NPDES). ICIS exchanges non-sensitive enforcement/compliance activities, non-sensitive formal enforcement actions and NPDES information with FRS. This web feature service contains the enforcement/compliance activities and formal enforcement action related facilities; the NPDES facilities are contained in the PCS_NPDES web feature service. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on f
RSAT: regulatory sequence analysis tools.
Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques
2008-07-01
The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundred researchers from all over the world. Several predictions made with RSAT were validated experimentally and published.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aevermann, Brian D.; Pickett, Brett E.; Kumar, Sanjeev
2014-10-14
The Systems Biology for Infectious Diseases Research program was established by the U.S. National Institute of Allergy and Infectious Diseases to investigate host-pathogen interactions at a systems level. This program generated 47 transcriptomic and proteomic datasets from 30 studies that investigate in vivo and in vitro host responses to viral infections. Human pathogens in the Orthomyxoviridae and Coronaviridae families, especially pandemic H1N1 and avian H5N1 influenza A viruses and severe acute respiratory syndrome coronavirus (SARS-CoV), were investigated. Study validation was demonstrated via experimental quality control measures and meta-analysis of independent experiments performed under similar conditions. Primary assay results are archived at the GEO and PeptideAtlas public repositories, while processed statistical results together with standardized metadata are publicly available at the Influenza Research Database (www.fludb.org) and the Virus Pathogen Resource (www.viprbrc.org). As a result, by comparing data from mutant versus wild-type virus and host strains, RNA versus protein differential expression, and infection with genetically similar strains, these data can be used to further investigate genetic and physiological determinants of host responses to viral infection.
OntoSoft: A Software Registry for Geosciences
NASA Astrophysics Data System (ADS)
Garijo, D.; Gil, Y.
2017-12-01
The goal of the EarthCube OntoSoft project is to enable the creation of an ecosystem for software stewardship in geosciences that will empower scientists to manage their software as valuable scientific assets. By sharing software metadata in OntoSoft, scientists enable broader access to that software by other scientists, software professionals, students, and decision makers. Our work to date includes: 1) an ontology for describing scientific software metadata, 2) a distributed scientific software repository that contains more than 750 entries that can be searched and compared across metadata fields, 3) an intelligent user interface that guides scientists to publish software and allows them to crowdsource its corresponding metadata. We have also developed a training program where scientists learn to describe and cite software in their papers in addition to data and provenance, and we are using OntoSoft to show them the benefits of publishing their software metadata. This training program is part of a Geoscience Papers of the Future Initiative, where scientists are reflecting on their current practices, benefits and effort for sharing software and data. This journal paper can be submitted to a Special Section of the AGU Earth and Space Science Journal.
NASA Astrophysics Data System (ADS)
Baker, K. S.; Chandler, C. L.
2008-12-01
Data management and informatics research are in a state of change in terms of data practices, information strategies, and roles. New ways of thinking about data and data management can facilitate interdisciplinary global ocean science. To meet contemporary expectations for local data use and reuse by a variety of audiences, collaborative strategies involving diverse teams of information professionals are developing. Such changes are fostering the growth of information infrastructures that support multi-scale sampling, data integration, and nascent networks of data repositories. In this retrospective, two examples of oceanographic projects incorporating data management in partnership with long-term science programs are reviewed: the Palmer Station Long-Term Ecological Research program (Palmer LTER) and the United States Joint Global Ocean Flux Study (US JGOFS). Lessons learned - short-term and long-term - from a decade of data management within these two communities will be presented. A conceptual framework called Ocean Informatics provides one example for managing the complexities inherent to sharing oceanographic data. Elements are discussed that address the economies-of-scale as well as the complexities-of-scale pertinent to a broad vision of information management and scientific research.
The Cancer Genomics Hub (CGHub): overcoming cancer through the power of torrential data
Wilks, Christopher; Cline, Melissa S.; Weiler, Erich; Diekhans, Mark; Craft, Brian; Martin, Christy; Murphy, Daniel; Pierce, Howdy; Black, John; Nelson, Donavan; Litzinger, Brian; Hatton, Thomas; Maltbie, Lori; Ainsworth, Michael; Allen, Patrick; Rosewood, Linda; Mitchell, Elizabeth; Smith, Bradley; Warner, Jim; Groboske, John; Telc, Haifang; Wilson, Daniel; Sanford, Brian; Schmidt, Hannes; Haussler, David; Maltbie, Daniel
2014-01-01
The Cancer Genomics Hub (CGHub) is the online repository of the sequencing programs of the National Cancer Institute (NCI), including The Cancer Genome Atlas (TCGA), the Cancer Cell Line Encyclopedia (CCLE) and the Therapeutically Applicable Research to Generate Effective Treatments (TARGET) projects, with data from 25 different types of cancer. The CGHub currently contains >1.4 PB of data, has grown at an average rate of 50 TB a month and serves >100 TB per week. The architecture of CGHub is designed to support bulk searching and downloading through a Web-accessible application programming interface, enforce patient genome confidentiality in data storage and transmission and optimize for efficiency in access and transfer. In this article, we describe the design of these three components, present performance results for our transfer protocol, GeneTorrent, and finally report on the growth of the system in terms of data stored and transferred, including estimated limits on the current architecture. Our experience-based estimates suggest that centralizing storage and computational resources is more efficient than wide distribution across many satellite labs. Database URL: https://cghub.ucsc.edu PMID:25267794
A Software Architecture for Intelligent Synthesis Environments
NASA Technical Reports Server (NTRS)
Filman, Robert E.; Norvig, Peter (Technical Monitor)
2001-01-01
NASA's Intelligent Synthesis Environment (ISE) program is a grand attempt to develop a system to transform the way complex artifacts are engineered. This paper discusses a "middleware" architecture for enabling the development of ISE. Desirable elements of such an Intelligent Synthesis Architecture (ISA) include remote invocation; plug-and-play applications; scripting of applications; management of design artifacts, tools, and artifact and tool attributes; common system services; system management; and systematic enforcement of policies. This paper argues that the ISA should extend conventional distributed object technology (DOT), such as CORBA and Product Data Managers, with flexible repositories of product and tool annotations and "plug-and-play" mechanisms for inserting "ility" or orthogonal concerns into the system. I describe the Object Infrastructure Framework, an Aspect Oriented Programming (AOP) environment for developing distributed systems that provides utility insertion and enables consistent annotation maintenance. This technology can be used to enforce policies such as maintaining the annotations of artifacts, particularly the provenance and access control rules of artifacts; performing automatic datatype transformations between representations; supplying alternative servers of the same service; reporting on the status of jobs and the system; conveying privileges throughout an application; supporting long-lived transactions; maintaining version consistency; and providing software redundancy and mobility.
Case-oriented computer-based-training in radiology: concept, implementation and evaluation
Dugas, Martin; Trumm, Christoph; Stäbler, Axel; Pander, Ernst; Hundt, Walter; Scheidler, Jurgen; Brüning, Roland; Helmberger, Thomas; Waggershauser, Tobias; Matzko, Matthias; Reiser, Maximillian
2001-01-01
Background: Providing high-quality clinical cases is important for teaching radiology. We developed, implemented and evaluated a program for a university hospital to support this task. Methods: The system was built with Intranet technology and connected to the Picture Archiving and Communications System (PACS). It contains cases for every user group from students to attendings and is structured according to the ACR code (American College of Radiology) [2]. Each department member was given an individual account, could gather his teaching cases, and put the completed cases into the common database. Results: During 18 months, 583 cases containing 4136 images involving all radiological techniques were compiled and 350 cases were put into the common case repository. Workflow integration as well as individual interest influenced the personal efforts to participate, but an increasing number of cases and minor modifications of the program improved user acceptance continuously. 101 students went through an evaluation which showed a high level of acceptance and a special interest in elaborate documentation. Conclusion: Electronic access to reference cases for all department members anytime, anywhere is feasible. Critical success factors are workflow integration, reliability, efficient retrieval strategies and incentives for case authoring. PMID:11686856
NASA Astrophysics Data System (ADS)
Baker, Karen S.; Chandler, Cynthia L.
2008-09-01
Interdisciplinary global ocean science requires new ways of thinking about data and data management. With new data policies and growing technological capabilities, datasets of increasing variety and complexity are being made available digitally and data management is coming to be recognized as an integral part of scientific research. To meet the changing expectations of scientists collecting data and of data reuse by others, collaborative strategies involving diverse teams of information professionals are developing. These changes are stimulating the growth of information infrastructures that support multi-scale sampling, data repositories, and data integration. Two examples of oceanographic projects incorporating data management in partnership with science programs are discussed: the Palmer Station Long-Term Ecological Research program (Palmer LTER) and the United States Joint Global Ocean Flux Study (US JGOFS). Lessons learned from a decade of data management within these communities provide an experience base from which to develop information management strategies—short-term and long-term. Ocean Informatics provides one example of a conceptual framework for managing the complexities inherent to sharing oceanographic data. Elements are introduced that address the economies-of-scale and the complexities-of-scale pertinent to a broader vision of information management and scientific research.
Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project
NASA Technical Reports Server (NTRS)
Wallick, Michael N.; Mittman, David S.; Shams, Khawaja S.; Bachmann, Andrew G.; Ludowise, Melissa
2013-01-01
This software simplifies the process of setting up an Eclipse IDE programming environment for members of the cross-NASA-center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools and preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and begin work on Ensemble software. The software automatically includes the source code repositories and other vital information and settings. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error prone. This software is built once by a single user and tested, allowing other developers to simply download and use it.
Aevermann, Brian D; Pickett, Brett E; Kumar, Sanjeev; Klem, Edward B; Agnihothram, Sudhakar; Askovich, Peter S; Bankhead, Armand; Bolles, Meagen; Carter, Victoria; Chang, Jean; Clauss, Therese R W; Dash, Pradyot; Diercks, Alan H; Eisfeld, Amie J; Ellis, Amy; Fan, Shufang; Ferris, Martin T; Gralinski, Lisa E; Green, Richard R; Gritsenko, Marina A; Hatta, Masato; Heegel, Robert A; Jacobs, Jon M; Jeng, Sophia; Josset, Laurence; Kaiser, Shari M; Kelly, Sara; Law, G Lynn; Li, Chengjun; Li, Jiangning; Long, Casey; Luna, Maria L; Matzke, Melissa; McDermott, Jason; Menachery, Vineet; Metz, Thomas O; Mitchell, Hugh; Monroe, Matthew E; Navarro, Garnet; Neumann, Gabriele; Podyminogin, Rebecca L; Purvine, Samuel O; Rosenberger, Carrie M; Sanders, Catherine J; Schepmoes, Athena A; Shukla, Anil K; Sims, Amy; Sova, Pavel; Tam, Vincent C; Tchitchek, Nicolas; Thomas, Paul G; Tilton, Susan C; Totura, Allison; Wang, Jing; Webb-Robertson, Bobbie-Jo; Wen, Ji; Weiss, Jeffrey M; Yang, Feng; Yount, Boyd; Zhang, Qibin; McWeeney, Shannon; Smith, Richard D; Waters, Katrina M; Kawaoka, Yoshihiro; Baric, Ralph; Aderem, Alan; Katze, Michael G; Scheuermann, Richard H
2014-01-01
The Systems Biology for Infectious Diseases Research program was established by the U.S. National Institute of Allergy and Infectious Diseases to investigate host-pathogen interactions at a systems level. This program generated 47 transcriptomic and proteomic datasets from 30 studies that investigate in vivo and in vitro host responses to viral infections. Human pathogens in the Orthomyxoviridae and Coronaviridae families, especially pandemic H1N1 and avian H5N1 influenza A viruses and severe acute respiratory syndrome coronavirus (SARS-CoV), were investigated. Study validation was demonstrated via experimental quality control measures and meta-analysis of independent experiments performed under similar conditions. Primary assay results are archived at the GEO and PeptideAtlas public repositories, while processed statistical results together with standardized metadata are publicly available at the Influenza Research Database (www.fludb.org) and the Virus Pathogen Resource (www.viprbrc.org). By comparing data from mutant versus wild-type virus and host strains, RNA versus protein differential expression, and infection with genetically similar strains, these data can be used to further investigate genetic and physiological determinants of host responses to viral infection.
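Since the primary assays live in general-purpose repositories, reuse typically starts with a programmatic fetch. A minimal sketch using the GEOparse package (the accession below is a placeholder, not one of the program's actual dataset identifiers):

```python
# Fetch a GEO series and list its samples; GSE00000 is a placeholder
# accession, not one of the program's dataset identifiers.
import GEOparse

gse = GEOparse.get_GEO(geo="GSE00000", destdir="./geo_cache")
for gsm_name, gsm in gse.gsms.items():
    print(gsm_name, gsm.metadata.get("title"))
```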
BioSPICE: access to the most current computational tools for biologists.
Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark
2003-01-01
The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories, integrated under the BioSPICE Dashboard, together with a methodology for continued software integration. The BioSPICE Dashboard is a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.
Adaptive dynamic programming approach to experience-based systems identification and control.
Lendaris, George G
2009-01-01
Humans have the ability to make use of experience while selecting their control actions for distinct and changing situations, and this process speeds up and becomes more effective as more experience is gained. In contrast, current technological implementations slow down as more knowledge is stored. A novel way of employing Approximate (or Adaptive) Dynamic Programming (ADP) is described that shifts the underlying Adaptive Critic type of Reinforcement Learning method "up a level," away from designing individual (optimal) controllers and toward developing on-line algorithms that efficiently and effectively select designs from a repository of existing controller solutions (perhaps previously developed via application of ADP methods). The resulting approach is called the Higher-Level Learning Algorithm. The approach and its rationale are described, and some examples of its application are given. The notions of context and context discernment are important to understanding the human abilities noted above. These are first defined, in a manner appropriate to controls and system identification; then, as a foundation relating to the application arena, a historical view of the various phases in the development of the controls field is given, organized by how the notion of 'context' was, or was not, involved in each phase.
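The repository-selection idea can be sketched compactly: a higher-level learner observes the current context and picks the stored controller whose design context is nearest, rather than synthesizing a new controller from scratch. A minimal Python sketch under assumed names (the context features and nearest-neighbor selection rule are illustrative, not Lendaris' formulation):

```python
import math

class ControllerRepository:
    """Store (context_vector, controller) pairs gathered from experience."""
    def __init__(self):
        self.entries = []

    def add(self, context, controller):
        self.entries.append((context, controller))

    def select(self, observed_context):
        """Return the controller whose design context is nearest."""
        return min(self.entries,
                   key=lambda e: math.dist(e[0], observed_context))[1]

# Usage: contexts here are illustrative feature vectors (e.g., estimated
# plant parameters); controllers are any callables mapping state -> action.
repo = ControllerRepository()
repo.add((1.0, 0.2), lambda x: -0.5 * x)   # controller tuned for context A
repo.add((3.0, 1.5), lambda x: -2.0 * x)   # controller tuned for context B
controller = repo.select((0.9, 0.3))       # discern context, pick design A
print(controller(4.0))
```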
U.S. EPA Superfund Program's Policy for Community Involvement at Radioactively Contaminated Sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carey, Pat; Walker, Stuart
2008-01-15
This paper describes the Superfund program's statutory requirements for community involvement. It also discusses the efforts the Superfund program has made that go beyond these statutory requirements to involve communities. The Environmental Protection Agency (EPA) implements the Superfund program under the authority of the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA), as amended by the Superfund Amendments and Reauthorization Act of 1986 (SARA). From the beginning of the Superfund program, Congress envisioned a role for communities. This role has evolved and expanded during the implementation of the Superfund program. Initially, the CERCLA statute had community involvement requirements designed to inform surrounding communities of the work being done at a site. CERCLA's provisions required 1) development of a community relations plan for each site, 2) establishment of information repositories near each site where all publicly available materials related to the site would be accessible for public inspection, 3) opportunities for the public to comment on the proposed remedy for each site, and 4) development of a responsiveness summary responding to all significant comments received on the proposed remedy. In recognition of the need for people living near Superfund sites to be well-informed and involved with decisions concerning sites in their communities, SARA expanded Superfund's community involvement activities in 1986. SARA provided the authority to award Technical Assistance Grants (TAGs) to local communities, enabling them to hire independent technical advisors to assist them in understanding technical issues and data about the site. The Superfund Community Involvement Program has sought to effectively implement the statutory community involvement requirements and to go beyond those requirements to find meaningful ways to involve citizens in the cleanup of sites in their communities. We have structured our program around two main themes: building capacity in staff and building capacity in communities. In summary, the Superfund program devotes substantial resources to involving the local community in the site cleanup decision-making process. We believe community involvement provides us with highly valuable information that must be available to carefully consider remedial alternatives at a site. We also find our employees enjoy their jobs more. Rather than fighting with an angry public, they can work collaboratively to solve the problems created by hazardous waste sites. We have learned that the time and resources we devote at the beginning of a project to developing relationships with the local community, and to learning about their issues and concerns, are time and resources well spent. We believe the evidence shows this up-front investment helps us make better cleanup decisions and avoids last-minute efforts to work with a hostile community that feels left out of the decision-making process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Triay, I.R.; Cotter, C.R.; Huddleston, M.H.
1996-09-01
We studied the sorption of neptunium onto tuffs characteristic of the proposed nuclear waste repository at Yucca Mountain, Nevada. The neptunium was in the Np(V) oxidation state under oxidizing conditions in groundwaters from two wells located close to the repository site (J-13 and UE-25 p No.1). We used devitrified, vitric, zeolitic (with emphasis on clinoptilolite-rich samples), and calcite-rich tuffs characteristic of the geology of the site. Neptunium sorbed well onto calcite and calcite-rich tuffs, indicating that a significant amount of neptunium retardation can be expected under fractured-flow scenarios because of calcite coating of the fractures. Neptunium sorption onto clinoptilolite-rich zeolitic tuffs in J-13 well water (pH from 7 to 8.5) was moderate, increased with decreasing pH, and correlated with surface area and the amount of clinoptilolite. Neptunium sorbed poorly onto zeolitic tuffs from UE-25 p No.1 groundwater (pH from 7 to 9) and onto devitrified and vitric tuffs from J-13 and UE-25 p No.1 waters (pH from 7 to 9). Iron oxides appeared to be passivated in tuffs and did not seem to contribute to the observed neptunium sorption, even though neptunium sorption onto synthetic iron oxide is significant.
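Batch sorption of this kind is conventionally quantified with a distribution coefficient; as background (the abstract does not state the formula used), the standard definition is

```latex
K_d \;=\; \frac{C_0 - C_{\mathrm{eq}}}{C_{\mathrm{eq}}} \cdot \frac{V}{m}
```

where C_0 and C_eq are the initial and equilibrium solution concentrations, V is the solution volume, and m is the sorbent mass; larger K_d values imply stronger retardation of the radionuclide relative to groundwater flow.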
EnviroNET: On-line information for LDEF
NASA Technical Reports Server (NTRS)
Lauriente, Michael
1993-01-01
EnviroNET is an on-line, free-form database intended to provide a centralized repository for a wide range of technical information on environmentally induced interactions, of use to Space Shuttle customers and spacecraft designers. It provides a user-friendly, menu-driven format on globally connected networks and is available twenty-four hours a day, every day. The information, updated regularly, includes expository text, tabular numerical data, charts and graphs, and models. The system pools space data collected over the years by NASA, the USAF, other government research facilities, industry, universities, and the European Space Agency. The models accept parameter input from the user, then calculate and display the derived values corresponding to that input. In addition to the archive, interactive graphics programs are also available on space debris, the neutral atmosphere, radiation, magnetic fields, and the ionosphere. A user-friendly, informative interface is standard for all the models and includes a pop-up help window with information on inputs, outputs, and caveats. The system will eventually simplify mission analysis with analytical tools and deliver solutions for computationally intense graphical applications to explore "what if..." scenarios. A proposed plan for developing a repository of information from the Long Duration Exposure Facility (LDEF) for a user group is presented.
Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K. Sreekumaran; Sumner, Susan; Subramaniam, Shankar
2016-01-01
The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, along with metabolite standards, metabolite structures, protocols, tutorials, training materials, and other educational resources. It provides a computational platform to integrate, analyze, track, deposit, and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies, including mass spectrometry (MS) and nuclear magnetic resonance (NMR) data spanning over 20 different species and covering all the major taxonomic categories, including humans and other mammals, plants, insects, invertebrates, and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS- and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institutes of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world. PMID:26467476
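Deposited studies can also be reached programmatically. A minimal sketch, assuming a REST-style study-summary endpoint of the form shown (the path segments and the study identifier are assumptions used to illustrate the access pattern, not a documented contract):

```python
import requests

# Assumed REST-style endpoint shape, for illustration only.
BASE = "https://www.metabolomicsworkbench.org/rest"

def study_summary(study_id):
    """Fetch summary metadata for one deposited study."""
    resp = requests.get(f"{BASE}/study/study_id/{study_id}/summary")
    resp.raise_for_status()
    return resp.json()

print(study_summary("ST000001"))  # placeholder study identifier
```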
Persistent Identifiers for Field Expeditions: A Next Step for the US Oceanographic Research Fleet
NASA Astrophysics Data System (ADS)
Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen
2016-04-01
Oceanographic research cruises are complex affairs, typically requiring an extensive effort to secure the funding, plan the experiment, and mobilize the field party. Yet cruises are not typically published online as first-class digital objects with persistent, citable identifiers linked to the scientific literature. The Rolling Deck to Repository (R2R; info@rvdata.us) program maintains a master catalog of oceanographic cruises for the United States research fleet, currently documenting over 6,000 expeditions on 37 active and retired vessels. In 2015, R2R started routinely publishing a Digital Object Identifier (DOI) for each completed cruise. Cruise DOIs, in turn, are linked to related persistent identifiers where available, including the Open Researcher and Contributor ID (ORCID) for members of the science party, the International Geo Sample Number (IGSN) for physical specimens collected during the cruise, the Open Funder Registry (FundRef) codes for the funding that supported the experiment, and additional DOIs for datasets, journal articles, and other products resulting from the cruise. Publishing a persistent identifier for each field expedition will facilitate interoperability between the many different repositories that hold research products from cruises; will provide credit to the investigators who secured the funding and carried out the experiment; and will facilitate the gathering of fleet-wide altmetrics that demonstrate the broad impact of oceanographic research.
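The linking model amounts to one record per cruise that carries its persistent identifiers. A hedged sketch of such a record (all field names and identifier values are invented placeholders, not R2R's schema):

```python
# Illustrative cruise record linking persistent identifiers; every value
# below is a placeholder, not a real R2R identifier.
cruise = {
    "cruise_doi": "10.7284/900000",
    "vessel": "R/V Example",
    "science_party": [{"name": "A. Researcher",
                       "orcid": "0000-0000-0000-0000"}],
    "specimens": ["IGSN:AB0000001"],
    "funding": [{"funder_id": "10.13039/000000000", "award": "OCE-0000000"}],
    "products": ["10.1000/example-dataset-doi"],
}

# Any DOI-based identifier resolves through the same resolver pattern.
print("https://doi.org/" + cruise["cruise_doi"])
```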
Using CASE to Adopt Organizational Learning at NASA
NASA Technical Reports Server (NTRS)
Templeton, Gary F.
2003-01-01
The research direction was articulated in a statement of work created in collaboration between two program colleagues, an outside researcher and an internal user. The researcher was to deliver an implemented CASE tool (Casewise™) that was to be used to serve non-traditional (i.e., not software-development-related) organizational purposes. The explicitly stated functions of the tool were the support of 1) ISO-9000 compliance in the documentation of processes and 2) the management of process improvement. The collaborative team consisted of the researcher (GT), a full-time accompanying student (CRO), and the user (JD). The team originally focused on populating the CASE repository for the purpose of meeting the two primary objectives. Consistent with the action research approach, several additional user requirements emerged as the project evolved; needs became apparent in discussions about how the tool would be used to solve organizational problems. The following deliverables were contained within the CASE repository: 1) a paradigm diagram, 2) a context diagram, 3) child diagrams, 4) 73 issues relating to organizational change, and 5) a compendium of stakeholder interview transcripts. All record keeping was done manually and then keyed into the CASE interface. An issue is the difference between an organization's current situation (action) and its collective ideals.
NASA Biological Specimen Repository
NASA Technical Reports Server (NTRS)
Pietrzyk, Robert; McMonigal, K. A.; Sams, C. F.; Johnson, M. A.
2009-01-01
The NASA Biological Specimen Repository (NBSR) has been established to collect, process, annotate, store, and distribute specimens under the authority of the NASA/JSC Committee for the Protection of Human Subjects. The International Space Station (ISS) provides a platform to investigate the effects of microgravity on human physiology prior to lunar and exploration-class missions. The NBSR is a secure, controlled storage facility that is used to maintain biological specimens over extended periods of time, under well-controlled conditions, for future use in approved human spaceflight-related research protocols. The repository supports the Human Research Program, which is charged with identifying and investigating physiological changes that occur during human spaceflight and with developing and implementing effective countermeasures when necessary. The storage of crewmember samples from many different ISS flights in a single repository will be a valuable resource with which researchers can validate clinical hypotheses, study spaceflight-related changes, and investigate physiological markers. All samples collected require written informed consent from each long-duration crewmember. The NBSR collects blood and urine samples from all participating long-duration ISS crewmembers. These biological samples are collected pre-flight at approximately 45 days prior to launch; during flight on flight days 15, 30, 60, and 120; and within 2 weeks of landing. Postflight sessions are conducted 3 and 30 days following landing. The number of in-flight sessions depends on the duration of the mission. Operations began in 2007, and as of October 2009, 23 USOS crewmembers have completed or agreed to participate in this project. As currently planned, these human biological samples will be collected from crewmembers covering multiple ISS missions until the end of U.S. presence on the ISS or 2017. The NBSR will establish guidelines for sample distribution that are consistent with ethical principles, protection of crewmember confidentiality, prevailing laws and regulations, intellectual property policies, and consent form language. An NBSR Advisory Board composed of representatives of all participating agencies will be established to evaluate each request by an investigator for use of the samples to ensure the request reflects the mission of the NBSR.
NASA Astrophysics Data System (ADS)
Lafuente, B.; Stone, N.; Bristow, T.; Keller, R. M.; Blake, D. F.; Downs, R. T.; Pires, A.; Dateo, C. E.; Fonda, M.
2017-12-01
In development for nearly four years, the Open Data Repository's (ODR) Data Publisher software has become a useful tool for researchers' data needs. Data Publisher facilitates the creation of customized databases with flexible permission sets that allow researchers to share data collaboratively while improving data discovery and maintaining ownership rights. The open-source software provides an end-to-end solution from collection to final repository publication. A web-based interface allows researchers to enter data, view data, and conduct analysis using any programming language supported by JupyterHub (http://www.jupyterhub.org). This toolset makes it possible for a researcher to store and manipulate their data in the cloud from any internet-capable device. Data can be embargoed in the system until a date selected by the researcher; for instance, open publication can be set to a date that coincides with publication of the data analysis in a third-party journal. In conjunction with teams at NASA Ames and the University of Arizona, a number of pilot studies are being conducted to guide the software development so that it allows them to publish and share their data. These pilots include (1) the Astrobiology Habitable Environments Database (AHED), a central searchable repository designed to promote and facilitate the integration and sharing of all the data generated by the diverse disciplines in astrobiology; (2) a database containing the raw and derived data products from the CheMin instrument on the MSL rover Curiosity (http://odr.io/CheMin), featuring a versatile graphing system, instructions and analytical tools to process the data, and a capability to download data in different formats; and (3) the Mineral Evolution project, which, by correlating the diversity of mineral species with their ages, localities, and other measurable properties, aims to understand how the episodes of planetary accretion and differentiation, plate tectonics, and the origin of life led to a selective evolution of mineral species through changes in temperature, pressure, and composition. Ongoing development will complete the integration of third-party metadata standards and the publishing of data to the semantic web. This project is supported by the Science-Enabling Research Activity (SERA) and NASA NNX11AP82A, MSL.
Development of Pflotran Code for Waste Isolation Pilot Plant Performance Assessment
NASA Astrophysics Data System (ADS)
Zeitler, T.; Day, B. A.; Frederick, J.; Hammond, G. E.; Kim, S.; Sarathi, R.; Stein, E.
2017-12-01
The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. There is a current effort to enhance WIPP PA capabilities through the further development of the PFLOTRAN software, a state-of-the-art, massively parallel subsurface flow and reactive transport code. Benchmark testing of the individual WIPP-specific process models implemented in PFLOTRAN (e.g., gas generation, chemistry, creep closure, actinide transport, and waste form) has been performed, including results comparisons for PFLOTRAN and existing WIPP PA codes. Additionally, enhancements to the subsurface hydrologic flow model have been made. Repository-scale testing has also been performed for the modified PFLOTRAN code, and detailed results will be presented. Ultimately, improvements to the current computational environment will result in greater detail and flexibility in the repository model due to a move from a two-dimensional calculation grid to a three-dimensional representation. The result of the effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future for use in compliance recertification applications (CRAs) submitted to the EPA. Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy. SAND2017-8198A.
NASA Astrophysics Data System (ADS)
Gaich, A.; Deák, F.; Pötsch, M.
2012-12-01
The Hungarian National Radioactive Waste Repository is being built near the village of Bátaapáti. The program for the new disposal facility for low- and intermediate-level wastes (L/ILW) is conducted by PURAM (Public Limited Company for Radioactive Waste Management). The Bátaapáti underground research program began in February 2005 with the excavation of two inclined exploratory tunnels. These tunnels have 30 m between their axes, a 10% inclination, and a length of 1.7 km, and they have reached the 0 m Baltic sea level in the Mórágy Granite Formation. The safety of a nuclear repository is strongly influenced by ground behaviour and fracturing; hence, mapping of the geological features is of great importance. Because of the less stable ground, the cavern walls were shotcreted after every tunnelling advance. The site geologists were required to map the tunnel after every drill-and-blast cycle. The time interval was short and the documentation work was unrepeatable once the walls were shotcreted, so it was very important to use a modern, precise system to create 3D photorealistic models of the rock surfaces on the excavated tunnel walls. We chose the photogrammetric method because it offers adequate resolution and quality for photo-based 3D models. At the beginning we used the JointMetriX3D (JMX) system, and subsequently ShapeMetriX3D (SMX) in the repository chamber excavation phase. Geological mapping is performed from the acquired 3D images, as the system allows geometric information on visible discontinuities, such as dip and dip direction, to be measured directly. Descriptive rock mass parameters such as spacing, area, and roughness are instantly available. In this article we continue that research using a JMX model of a tunnel face from the "TSZV" access tunnel and an SMX model of a tunnel face from the "DEK" chamber. Our studies were carried out by field engineering geologists to further investigate the reproducibility of the photorealistic 3D models in both the JMX and SMX cases. Standard geotechnical rock mass classifications (Q, RMR, and GSI) were applied on the basis of the 3D models, without field experience of the given tunnel faces. All documentation was analysed with statistical methods, considering the circumstances of scanning and picturing. Each geologist defined the orientation of the main characteristic discontinuities, though some differences occurred; these discrepancies did not appear in the results of the geotechnical evaluation. In several cases the information provided by the 3D modelling systems proved very useful in different phases of the excavation works. This information was applied in geoscience research, for example in surface roughness determination, fracture-system modelling of the host rock, and the identification of geological or technical objects behind the shotcrete layer. Besides the above-mentioned advantages, we emphasize that the JMX and SMX systems provide contact-free acquisition and assessment of rock and terrain surfaces from metric, high-resolution 3D images within a very short time period.
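Extracting dip and dip direction from a fitted discontinuity plane is straightforward once the plane's unit normal is known; a minimal sketch of that standard structural-geology arithmetic (generic, not the JMX/SMX implementation):

```python
import math

def dip_and_dip_direction(nx, ny, nz):
    """Dip (degrees from horizontal) and dip direction (azimuth, clockwise
    from north) of a plane, given its unit normal in an east-north-up frame."""
    if nz < 0:  # use the upward-pointing normal
        nx, ny, nz = -nx, -ny, -nz
    dip = math.degrees(math.acos(nz))
    dip_dir = math.degrees(math.atan2(nx, ny)) % 360.0  # azimuth of steepest descent
    return dip, dip_dir

# A plane whose normal tilts toward north by 30 degrees dips 30 deg due north.
print(dip_and_dip_direction(0.0, 0.5, math.sqrt(0.75)))
```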
Nicephor[e]: a web-based solution for teaching forensic and scientific photography.
Voisard, R; Champod, C; Furrer, J; Curchod, J; Vautier, A; Massonnet, G; Buzzini, P
2007-04-11
Nicephor[e] is a project funded by the "Swiss Virtual Campus" that aims at creating a distance or blended web-based learning system in forensic and scientific photography and microscopy. The practical goal is to organize series of on-line modular courses corresponding to the educational requirements of undergraduate academic programs. Additionally, the program can be used in the context of continuing education. The architecture of the project is designed to guarantee a high level of knowledge in forensic and scientific photographic techniques, to allow easy content production, and to enable the creation of a number of different courses sharing the same content. The e-learning system Nicephor[e] consists of three parts. The first is a repository of learning objects that gathers all the theoretical subject matter of the project, such as texts, animations, images, and films. This repository is a web content management system (Typo3) that permits creating, publishing, and administering dynamic content via a web browser, as well as storing it in a database. The flexibility of the system's architecture allows the content to be updated easily to follow the development of photographic technology. The instructor of a course can decide which modular contents need to be included in the course and in which order they will be accessed by students. All the modular courses are developed in a learning management system (WebCT or Moodle) that can handle complex learning scenarios, content distribution, students, tests, and interaction with the instructor. Each course has its own learning scenario based on the goals of the course and the student's profile. The content of each course is taken from the content management system and is then structured in the learning management system according to the pedagogical goals defined by the instructor. The modular courses are created in a highly interactive setting and offer self-evaluating tests to the students. The last part of the system is a digital asset management system (Extensis Portfolio). The practical portion of each course is to produce images of different marks or objects. The collection of all this material, indexed by the students and corrected by the instructor, is essential to the development of a knowledge base of photographic techniques applied to a specific forensic subject. It also represents an extensible collection of different marks from known sources obtained under various conditions, and it allows these images to be reused in creating image-based case files.
Benefits of International Collaboration on the International Space Station
NASA Technical Reports Server (NTRS)
Robinson, Julie A.; Hasbrook, Pete; Tate Brown, Judy; Thumm, Tracy; Cohen, Luchino; Marcil, Isabelle; De Parolis, Lina; Hatton, Jason; Umezawa, Kazuo; Shirakawa, Masaki;
2017-01-01
The International Space Station is a valuable platform for research in space, but the benefits are limited if research is conducted only by individual countries. Through the efforts of the ISS Program Science Forum, international science working groups, and interagency cooperation, international collaboration on the ISS has expanded as ISS utilization has matured. Members of science teams benefit from working with counterparts in other countries. Scientists and institutions bring years of experience and specialized expertise to collaborative investigations, leading to new perspectives and approaches to scientific challenges. Combining new ideas and historical results brings synergy and improved peer-reviewed scientific methods and results. World-class research facilities can be expensive and logistically complicated, jeopardizing their full utilization. Experiments that would be prohibitively expensive for a single country can be achieved through contributions of resources from two or more countries, such as crew time, up- and downmass, and experiment hardware. Cooperation also avoids duplication of experiments and hardware among agencies. Biomedical experiments can be completed earlier if astronauts or cosmonauts from multiple agencies participate. Countries responding to natural disasters benefit from ISS imagery assets, even if the country has no space agency of its own. Students around the world participate in ISS educational opportunities and work with students in other countries through open curriculum packages and international competitions. Even experiments conducted by a single country can benefit scientists around the world through specimen-sharing programs and publicly accessible "open data" repositories. For ISS data, these repositories include GeneLab, the Physical Science Informatics System, and various Earth science data systems. Scientists can conduct new research using ISS data without having to launch and execute their own experiments. Multilateral collections of research results publications, maintained by the ISS international partnership and accessible via nasa.gov, make ISS results available worldwide and encourage new users, ideas, and research. The paper explores the effectiveness of international collaboration in the course of ISS Program execution. The collaboration's history, its evolution and maturation, the change of focus during its different phases, and the growth of its effectiveness (according to specially established criteria) are also considered in the paper in light of benefits for the entire ISS community. With the International Space Station extended through at least 2024, more crew time becoming available, and new facilities arriving on board the ISS, these benefits of international scientific collaboration on the ISS can only increase.
Software Tools Streamline Project Management
NASA Technical Reports Server (NTRS)
2009-01-01
Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include Erasmus, NASA's internal reporting system and project performance dashboard, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; the Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode, allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query-Based Document Management (QBDM), is a tool that enables content or context searches, either simple or hierarchical, across a variety of databases. The system enables users to specify notification subscriptions, in which they associate "contexts of interest" and "events of interest" with one or more documents or collections of documents. Based on these subscriptions, users receive notification when the events of interest occur within the contexts of interest for the associated documents or collections. Users can also associate at least one notification time with a subscription, with at least one option for the time period of notifications.
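The subscription model QBDM describes maps naturally onto a small data structure. A hedged sketch (class and field names are invented for illustration, not the QBDM implementation):

```python
from dataclasses import dataclass, field

@dataclass
class Subscription:
    """A QBDM-style notification subscription (illustrative names only)."""
    user: str
    documents: list           # document or collection identifiers
    contexts: set             # "contexts of interest", e.g. sections or topics
    events: set               # "events of interest", e.g. {"modified", "reviewed"}
    notify_times: list = field(default_factory=list)  # optional schedule

def matches(sub, doc_id, context, event):
    """Decide whether an event on a document should notify the subscriber."""
    return doc_id in sub.documents and context in sub.contexts and event in sub.events

sub = Subscription("jdoe", ["doc-42"], {"thermal-analysis"}, {"modified"})
print(matches(sub, "doc-42", "thermal-analysis", "modified"))  # True
```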
The GLOBE Program: Partnerships in Action
NASA Astrophysics Data System (ADS)
Henderson, S.; Kennedy, T.; Lemone, M.; Blurton, C.
2004-12-01
The GLOBE Program is a worldwide science and education partnership endeavor designed to increase scientific understanding of Earth as a system, support improved student achievement in science and math, and enhance environmental awareness through inquiry-based learning activities. GLOBE began on the premise that teachers and their students would partner with scientists to collect and analyze environmental data using specific protocols in five study areas - atmosphere, soils, hydrology, land cover, and phenology. As the GLOBE network grew, additional partnerships flourished, making GLOBE an unprecedented collaboration of individuals worldwide - primary, secondary, and tertiary students, teachers and teacher educators, scientists, government officials, and others - to improve K-12 education. Since its inception in 1994, more than one million students in over 14,000 schools around the world have taken part in The GLOBE Program. The GLOBE Web site (http://www.globe.gov) is the repository for over 11 million student-collected data measurements, easily accessible to students and scientists worldwide. Utilizing the advantages of the Internet for information sharing and communication, GLOBE has created an international community. GLOBE enriches students by giving them the knowledge and skills that they will need to become informed citizens and responsible decision-makers in an increasingly complex world. Understanding that all members of a community must support change if it is to be sustainable, GLOBE actively encourages the development of GLOBE Learning Communities (GLCs), which are designed to get diverse stakeholder groups involved in a local or regional environmental issue. Central to the GLC is the engagement of local schools. GLCs go beyond individual teachers implementing GLOBE in the isolation of their classrooms. Instead, the GLC brings multiple teachers and grade levels together to examine environmental issues, encouraging the participation of a broad range of community members who share a common commitment to supporting teachers and students in the implementation of GLOBE for the benefit of their community. A GLC might begin when a GLOBE Partner based at a university works with teachers and students from primary and secondary schools in the local school district and then branches out to include parents, youth clubs, scientists, senior citizens, other colleges and universities, daycare centers, museums, businesses, government agencies, and more. In the past decade, as the variety and diversity of partnerships within the GLOBE Program expanded, lessons have been learned that may be of use to other programs intent on implementing partnership programs to sustain systemic changes in K-12 Earth science education. This presentation will chronicle the GLOBE journey, including results of annual program evaluations.
Creating a learning organization to help meet the needs of multihospital health systems.
Ward, Angela; Berensen, Nannette; Daniels, Rowell
2018-04-01
The considerations that leaders of multihospital health systems must take into account in developing and implementing initiatives to build and maintain an exceptional pharmacy workforce are described. Significant changes that require constant individual and organizational learning are occurring throughout healthcare and within the profession of pharmacy. The considerations include understanding why it is important to have a succession plan and determining what types of education and training are important to support that plan. Other considerations include strategies for leveraging learners, dealing with a large geographic footprint, adjusting training opportunities to accommodate the ever-evolving demands on pharmacy staffs in terms of skill mix, and determining ways to either budget for or internally develop content for staff development. All of these methods are critically important to ensuring an optimized workforce. Especially for large health systems operating multiple sites across large distances, the use of technology-enabled solutions to provide effective delivery of programming to multiple sites is critical. Commonly used tools include live webinars, live "telepresence" programs, prerecorded programming that is available through an on-demand repository, and computer-based training modules. A learning management system is helpful to assign and document completion of educational requirements, especially those related to regulatory requirements (e.g., controlled substances management, sterile and nonsterile compounding, competency assessment). Creating and sustaining an environment where all pharmacy caregivers feel invested in and connected to ongoing learning is a powerful motivator for performance, engagement, and retention.
Disseminating Innovations in Teaching Value-Based Care Through an Online Learning Network.
Gupta, Reshma; Shah, Neel T; Moriates, Christopher; Wallingford, September; Arora, Vineet M
2017-08-01
A national imperative to provide value-based care requires new strategies to teach clinicians about high-value care. We developed a virtual online learning network aimed at disseminating emerging strategies in teaching value-based care. The online Teaching Value in Health Care Learning Network includes monthly webinars that feature selected innovators, online discussion forums, and a repository for sharing tools. The learning network comprises clinician-educators and health system leaders across North America. We conducted a cross-sectional online survey of all webinar presenters and the active members of the network, and we assessed program feasibility. Six months after the program launched, there were 277 learning community members in 22 US states. Of the 74 active members, 50 (68%) completed the evaluation. Active members represented independently practicing physicians and trainees in 7 specialties, nurses, educators, and health system leaders. Nearly all speakers reported that the learning network provided them with a unique opportunity to connect with a different audience and achieve greater recognition for their work. Of the members who were active in the learning network, most reported that strategies gleaned from the network were helpful, and some adopted or adapted these innovations at their home institutions. One year after the program launched, the learning network had grown to 364 total members. The learning network helped participants share and implement innovations to promote high-value care. The model can help disseminate innovations in emerging areas of health care transformation, and is sustainable without ongoing support after a period of start-up funding.
A Tool to Simulate the Transmission, Reception, and Execution of Interactive TV Applications
Kulesza, Raoni; Rodrigues, Thiago; Machado, Felipe A. L.; Santos, Celso A. S.
2017-01-01
The emergence of Interactive Digital Television (iDTV) opened a set of technological possibilities that go beyond those offered by conventional TV. Among these opportunities we can highlight interactive content that runs together with the linear TV program (a television service in which the viewer watches a scheduled TV program at the particular time it is offered and on the particular channel on which it is presented). However, developing interactive content for this new platform is not as straightforward as, for example, developing Internet applications. One of the options to make this development process easier and safer is to use an iDTV simulator. However, after investigating some of the existing iDTV simulation environments, we found a limitation: these simulators mainly present solutions focused on the TV receiver, whose interactive content must be loaded in advance by the programmer into a local repository (e.g., hard drive, USB). Therefore, in this paper, we propose a tool, named BiS (Broadcast iDTV content Simulator), which makes a broader simulation of interactive content possible. It allows simulating the transmission of interactive content along with the linear TV program (simulating the transmission of content over the air and in broadcast to the receivers). To enable this, we defined a generic and easy-to-customize communication protocol that was implemented in the tool. The proposed environment differs from others because it allows simulating the reception of both linear and interactive content while running Java applications to present that content. PMID:28280770
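The transmission side of such a simulator can be pictured as a small message protocol that interleaves application chunks with the linear stream. A hedged sketch (the message layout and transport choice are invented for illustration, not the protocol defined by the BiS tool):

```python
import json
import socket

def make_message(channel, kind, payload):
    """Frame one broadcast message; the field layout is an invented example."""
    return (json.dumps({"channel": channel, "kind": kind,
                        "payload": payload}) + "\n").encode()

def broadcast(messages, host="127.0.0.1", port=5000):
    """Send messages to a simulated receiver over UDP (broadcast stand-in)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for msg in messages:
        sock.sendto(msg, (host, port))
    sock.close()

# Linear frames and interactive-application chunks share one stream.
broadcast([
    make_message(7, "linear", "frame-0001"),
    make_message(7, "app-chunk", {"name": "quiz.jar", "seq": 0, "data": "..."}),
])
```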
Chan, R V Paul; Patel, Samir N; Ryan, Michael C; Jonas, Karyn E; Ostmo, Susan; Port, Alexander D; Sun, Grace I; Lauer, Andreas K; Chiang, Michael F
2015-01-01
To describe the design, implementation, and evaluation of a tele-education system developed to improve diagnostic competency in retinopathy of prematurity (ROP) by ophthalmology residents. A secure Web-based tele-education system was developed utilizing a repository of over 2,500 unique image sets of ROP. For each image set used in the system, a reference standard ROP diagnosis was established. Performance by ophthalmology residents (postgraduate years 2 to 4) from the United States and Canada in taking the ROP tele-education program was prospectively evaluated. Residents were presented with image-based clinical cases of ROP during a pretest, posttest, and training chapters. Accuracy and reliability of ROP diagnosis (e.g., plus disease, zone, stage, category) were determined using sensitivity, specificity, and kappa statistic calculations of the results from the pretest and posttest. Fifty-five ophthalmology residents were provided access to the ROP tele-education program. Thirty-one ophthalmology residents completed the program. When all training levels were analyzed together, a statistically significant increase was observed in sensitivity for the diagnosis of plus disease, zone, stage, category, and aggressive posterior ROP (P<.05). Statistically significant changes in specificity for identification of stage 2 or worse (P=.027) and pre-plus disease (P=.028) were observed. A tele-education system for ROP education is effective in improving diagnostic accuracy of ROP by ophthalmology residents. This system may have utility in the setting of both healthcare and medical education reform by creating a validated method to certify telemedicine providers and educate the next generation of ophthalmologists.
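As background for the outcome measures (these are the standard definitions, not formulas quoted from the paper), accuracy and reliability here reduce to

```latex
\mathrm{sensitivity} = \frac{TP}{TP + FN}, \qquad
\mathrm{specificity} = \frac{TN}{TN + FP}, \qquad
\kappa = \frac{p_o - p_e}{1 - p_e}
```

where TP, FN, TN, and FP count agreement with the reference standard diagnosis, p_o is the observed proportion of agreement, and p_e is the agreement expected by chance.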
A Framework for Integrating Oceanographic Data Repositories
NASA Astrophysics Data System (ADS)
Rozell, E.; Maffei, A. R.; Beaulieu, S. E.; Fox, P. A.
2010-12-01
Oceanographic research covers a broad range of science domains and requires a tremendous amount of cross-disciplinary collaboration. Advances in cyberinfrastructure are making it easier to share data across disciplines through the use of web services and community vocabularies. Best practices in the design of web services and vocabularies to support interoperability amongst science data repositories are only starting to emerge. Strategic design decisions in these areas are crucial to the creation of end-user data and application integration tools. We present S2S, a novel framework for deploying customizable user interfaces to support the search and analysis of data from multiple repositories. Our research methods follow the Semantic Web methodology and technology development process developed by Fox et al. This methodology stresses the importance of close scientist-technologist interactions when developing scientific use cases, keeping the project well scoped and ensuring the result meets a real scientific need. The S2S framework motivates the development of standardized web services with well-described parameters, as well as the integration of existing web services and applications in the search and analysis of data. S2S also encourages the use and development of community vocabularies and ontologies to support federated search and reduce the amount of domain expertise required in the data discovery process. S2S utilizes the Web Ontology Language (OWL) to describe the components of the framework, including web service parameters, and OpenSearch as a standard description for web services, particularly search services for oceanographic data repositories. We have created search services for an oceanographic metadata database, a large set of quality-controlled ocean profile measurements, and a biogeographic search service. S2S provides an application programming interface (API) that can be used to generate custom user interfaces, supporting data and application integration across these repositories and other web resources. Although initially targeted towards a general oceanographic audience, the S2S framework shows promise in many science domains, inspired in part by the broad disciplinary coverage of oceanography. This presentation will cover the challenges addressed by the S2S framework, the research methods used in its development, and the resulting architecture for the system. It will demonstrate how S2S is remarkably extensible, and can be generalized to many science domains. Given these characteristics, the framework can simplify the process of data discovery and analysis for the end user, and can help to shift the responsibility of search interface development away from data managers.
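The federated-search layer S2S builds on can be pictured as one templated query per repository with merged results. A minimal sketch with invented endpoints and parameter names (real S2S services are described via OpenSearch documents with OWL-described parameters; nothing below is the actual interface):

```python
import requests

# Placeholder OpenSearch-style query templates, one per repository.
ENDPOINTS = [
    "https://repo-a.example.org/search?q={terms}&format=json",
    "https://repo-b.example.org/find?query={terms}&fmt=json",
]

def federated_search(terms):
    """Query each repository's search service and merge the result lists."""
    results = []
    for template in ENDPOINTS:
        resp = requests.get(template.format(terms=terms))
        resp.raise_for_status()
        results.extend(resp.json().get("results", []))
    return results

print(len(federated_search("chlorophyll")))
```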
Evaluation of a 6-wire thermocouple psychrometer for determination of in-situ water potentials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loskot, C.L.; Rousseau, J.P.; Kurzmack, M.A.
1994-12-31
The US Geological Survey has been conducting investigations at Yucca Mountain, Nevada, to provide information about the hydrologic and geologic suitability of this site for storing high-level nuclear wastes in an underground mined repository. Test drilling and instrumentation are a principal method of investigation. The main objectives of the deep unsaturated-zone testhole program are: (1) to determine the flux of water moving through the unsaturated welded and nonwelded tuff units, (2) to determine the vertical and lateral distribution of moisture content, water potential, and other important geohydrologic characteristics in the rock units penetrated, and (3) to monitor stability and changes in in-situ fluid potentials with time. Thermocouple psychrometers will be used to monitor in-situ water potentials.
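Thermocouple psychrometers infer water potential from the equilibrium relative humidity of the air in the measurement cavity; the standard Kelvin relation (stated here as background, not quoted from the report) is

```latex
\psi = \frac{R\,T}{V_w}\,\ln\!\left(\frac{p}{p_0}\right)
```

where R is the gas constant, T the absolute temperature, V_w the partial molar volume of water, and p/p0 the relative humidity. Because p/p0 < 1 in unsaturated rock, the computed water potential is negative, and small humidity changes near saturation correspond to large changes in potential, which is why instrument stability matters for in-situ monitoring.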
Fifth NASA Goddard Conference on Mass Storage Systems and Technologies, Volume 1
NASA Technical Reports Server (NTRS)
Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)
1996-01-01
This document contains copies of those technical papers received in time for publication prior to the Fifth Goddard Conference on Mass Storage Systems and Technologies. As one of an ongoing series, this conference continues to serve as a unique medium for the exchange of information on topics relating to the ingestion and management of substantial amounts of data and the attendant problems involved. This year's discussion topics include storage architecture, database management, data distribution, file system performance and modeling, and optical recording technology. There will also be a paper on Application Programming Interfaces (API) for a Physical Volume Repository (PVR) defined in Version 5 of the Institute of Electrical and Electronics Engineers (IEEE) Reference Model (RM). In addition, there are papers on specific archives and storage products.
Biosecurity and Health Monitoring at the Zebrafish International Resource Center
Varga, Zoltán M.; Kent, Michael L.
2016-01-01
The Zebrafish International Resource Center (ZIRC) is a repository and distribution center for mutant, transgenic, and wild-type zebrafish. In recent years, annual imports of new zebrafish lines to ZIRC have increased tremendously. In addition, after 15 years of research, we have identified some of the most virulent pathogens affecting zebrafish, which should be avoided in large production facilities such as ZIRC. Therefore, while importing a high volume of new lines, we prioritize safeguarding the health of our in-house fish colony. Here, we describe the biosecurity and health-monitoring program implemented at ZIRC. This strategy was designed to prevent the introduction of new zebrafish pathogens, minimize pathogens already present in the facility, and ensure a healthy zebrafish colony for in-house use and for shipment to customers.
Corrosion Management of the Hanford High-Level Nuclear Waste Tanks
NASA Astrophysics Data System (ADS)
Beavers, John A.; Sridhar, Narasi; Boomer, Kayle D.
2014-03-01
The Hanford site is located in southeastern Washington State and stores more than 200,000 m3 (55 million gallons) of high-level radioactive waste resulting from the production and processing of plutonium. The waste is stored in large carbon steel tanks that were constructed between 1943 and 1986. The leak and structural integrity of the more recently constructed double-shell tanks must be maintained until the waste can be removed from the tanks and encapsulated in glass logs for final disposal in a repository. There are a number of corrosion-related threats to the waste tanks, including stress-corrosion cracking, pitting corrosion, and corrosion at the liquid-air interface and in the vapor space. This article summarizes the corrosion management program at Hanford to mitigate these threats.
Improved Access to NSF Funded Ocean Research Data
NASA Astrophysics Data System (ADS)
Chandler, C. L.; Groman, R. C.; Kinkade, D.; Shepherd, A.; Rauch, S.; Allison, M. D.; Gegg, S. R.; Wiebe, P. H.; Glover, D. M.
2015-12-01
Data from NSF-funded, hypothesis-driven research comprise an essential part of the research results upon which we base our knowledge and improved understanding of the impacts of climate change. Initially funded in 2006, the Biological and Chemical Oceanography Data Management Office (BCO-DMO) works with marine scientists to ensure that data from NSF-funded ocean research programs are fully documented and freely available for future use. BCO-DMO works in partnership with information technology professionals, other marine data repositories and national data archive centers to ensure long-term preservation of these valuable environmental research data. Data contributed to BCO-DMO by the original investigators are enhanced with sufficient discipline-specific documentation and published in a variety of standards-compliant forms designed to enable discovery and support accurate re-use.
Configuration Management File Manager Developed for Numerical Propulsion System Simulation
NASA Technical Reports Server (NTRS)
Follen, Gregory J.
1997-01-01
One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager, integrated with the Common Desktop Environment (CDE) window management system, provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, the CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with a simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.
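The sketch below is a hypothetical illustration of what such a generic, backend-adaptable CM API could look like; none of these class or method names come from NPSS. A backend-neutral interface is adapted by a toy local-directory repository:

```python
# Hypothetical sketch of a generic configuration-management API.
from abc import ABC, abstractmethod
from pathlib import Path

class CMRepository(ABC):
    """Backend-neutral configuration-management interface."""

    @abstractmethod
    def check_in(self, path: Path, note: str) -> str:
        """Store a file plus a textual note; return a revision id."""

    @abstractmethod
    def check_out(self, name: str, revision: str, dest: Path) -> None:
        """Copy the named file at the given revision to dest."""

class FileSystemRepository(CMRepository):
    """Toy adapter keeping numbered revisions under a local directory."""

    def __init__(self, root: Path) -> None:
        self.root = root

    def check_in(self, path: Path, note: str) -> str:
        rev_dir = self.root / path.name
        rev_dir.mkdir(parents=True, exist_ok=True)
        # Each revision stores one data file and one .note file.
        revision = str(len(list(rev_dir.glob("*.note"))) + 1)
        (rev_dir / revision).write_bytes(path.read_bytes())
        (rev_dir / (revision + ".note")).write_text(note)
        return revision

    def check_out(self, name: str, revision: str, dest: Path) -> None:
        dest.write_bytes((self.root / name / revision).read_bytes())

if __name__ == "__main__":
    import tempfile
    work = Path(tempfile.mkdtemp())
    sample = work / "engine.sim"
    sample.write_text("spool_speed = 0.95\n")
    repo = FileSystemRepository(work / "store")
    rev = repo.check_in(sample, "baseline simulation input")
    repo.check_out("engine.sim", rev, work / "engine_restored.sim")
```

The point of such a design is that tools built against the abstract interface work unchanged whether the backing store is a local directory or a company's proprietary CM repository.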
Development of intelligent semantic search system for rubber research data in Thailand
NASA Astrophysics Data System (ADS)
Kaewboonma, Nattapong; Panawong, Jirapong; Pianhanuruk, Ekkawit; Buranarach, Marut
2017-10-01
Rubber production in Thailand has increased not only because of strong demand from the world market but also through the Thai Government's replanting program, begun in 1961. With the continuous growth of rubber research data on the Web, searching for information has become a challenging task. Ontologies are used to improve the accuracy of information retrieval from the web by incorporating a degree of semantic analysis during the search. In this context, we propose an intelligent semantic search system for rubber research data in Thailand. The research methods included (1) analyzing domain knowledge, (2) developing ontologies, and (3) developing an intelligent semantic search system, so that research data curated in trusted digital repositories may be shared among the wider Thailand rubber research community.
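As an illustrative sketch of ontology-assisted retrieval (not the authors' implementation; the file name, namespace, and hasSubject property are assumptions), a SPARQL query can expand a search concept to all of its subclasses before matching indexed documents:

```python
# Illustrative ontology-assisted search: expand a concept to its
# subclasses via rdfs:subClassOf*, then match documents tagged with
# any concept in the expanded set. Names here are hypothetical.
import rdflib

g = rdflib.Graph()
g.parse("rubber_ontology.ttl", format="turtle")  # hypothetical ontology file

QUERY = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ex:   <http://example.org/rubber#>
SELECT DISTINCT ?doc ?label WHERE {
    ?concept rdfs:subClassOf* ex:RubberDisease .  # query-term expansion
    ?concept rdfs:label ?label .
    ?doc ex:hasSubject ?concept .                 # documents tagged by concept
}
"""

for row in g.query(QUERY):
    print(row.doc, row.label)
```

This is the "degree of semantic analysis" the abstract refers to: a search for a broad concept also retrieves documents indexed under its narrower terms, reducing the domain expertise the user must bring to the query.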
Downgrade of the Savannah River Site's FB-Line
DOE Office of Scientific and Technical Information (OSTI.GOV)
SADOWSKI, ED; YOURCHAK, RANDY; PRETZELLO, MARJI
2005-07-05
This paper will discuss the Safeguards & Security (S&S) activities that resulted in the downgrade of the Savannah River Site's FB-Line (FBL) from a Category I Material Balance Area (MBA) in a Material Access Area (MAA) to a Category IV MBA in a Property Protection Area (PPA). The Safeguards activities included measurement of final product items, transferal of nuclear material to other Savannah River Site (SRS) facilities, discard of excess nuclear material items, and final measurements of holdup material. The Security activities included relocation and destruction of classified documents and repositories, decertification of a classified computer, access control changes, updates to planning documents, deactivation and removal of security systems, Human Reliability Program (HRP) removals, and information security training for personnel that will remain in the FBL PPA.
Army Energy and Water Reporting System Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deprez, Peggy C.; Giardinelli, Michael J.; Burke, John S.
There are many areas of desired improvement for the Army Energy and Water Reporting System (AEWRS). The purpose of the system is to serve as a data repository for collecting information from energy managers, which is then compiled into an annual energy report. This document summarizes reported shortcomings of the system and provides several alternative approaches for improving application usability and adding functionality. The U.S. Army has been using AEWRS for many years to collect and compile energy data from installations to facilitate compliance with Federal and Department of Defense energy management program reporting requirements. In this analysis, staff from Pacific Northwest National Laboratory found that substantial opportunities exist to expand AEWRS functions to better assist the Army in managing its energy programs effectively. Army leadership must decide whether it wants to invest in expanding AEWRS capabilities as a web-based, enterprise-wide tool for improving the Army Energy and Water Management Program or simply maintain a bottom-up reporting tool. This report considers not only improvements to system functionality and user-friendliness from an operational perspective, but also the system's potential as a tool for increasing program effectiveness. The authors recommend making it easier for energy managers to input accurate data as the top priority for improving AEWRS, followed by improved reporting. The AEWRS user interface is dated and not user friendly, and a new system is recommended. While relatively minor improvements could make the existing system easier to use, significant gains require a user-friendly interface, a new architecture, and a design that permits scalability and reliability. An expanded data set would naturally require additional requirements gathering and a focus on integrating with other existing data sources, thus minimizing manually entered data.
Smith, Matthew Lee; Towne, Samuel D; Herrera-Venson, Angelica; Cameron, Kathleen; Kulinski, Kristie P; Lorig, Kate; Horel, Scott A; Ory, Marcia G
2017-06-14
Background: Alongside the dramatic increase of older adults in the United States (U.S.), it is projected that the aging population residing in rural areas will continue to grow. As the prevalence of chronic diseases and multiple chronic conditions among adults continues to rise, there is additional need for evidence-based interventions to assist the aging population to improve lifestyle behaviors and self-manage their chronic conditions. The purpose of this descriptive study was to identify the geospatial dissemination of Chronic Disease Self-Management Education (CDSME) Programs across the U.S. in terms of participants enrolled, workshops delivered, and counties reached. These dissemination characteristics were compared across rurality designations (i.e., metro areas, non-metro areas adjacent to metro areas, and non-metro areas not adjacent to metro areas). Methods: This descriptive study analyzed data from a national repository including efforts from 83 grantees spanning 47 states from December 2009 to December 2016. Counts were tabulated and averages were calculated. Results: CDSME Program workshops were delivered in 56.4% of all U.S. counties one or more times during the study period. Of the counties where a workshop was conducted, 50.5% were in non-metro areas. Of the 300,640 participants enrolled in CDSME Programs, 12% attended workshops in non-metro adjacent areas, and 7% attended workshops in non-metro non-adjacent areas. The majority of workshops were delivered in healthcare organizations, senior centers/Area Agencies on Aging, and residential facilities. On average, participants residing in non-metro areas had better workshop attendance and retention rates than participants in metro areas. Conclusions: Findings highlight the established role of traditional organizations/entities within the aging services network (e.g., senior centers) in reaching remote areas and serving diverse participants. To facilitate growth in rural areas, technical assistance will be needed. Additional efforts are needed to bolster partnerships (e.g., sharing resources and knowledge), marketing (e.g., tailored materials), and regular communication among stakeholders.