DOE Office of Scientific and Technical Information (OSTI.GOV)
Wall, Nathalie A.; Neeway, James J.; Qafoku, Nikolla P.
2015-09-30
Assessments of waste form and disposal options start with the degradation of the waste forms and consequent mobilization of radionuclides. Long-term static tests, single-pass flow-through tests, and the pressurized unsaturated flow test are often employed to study the durability of potential waste forms and to help create models that predict their durability throughout the lifespan of the disposal site. These tests involve the corrosion of the material in the presence of various leachants, with different experimental designs yielding desired information about the behavior of the material. Though these tests have proved instrumental in elucidating various mechanisms responsible for material corrosion, the chemical environment to which the material is subject is often not representative of a potential radioactive waste repository, where factors such as pH and leachant composition will be controlled by the near-field environment. Near-field materials include, but are not limited to, the original engineered barriers, their resulting corrosion products, backfill materials, and the natural host rock. For an accurate performance assessment of a nuclear waste repository, realistic waste corrosion experimental data ought to be modeled to allow for a better understanding of waste form corrosion mechanisms and the effect of the immediate geochemical environment on these mechanisms. Additionally, the migration of radionuclides in the resulting chemical environment during and after waste form corrosion must be quantified and the mechanisms responsible for migration understood. The goal of this research was to understand the mechanisms responsible for waste form corrosion in the presence of relevant repository sediments to allow for accurate radionuclide migration quantifications.
The rationale for this work is that a better understanding of waste form corrosion in relevant systems will enable increased reliance on waste form performance in repository environments and potentially decrease the need for expensive engineered barriers. The aims of our current work are 1) quantifying and understanding the processes associated with glass alteration in contact with Fe-bearing materials; 2) quantifying and understanding the processes associated with glass alteration in the presence of MgO (an example of an engineered barrier used at WIPP); 3) identifying glass alteration suppressants and the processes involved in reaching glass alteration suppression; 4) quantifying and understanding the processes associated with Saltstone and Cast Stone (SRS and Hanford cementitious waste forms) in various representative groundwaters; 5) investigating positron annihilation as a new tool for the study of glass alteration; and 6) quantifying and understanding the processes associated with glass alteration under gamma irradiation.
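The flow-through durability tests mentioned above are conventionally reported as normalized release rates. As a minimal illustrative sketch (the numerical values are hypothetical, not from this report), the standard single-pass flow-through form NR_i = C_i·q / (f_i·S) can be computed as:

```python
# Normalized release rate for a single-pass flow-through (SPFT) test.
# NR_i = C_i * q / (f_i * S), where:
#   c_i : steady-state concentration of element i in the effluent (g/m^3)
#   q   : solution flow rate (m^3/s)
#   f_i : mass fraction of element i in the waste form (unitless)
#   s   : waste form surface area (m^2)
# All values below are illustrative only.

def normalized_release_rate(c_i, q, f_i, s):
    """Return the normalized release rate of element i in g/(m^2 s)."""
    return c_i * q / (f_i * s)

rate = normalized_release_rate(c_i=0.5, q=2e-9, f_i=0.2, s=1e-3)
print(rate)  # 5e-06 g/(m^2 s)
```

A rate that stays constant as flow rate varies indicates the test is sampling forward-rate dissolution rather than solution-feedback-affected corrosion, which is why SPFT data are favored for rate-law parameterization.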
ERIC Educational Resources Information Center
Shor, Mikhael
2003-01-01
States that making game theory relevant and accessible to students is challenging. Describes the primary goal of GameTheory.net, which is to provide interactive teaching tools. Indicates the site strives to unite educators from economics, political and computer science, and ecology by providing a repository of lecture notes and tests for courses using…
Relevant Repositories of Public Knowledge? Libraries, Museums and Archives in "The Information Age"
ERIC Educational Resources Information Center
Usherwood, Bob; Wilson, Kerry; Bryson, Jared
2005-01-01
In a project funded by the AHRB, researchers at the University of Sheffield used a combination of quantitative and qualitative research methods to examine the perceived contemporary relevance of archives, libraries and museums. The research sought to discern how far the British people value access to these established repositories of public…
Developing the Tools for Geologic Repository Monitoring - Andra's Monitoring R and D Program - 12045
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buschaert, S.; Lesoille, S.; Bertrand, J.
2012-07-01
The French Safety Guide recommends that Andra develop a monitoring program to be implemented during repository construction and conducted until (and possibly after) closure, in order to confirm expected behavior and enhance knowledge of relevant processes. To achieve this, Andra has developed an overall monitoring strategy and identified specific technical objectives to inform disposal process management on evolutions relevant to both the long-term safety and reversible, pre-closure management of the repository. Andra has launched an ambitious R and D program to ensure that reliable, durable, metrologically qualified and tested monitoring systems will be available at the time of repository construction in order to respond to monitoring objectives. After four years of a specific R and D program, first observations are described and recommendations are proposed. The results derived from four years of Andra's R and D program allow three main observations to be shared. First, while other industries also invest in monitoring equipment, their obvious emphasis will always be on their specific requirements and needs, thus often only providing a partial match with repository requirements. Examples can be found for all available sensors, which are generally not resistant to radiation. Second, the very close scrutiny anticipated for the geologic disposal process is likely to place an unprecedented emphasis on the quality of monitoring results. It therefore seems important to emphasize specific developments with an aim at providing metrologically qualified systems. Third, adapting existing technology to specific repository needs, and providing adequate proof of its worth, is a lengthy process.
In conclusion, it therefore seems prudent to plan ahead and to invest wisely in the adequate development of those monitoring tools that will likely be needed in the repository to respond to the implementers' and regulators' requirements, including those agreed and developed to respond to potential stakeholder expectations. (authors)
Analysis of model output and science data in the Virtual Model Repository (VMR).
NASA Astrophysics Data System (ADS)
De Zeeuw, D.; Ridley, A. J.
2014-12-01
Big scientific data includes not only large repositories of data from scientific platforms such as satellites and ground observatories, but also the vast output of numerical models. The Virtual Model Repository (VMR) provides scientific analysis and visualization tools for many numerical models of the Earth-Sun system. Individual runs can be analyzed in the VMR and compared to relevant data identified through metadata, but larger collections of runs can also now be studied and statistics generated on the accuracy and tendencies of model output. The vast model repository at the CCMC, with over 1000 simulations of the Earth's magnetosphere, was used to look at overall trends in accuracy when compared to satellites such as GOES, Geotail, and Cluster. Methodology for this analysis as well as case studies will be presented.
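The abstract does not specify which accuracy statistics were computed; a common minimal approach for model-satellite comparisons, sketched here with made-up series, is the RMSE together with a prediction-efficiency skill score:

```python
import math

def rmse(model, obs):
    """Root-mean-square error between paired model and observed samples."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def prediction_efficiency(model, obs):
    """Skill score 1 - MSE/Var(obs): 1 is a perfect model,
    <= 0 means the model does no better than the observed mean."""
    mean_obs = sum(obs) / len(obs)
    mse = sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs)
    var = sum((o - mean_obs) ** 2 for o in obs) / len(obs)
    return 1 - mse / var

# Invented observation/model pairs, purely for illustration
obs = [1.0, 2.0, 3.0, 4.0]
model = [1.1, 1.9, 3.2, 3.8]
print(rmse(model, obs))                  # ~0.158
print(prediction_efficiency(model, obs)) # ~0.98
```

Aggregating such scores over many runs is what makes statistics on the "accuracy and tendencies of model output" possible across a large repository.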
Retrieval analysis of motion preserving spinal devices and periprosthetic tissues
Kurtz, Steven M.; Steinbeck, Marla; Ianuzzi, Allyson; van Ooij, André; Punt, Ilona M.; Isaza, Jorge; Ross, E.R.S.
2009-01-01
This article reviews certain practical aspects of retrieval analysis for motion preserving spinal implants and periprosthetic tissues as an essential component of the overall revision strategy for these implants. At our institution, we established an international repository for motion-preserving spine implants in 2004. Our repository is currently open to all spine surgeons, and is intended to be inclusive of all cervical and lumbar implant designs such as artificial discs and posterior dynamic stabilization devices. Although a wide range of alternative materials is being investigated for nonfusion spine implants, many of the examples in this review are drawn from our existing repository of metal-on-polyethylene, metal-on-metal lumbar total disc replacements (TDRs), and polyurethane-based dynamic motion preservation devices. These devices are already approved or nearing approval for use in the United States, and hence are the most clinically relevant at the present time. This article summarizes the current literature on the retrieval analysis of these implants and concludes with recommendations for the development of new test methods that are based on the current state of knowledge of in vivo wear and damage mechanisms. Furthermore, the relevance and need to evaluate the surrounding tissue to obtain a complete understanding of the biological reaction to implant component corrosion and wear is reviewed. PMID:25802641
Steinkampf, W.C.
2000-01-01
Yucca Mountain, located ~100 mi northwest of Las Vegas, Nevada, has been designated by Congress as a site to be characterized for a potential mined geologic repository for high-level radioactive waste. This field trip will examine the regional geologic and hydrologic setting for Yucca Mountain, as well as specific results of the site characterization program. The first day focuses on the regional setting with emphasis on current and paleo hydrology, which are both of critical concern for predicting future performance of a potential repository. Morning stops will be in southern Nevada and afternoon stops will be in Death Valley. The second day will be spent at Yucca Mountain. The field trip will visit the underground testing sites in the "Exploratory Studies Facility" and the "Busted Butte Unsaturated Zone Transport Field Test" plus several surface-based testing sites. Much of the work at the site has concentrated on studies of the unsaturated zone, an element of the hydrologic system that historically has received little attention. Discussions during the second day will comprise selected topics of Yucca Mountain geology, hydrology and geochemistry and will include the probabilistic volcanic hazard analysis and the seismicity and seismic hazard in the Yucca Mountain area. Evening discussions will address modeling of regional groundwater flow, the results of recent hydrologic studies by the Nye County Nuclear Waste Program Office, and the relationship of the geology and hydrology of Yucca Mountain to the performance of a potential repository. Day 3 will examine the geologic framework and hydrology of the Pahute Mesa-Oasis Valley Groundwater Basin and then will continue to Reno via Hawthorne, Nevada and the Walker Lake area.
Preliminary safety evaluation of an aircraft impact on a near-surface radioactive waste repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lo Frano, R.; Forasassi, G.; Pugliese, G.
2013-07-01
The aircraft impact accident has become very significant in the design of nuclear facilities, particularly after the tragic September 2001 events, which raised public concern about the potential damaging effects that the impact of a large civilian airplane could have on safety-relevant structures. The aim of this study is therefore to preliminarily evaluate the global response and the structural effects induced by the impact of a military or commercial airplane (currently considered a 'beyond design basis' event) on a near-surface radioactive waste (RW) disposal facility. The safety evaluation was carried out according to the international safety and design guidelines and in agreement with the stress test requirements for the security track. To achieve this purpose, a layout and a scheme of a possible near-surface repository, such as the El Cabril facility, were taken into account. In order to preliminarily perform a reliable analysis of such a large-scale structure and to determine the structural effects induced by such types of impulsive loads, a realistic, but still operable, numerical model with suitable material characteristics was implemented by means of FEM codes. In the structural analyses carried out, the RW repository was considered a 'robust' target, owing to its thick walls and main constitutive materials (steel and reinforced concrete). In addition, to adequately represent the dynamic response of the repository under crashing, relevant physical phenomena (i.e. penetration, spalling, etc.) were simulated and analysed. The preliminary assessment of the effects induced by the dynamic/impulsive loads generally allowed verification of the residual strength capability of the repository considered.
The preliminary results obtained highlighted a remarkable potential to withstand the impact of a military or large commercial aircraft, even in the presence of ongoing progressive concrete failure (some penetration and spalling of the concrete wall) in the impacted area. (authors)
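The abstract does not state which impact load model fed the FEM analyses; a widely used approach for aircraft crash loading on nuclear structures is the Riera force function, in which the force at any instant is the fuselage crushing load plus the momentum flux of the impinging mass. A sketch with purely illustrative numbers (not from the paper):

```python
def riera_force(crush_load, mass_per_length, velocity):
    """Riera impact force F(t) = P_c + mu * v^2 at one instant.
    crush_load      : crushing (buckling) load of the fuselage section (N)
    mass_per_length : mass per unit length at the crushing front (kg/m)
    velocity        : current impact velocity (m/s)
    """
    return crush_load + mass_per_length * velocity ** 2

# Illustrative values only: a 5 MN crushing load, 300 kg/m fuselage
# section, arriving at 100 m/s.
f = riera_force(crush_load=5.0e6, mass_per_length=300.0, velocity=100.0)
print(f)  # 8000000.0 N at this instant
```

In a full analysis, the crushing load and mass distribution vary along the aircraft, and integrating the deceleration over time yields the complete force-time history applied to the target.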
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voelzke, Holger; Nieslony, Gregor; Ellouz, Manel
Since the license for the Konrad repository was finally confirmed by legal decision in 2007, the Federal Office for Radiation Protection (BfS) has been performing further planning and preparation work to prepare the repository for operation. Waste conditioning and packaging has been continued by different waste producers, such as the nuclear industry and federal research institutes, on the basis of the official disposal requirements. The necessary prerequisites for this are approved containers as well as certified waste conditioning and packaging procedures. The Federal Institute for Materials Research and Testing (BAM) is responsible for container design testing and evaluation of quality assurance measures on behalf of BfS under consideration of the Konrad disposal requirements. Besides assessing container handling stability (stacking tests, handling loads), design testing procedures are performed that include fire tests (800 deg. C, 1 hour) and drop tests from different heights and in different drop orientations. This paper presents the current state of BAM's design testing experience with the relevant container types (box-shaped, cylindrical) made of steel sheets, ductile cast iron or concrete. It explains the usual testing and evaluation methods, which range from experimental testing to analytical and numerical calculations. Another focus has been placed on already existing containers and packages. The question arises as to how they can be evaluated properly, especially with respect to incomplete safety assessment and fabrication documentation. At present BAM is working on numerous applications for container design testing for the Konrad repository. Some licensing procedures were successfully finished in the past, and BfS has certified several container types (steel sheet, concrete, and cast iron) which are now available for waste packaging for final disposal.
However, large quantities of radioactive waste had been placed into interim storage in containers which are not yet licensed for the Konrad repository. Safety assessment of these so-called 'old' containers is a big challenge for all parties because documentation about container design testing and fabrication often contains gaps or has not yet been completed. Appropriate solution strategies are currently under development and discussion. Furthermore, BAM has successfully initiated and established an information forum, called 'ERFA QM Konrad Containers', which facilitates discussions on various issues of common interest with respect to Konrad container licensing procedures as well as the interpretation of disposal requirements under consideration of operational needs. Thus, it provides additional, valuable support for container licensing procedures. (authors)
10 CFR 51.67 - Environmental information concerning geologic repositories.
Code of Federal Regulations, 2011 CFR
2011-01-01
... if it makes a substantial change in its proposed action that is relevant to environmental concerns or... 10 Energy 2 2011-01-01 2011-01-01 false Environmental information concerning geologic repositories. 51.67 Section 51.67 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) ENVIRONMENTAL PROTECTION...
Levich, R.A.; Linden, R.M.; Patterson, R.L.; Stuckless, J.S.
2000-01-01
Yucca Mountain, located ~100 mi northwest of Las Vegas, Nevada, has been designated by Congress as a site to be characterized for a potential mined geologic repository for high-level radioactive waste. This field trip will examine the regional geologic and hydrologic setting for Yucca Mountain, as well as specific results of the site characterization program. The first day focuses on the regional setting with emphasis on current and paleo hydrology, which are both of critical concern for predicting future performance of a potential repository. Morning stops will be in southern Nevada and afternoon stops will be in Death Valley. The second day will be spent at Yucca Mountain. The field trip will visit the underground testing sites in the "Exploratory Studies Facility" and the "Busted Butte Unsaturated Zone Transport Field Test" plus several surface-based testing sites. Much of the work at the site has concentrated on studies of the unsaturated zone, an element of the hydrologic system that historically has received little attention. Discussions during the second day will comprise selected topics of Yucca Mountain geology, hydrology and geochemistry and will include the probabilistic volcanic hazard analysis and the seismicity and seismic hazard in the Yucca Mountain area. Evening discussions will address modeling of regional groundwater flow, the results of recent hydrologic studies by the Nye County Nuclear Waste Program Office, and the relationship of the geology and hydrology of Yucca Mountain to the performance of a potential repository. Day 3 will examine the geologic framework and hydrology of the Pahute Mesa-Oasis Valley Groundwater Basin and then will continue to Reno via Hawthorne, Nevada and the Walker Lake area.
Role of natural analogs in performance assessment of nuclear waste repositories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sagar, B.; Wittmeyer, G.W.
1995-09-01
Mathematical models of the flow of water and transport of radionuclides in porous media will be used to assess the ability of deep geologic repositories to safely contain nuclear waste. These models must, in some sense, be validated to ensure that they adequately describe the physical processes occurring within the repository and its geologic setting. Inasmuch as the spatial and temporal scales over which these models must be applied in performance assessment are very large, validation of these models against laboratory and small-scale field experiments may be considered inadequate. Natural analogs may provide validation data that are representative of physico-chemical processes that occur over spatial and temporal scales as large as or larger than those relevant to repository design. The authors discuss the manner in which natural analog data may be used to increase confidence in performance assessment models and conclude that, while these data may be suitable for testing the basic laws governing flow and transport, there is insufficient control of boundary and initial conditions and forcing functions to permit quantitative validation of complex, spatially distributed flow and transport models. The authors also express their opinion that collecting adequate data from natural analogs will require devoting far greater resources to them than at present.
A Safety Case Approach for Deep Geologic Disposal of DOE HLW and DOE SNF in Bedded Salt - 13350
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevougian, S. David; MacKinnon, Robert J.; Leigh, Christi D.
2013-07-01
The primary objective of this study is to investigate the feasibility and utility of developing a defensible safety case for disposal of United States Department of Energy (U.S. DOE) high-level waste (HLW) and DOE spent nuclear fuel (SNF) in a conceptual deep geologic repository that is assumed to be located in a bedded salt formation of the Delaware Basin [1]. A safety case is a formal compilation of evidence, analyses, and arguments that substantiate and demonstrate the safety of a proposed or conceptual repository. We conclude that a strong initial safety case for potential licensing can be readily compiled by capitalizing on the extensive technical basis that exists from prior work on the Waste Isolation Pilot Plant (WIPP), other U.S. repository development programs, and the work published through international efforts in salt repository programs such as in Germany. The potential benefits of developing a safety case include leveraging previous investments in WIPP to reduce future new repository costs, enhancing the ability to effectively plan for a repository and its licensing, and possibly expediting a schedule for a repository. A safety case will provide the necessary structure for organizing and synthesizing existing salt repository science and identifying any issues and gaps pertaining to safe disposal of DOE HLW and DOE SNF in bedded salt. The safety case synthesis will help DOE to plan its future R and D activities for investigating salt disposal using a risk-informed approach that prioritizes test activities that include laboratory, field, and underground investigations. It should be emphasized that the DOE has not made any decisions regarding the disposition of DOE HLW and DOE SNF. Furthermore, the safety case discussed herein is not intended to either site a repository in the Delaware Basin or preclude siting in other media at other locations.
Rather, this study simply presents an approach for accelerated development of a safety case for a potential DOE HLW and DOE SNF repository using the currently available technical basis for bedded salt. This approach includes a summary of the regulatory environment relevant to disposal of DOE HLW and DOE SNF in a deep geologic repository, the key elements of a safety case, the evolution of the safety case through the successive phases of repository development and licensing, and the existing technical basis that could be used to substantiate the safety of a geologic repository if it were to be sited in the Delaware Basin. We also discuss the potential role of an underground research laboratory (URL). (authors)
Biological Web Service Repositories Review
Urdidiales‐Nieto, David; Navas‐Delgado, Ismael
2016-01-01
Web services play a key role in bioinformatics enabling the integration of database access and analysis of algorithms. However, Web service repositories do not usually publish information on the changes made to their registered Web services. Dynamism is directly related to the changes in the repositories (services registered or unregistered) and at service level (annotation changes). Thus, users, software clients or workflow based approaches lack enough relevant information to decide when they should review or re‐execute a Web service or workflow to get updated or improved results. The dynamism of the repository could be a measure for workflow developers to re‐check service availability and annotation changes in the services of interest to them. This paper presents a review on the most well‐known Web service repositories in the life sciences including an analysis of their dynamism. Freshness is introduced in this paper, and has been used as the measure for the dynamism of these repositories. PMID:27783459
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swanson, Juliet S.; Cherkouk, Andrea; Arnold, Thuro
This report summarizes the potential role of microorganisms in salt-based nuclear waste repositories using available information on the microbial ecology of hypersaline environments, the bioenergetics of survival under high ionic strength conditions, and “repository microbiology” related studies. In areas where microbial activity is in question, there may be a need to shift the research focus toward feasibility studies rather than studies that generate actual input for performance assessments. In areas where activity is not necessary to affect performance (e.g., biocolloid transport), repository-relevant data should be generated. Both approaches will lend a realistic perspective to a safety case/performance scenario that will most likely underscore the conservative value of that case.
Wang, Liqin; Haug, Peter J; Del Fiol, Guilherme
2017-05-01
Mining disease-specific associations from existing knowledge resources can be useful for building disease-specific ontologies and supporting knowledge-based applications. Many association mining techniques have been exploited. However, the challenge remains that the extracted associations contain much noise. It is unreliable to determine the relevance of an association by simply setting arbitrary cut-off points on multiple scores of relevance, and it would be expensive to ask human experts to manually review a large number of associations. We propose that machine-learning-based classification can be used to separate the signal from the noise, and to provide a feasible approach to create and maintain disease-specific vocabularies. We initially focused on disease-medication associations for the purpose of simplicity. For a disease of interest, we extracted potentially treatment-related drug concepts from biomedical literature citations and from a local clinical data repository. Each concept was associated with multiple measures of relevance (i.e., features) such as frequency of occurrence. For the purpose of machine learning, we formed nine datasets for three diseases, with each disease having two single-source datasets and one combining the two. All the datasets were labeled using existing reference standards. Thereafter, we conducted two experiments: (1) to test if adding features from the clinical data repository would improve the performance of classification achieved using features from the biomedical literature only, and (2) to determine if classifiers trained with known medication-disease datasets would be generalizable to new diseases. Simple logistic regression and LogitBoost were the two classifiers identified as the preferred models, for the biomedical-literature datasets and the combined datasets respectively.
The performance of the classification using combined features provided significant improvement beyond that using biomedical-literature features alone (p-value<0.001). The performance of the classifier built from known diseases to predict associated concepts for new diseases showed no significant difference from the performance of the classifier built and tested using the new disease's dataset. It is feasible to use classification approaches to automatically predict the relevance of a concept to a disease of interest. It is useful to combine features from disparate sources for the task of classification. Classifiers built from known diseases were generalizable to new diseases. Copyright © 2017 Elsevier Inc. All rights reserved.
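As a rough illustration of the classification approach described (synthetic data and hypothetical feature names, not the authors' datasets), a logistic regression over two relevance features might look like:

```python
# Sketch: logistic regression separating treatment-related drug concepts
# from noise, using two invented relevance features per concept:
# [log literature frequency, log clinical-repository frequency].
from sklearn.linear_model import LogisticRegression

X = [[5.0, 4.0], [4.5, 3.8], [4.0, 4.2],   # labeled relevant associations
     [0.5, 0.2], [1.0, 0.4], [0.2, 0.8]]   # labeled noise
y = [1, 1, 1, 0, 0, 0]

clf = LogisticRegression().fit(X, y)
# Predict relevance of two unseen concepts (clearly separable by design)
print(clf.predict([[4.8, 4.1], [0.3, 0.5]]))  # [1 0]
```

In the study's setting the feature vector would combine many relevance scores from both sources, which is what allowed the combined-feature classifier to outperform the literature-only one.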
Report on International Collaboration Involving the FE Heater and HG-A Tests at Mont Terri
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houseworth, Jim; Rutqvist, Jonny; Asahina, Daisuke
Nuclear waste programs outside of the US have focused on different host rock types for geological disposal of high-level radioactive waste. Several countries, including France, Switzerland, Belgium, and Japan are exploring the possibility of waste disposal in shale and other clay-rich rock that fall within the general classification of argillaceous rock. This rock type is also of interest for the US program because the US has extensive sedimentary basins containing large deposits of argillaceous rock. LBNL, as part of the DOE-NE Used Fuel Disposition Campaign, is collaborating on some of the underground research laboratory (URL) activities at the Mont Terri URL near Saint-Ursanne, Switzerland. The Mont Terri project, which began in 1995, has developed a URL at a depth of about 300 m in a stiff clay formation called the Opalinus Clay. Our current collaboration efforts include two test modeling activities for the FE heater test and the HG-A leak-off test. This report documents results concerning our current modeling of these field tests. The overall objectives of these activities include an improved understanding of and advanced relevant modeling capabilities for EDZ evolution in clay repositories and the associated coupled processes, and to develop a technical basis for the maximum allowable temperature for a clay repository.
Biological Web Service Repositories Review.
Urdidiales-Nieto, David; Navas-Delgado, Ismael; Aldana-Montes, José F
2017-05-01
Web services play a key role in bioinformatics enabling the integration of database access and analysis of algorithms. However, Web service repositories do not usually publish information on the changes made to their registered Web services. Dynamism is directly related to the changes in the repositories (services registered or unregistered) and at service level (annotation changes). Thus, users, software clients or workflow based approaches lack enough relevant information to decide when they should review or re-execute a Web service or workflow to get updated or improved results. The dynamism of the repository could be a measure for workflow developers to re-check service availability and annotation changes in the services of interest to them. This paper presents a review on the most well-known Web service repositories in the life sciences including an analysis of their dynamism. Freshness is introduced in this paper, and has been used as the measure for the dynamism of these repositories. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
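The paper's exact definition of freshness is not given in this abstract; one plausible illustrative interpretation, sketched here with invented dates, measures the fraction of registered services whose annotations changed within a recent window:

```python
# Illustrative freshness metric for a Web service repository
# (an assumption for illustration, NOT the paper's exact definition).
from datetime import date

def freshness(last_change_dates, today, window_days=365):
    """Fraction of registered services whose registration or annotations
    changed within the last `window_days` days."""
    recent = sum(1 for d in last_change_dates if (today - d).days <= window_days)
    return recent / len(last_change_dates)

# Invented last-change dates for four hypothetical registered services
changes = [date(2016, 5, 1), date(2014, 3, 2), date(2016, 11, 20), date(2010, 1, 1)]
print(freshness(changes, today=date(2017, 1, 1)))  # 0.5
```

A workflow developer could use such a score to decide how often the services behind a pipeline need to be re-checked for availability or annotation changes.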
Wei, Wei; Ji, Zhanglong; He, Yupeng; Zhang, Kai; Ha, Yuanchi; Li, Qi; Ohno-Machado, Lucila
2018-01-01
The number and diversity of biomedical datasets grew rapidly in the last decade. A large number of datasets are stored in various repositories, with different formats. Existing dataset retrieval systems lack the capability of cross-repository search. As a result, users spend time searching datasets in known repositories, and they typically do not find new repositories. The biomedical and healthcare data discovery index ecosystem (bioCADDIE) team organized a challenge to solicit new indexing and searching strategies for retrieving biomedical datasets across repositories. We describe the work of one team that built a retrieval pipeline and examined its performance. The pipeline used online resources to supplement dataset metadata, automatically generated queries from users’ free-text questions, produced high-quality retrieval results and achieved the highest inferred Normalized Discounted Cumulative Gain among competitors. The results showed that it is a promising solution for cross-database, cross-domain and cross-repository biomedical dataset retrieval. Database URL: https://github.com/w2wei/dataset_retrieval_pipeline PMID:29688374
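The evaluation metric named above, Normalized Discounted Cumulative Gain, can be sketched in its standard (non-inferred) form; the graded relevance lists below are invented for illustration:

```python
import math

def dcg(relevances):
    """Discounted cumulative gain of a ranked list of graded relevances:
    each item's gain is discounted by log2 of its (1-based) rank + 1."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def ndcg(relevances):
    """NDCG: DCG normalized by the DCG of the ideal (descending) ordering."""
    ideal = sorted(relevances, reverse=True)
    return dcg(relevances) / dcg(ideal)

print(ndcg([2, 1, 0]))  # ideal ordering -> 1.0
print(ndcg([2, 0, 1]))  # relevant item demoted -> below 1.0
```

The "inferred" variant used in the bioCADDIE challenge estimates this quantity from a sampled subset of relevance judgments rather than a fully judged ranking, but the ranking intuition is the same.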
ERIC Educational Resources Information Center
Park, Jung-ran; Yang, Chris; Tosaka, Yuji; Ping, Qing; Mimouni, Houda El
2016-01-01
This study is a part of the larger project that develops a sustainable digital repository of professional development resources on emerging data standards and technologies for data organization and management in libraries. Toward that end, the project team developed an automated workflow to crawl for, monitor, and classify relevant web objects…
10 CFR 60.44 - Changes, tests, and experiments.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORIES Licenses License Issuance and Amendment § 60.44 Changes, tests, and experiments. (a)(1) Following authorization to receive and possess source, special nuclear, or byproduct material at a geologic repository operations area, the DOE may (i) make changes in the geologic repository operations area as described in the...
Bleda, Marta; Tarraga, Joaquin; de Maria, Alejandro; Salavert, Francisco; Garcia-Alonso, Luz; Celma, Matilde; Martin, Ainoha; Dopazo, Joaquin; Medina, Ignacio
2012-07-01
During the past years, the advances in high-throughput technologies have produced an unprecedented growth in the number and size of repositories and databases storing relevant biological data. Today, there is more biological information than ever but, unfortunately, the current status of many of these repositories is far from optimal. Some of the most common problems are that the information is spread out across many small databases; standards frequently differ among repositories; and some databases are no longer supported or contain overly specific, unconnected information. In addition, data size is increasingly becoming an obstacle when accessing or storing biological data. All these issues make it very difficult to extract and integrate information from different sources, to analyze experiments or to access and query this information in a programmatic way. CellBase provides a solution to the growing need for integration by easing access to biological data. CellBase implements a set of RESTful web services that query a centralized database containing the most relevant biological data sources. The database is hosted on our servers and is regularly updated. CellBase documentation can be found at http://docs.bioinfo.cipf.es/projects/cellbase.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierce, Eric M.
2007-09-16
To predict the long-term fate of low- and high-level waste forms in the subsurface over geologic time scales, it is important to understand the behavior of the corroding waste forms under conditions that mimic the open flow and transport properties of a subsurface repository. Fluidized bed steam reformation (FBSR), a supplemental treatment technology option, is being considered as a waste form for the immobilization of low-activity tank waste. To obtain the fundamental information needed to evaluate the behavior of the FBSR waste form under repository-relevant conditions and to monitor the long-term behavior of this material, an accelerated weathering experiment is being conducted with the pressurized unsaturated flow (PUF) apparatus. Unlike other accelerated weathering test methods (product consistency test, vapor hydration test, and drip test), PUF experiments are conducted under hydraulically unsaturated conditions. These experiments are unique because they mimic the vadose zone environment and allow the corroding waste form to achieve its final reaction state. Results from this ongoing experiment suggest the volumetric water content varied as a function of time and reached steady state after 160 days of testing. Unlike the volumetric water content, periodic excursions in the solution pH and electrical conductivity have been occurring consistently during the test. Release of elements from the column illustrates a general trend of decreasing concentration with increasing reaction time. Normalized concentrations of K, Na, P, Re (a chemical analogue for ⁹⁹Tc), and S are as much as 1 × 10⁴ times greater than those of Al, Cr, Si, and Ti. After more than 600 days of testing, the solution chemistry data collected to date illustrate the importance of understanding the long-term behavior of the FBSR product under conditions that mimic the open flow and transport properties of a subsurface repository.
FY94 CAG trip reports, CAG memos and other products: Volume 2. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-12-15
The Yucca Mountain Site Characterization Project (YMP) of the US DOE is tasked with designing, constructing, and operating an Exploratory Studies Facility (ESF) at Yucca Mountain, Nevada. The purpose of the YMP is to provide detailed characterization of the Yucca Mountain site for the potential mined geologic repository for permanent disposal of high-level radioactive waste. Detailed characterization of the properties of the site is to be conducted through a wide variety of short-term and long-term in-situ tests. Testing methods require the installation of a large number of test instruments and sensors with a variety of functions. These instruments produce analog and digital data that must be collected, processed, stored, and evaluated in an attempt to predict performance of the repository. The Integrated Data and Control System (IDCS) is envisioned as a distributed data acquisition system that electronically acquires and stores data from these test instruments. IDCS designers are responsible for designing and overseeing the procurement of the system, IDCS Operation and Maintenance operates and maintains the installed system, and the IDCS Data Manager is responsible for distribution of IDCS data to participants. This report is a compilation of trip reports, interoffice memos, and other memos relevant to Computer Applications Group, Inc., work on this project.
Bridging the Gap: Need for a Data Repository to Support Vaccine Prioritization Efforts*
Madhavan, Guruprasad; Phelps, Charles; Sangha, Kinpritma; Levin, Scott; Rappuoli, Rino
2015-01-01
As the mechanisms for discovery, development, and delivery of new vaccines become increasingly complex, strategic planning and priority setting have become ever more crucial. Traditional single value metrics such as disease burden or cost-effectiveness no longer suffice to rank vaccine candidates for development. The Institute of Medicine—in collaboration with the National Academy of Engineering—has developed a novel software system to support vaccine prioritization efforts. The Strategic Multi-Attribute Ranking Tool for Vaccines—SMART Vaccines—allows decision makers to specify their own value structure, selecting from among 28 pre-defined and up to 7 user-defined attributes relevant to the ranking of vaccine candidates. Widespread use of SMART Vaccines will require compilation of a comprehensive data repository for numerous relevant populations—including their demographics, disease burdens and associated treatment costs, as well as characterizing performance features of potential or existing vaccines that might be created, improved, or deployed. While the software contains preloaded data for a modest number of populations, a large gap exists between the existing data and a comprehensive data repository necessary to make full use of SMART Vaccines. While some of these data exist in disparate sources and forms, constructing a data repository will require much new coordination and focus. Finding strategies to bridge the gap to a comprehensive data repository remains the most important task in bringing SMART Vaccines to full fruition, and to support strategic vaccine prioritization efforts in general. PMID:26022565
Fluid geochemistry of Yucca Mountain and vicinity
Marshall, Brian D.; Moscati, Richard J.; Patterson, Gary L.; Stuckless, John S.
2012-01-01
Yucca Mountain, a site in southwest Nevada, has been proposed for a deep underground radioactive waste repository. An extensive database of geochemical and isotopic characteristics has been established for pore waters and gases from the unsaturated zone, perched water, and saturated zone waters in the Yucca Mountain area. The development of this database has been driven by diverse needs of the Yucca Mountain Project, especially those aspects of the project involving process modeling and performance assessment. Water and gas chemistries influence the sorption behavior of radionuclides and the solubility of the radionuclide compounds that form. The chemistry of waters that may infiltrate the proposed repository will be determined in part by that of water present in the unsaturated zone above the proposed repository horizon, whereas pore-water compositions beneath the repository horizon will influence the sorption behavior of the radionuclides transported toward the water table. However, more relevant to the discussion in this chapter, development and testing of conceptual flow and transport models for the Yucca Mountain hydrologic system are strengthened through the incorporation of natural environmental tracer data into the process. Chemical and isotopic data are used to establish bounds on key hydrologic parameters and to provide corroborative evidence for model assumptions and predictions. Examples of specific issues addressed by these data include spatial and temporal variability in net fluxes, the role of faults in controlling flow paths, fracture-matrix interactions, the age and origin of perched water, and the distribution of water traveltimes.
Preliminary safety analysis of the Baita Bihor radioactive waste repository, Romania
DOE Office of Scientific and Technical Information (OSTI.GOV)
Little, Richard; Bond, Alex; Watson, Sarah
2007-07-01
A project funded under the European Commission's Phare Programme 2002 has undertaken an in-depth analysis of the operational and post-closure safety of the Baita Bihor repository. The repository has accepted low- and some intermediate-level radioactive waste from industry, medical establishments and research activities since 1985, and the current estimate is that disposals might continue for around another 20 to 35 years. The analysis of the operational and post-closure safety of the Baita Bihor repository was carried out in two iterations, with the second iteration resulting in reduced uncertainties, largely as a result of taking into account new information on the hydrology and hydrogeology of the area, collected as part of the project. Impacts were evaluated for the maximum potential inventory that might be available for disposal at Baita Bihor for a number of operational and post-closure scenarios and associated conceptual models. The results showed that calculated impacts were below the relevant regulatory criteria. In light of the assessment, a number of recommendations relating to repository operation, optimisation of repository engineering and waste disposals, and environmental monitoring were made. (authors)
Materials for Consideration in Standardized Canister Design Activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Charles R.; Ilgen, Anastasia Gennadyevna; Enos, David George
2014-10-01
This document identifies materials and material mitigation processes that might be used in new designs for standardized canisters for storage, transportation, and disposal of spent nuclear fuel. It also addresses potential corrosion issues with existing dual-purpose canisters (DPCs) that could be addressed in new canister designs. The major potential corrosion risk during storage is stress corrosion cracking of the weld regions on the 304 SS/316 SS canister shell due to deliquescence of chloride salts on the surface. Two approaches are proposed to alleviate this potential risk. First, the existing canister materials (304 and 316 SS) could be used, but the welds mitigated to relieve residual stresses and/or sensitization. Alternatively, more corrosion-resistant steels, such as super-austenitic or duplex stainless steels, could be used. Experimental testing is needed to verify that these alternatives would successfully reduce the risk of stress corrosion cracking during fuel storage. For disposal in a geologic repository, the canister will be enclosed in a corrosion-resistant or corrosion-allowance overpack that will provide barrier capability and mechanical strength. The canister shell will no longer have a barrier function and its containment integrity can be ignored. The basket and neutron absorbers within the canister have the important role of limiting the possibility of post-closure criticality. The time period for corrosion is much longer in the post-closure period, and one major unanswered question is whether the basket materials will corrode slowly enough to maintain structural integrity for at least 10,000 years. Whereas there is extensive literature on stainless steels, this evaluation recommends testing of 304 and 316 SS, and more corrosion-resistant steels such as super-austenitic, duplex, and super-duplex stainless steels, at repository-relevant physical and chemical conditions.
Both general and localized corrosion testing methods would be used to establish corrosion rates and component lifetimes. Finally, it is unlikely that the aluminum-based neutron absorber materials that are commonly used in existing DPCs would survive for 10,000 years in disposal environments, because the aluminum will act as a sacrificial anode for the steel. We recommend additional testing of borated and Gd-bearing stainless steels to establish general and localized corrosion resistance under repository-relevant environmental conditions.
The Pig PeptideAtlas: A resource for systems biology in animal production and biomedicine.
Hesselager, Marianne O; Codrea, Marius C; Sun, Zhi; Deutsch, Eric W; Bennike, Tue B; Stensballe, Allan; Bundgaard, Louise; Moritz, Robert L; Bendixen, Emøke
2016-02-01
Biological research of Sus scrofa, the domestic pig, is of immediate relevance for food production sciences, and for developing pig as a model organism for human biomedical research. Publicly available data repositories play a fundamental role for all biological sciences, and protein data repositories are in particular essential for the successful development of new proteomic methods. Cumulative proteome data repositories, including the PeptideAtlas, provide the means for targeted proteomics, system-wide observations, and cross-species observational studies, but pigs have so far been underrepresented in existing repositories. We here present a significantly improved build of the Pig PeptideAtlas, which includes pig proteome data from 25 tissues and three body fluid types mapped to 7139 canonical proteins. The content of the Pig PeptideAtlas reflects actively ongoing research within the veterinary proteomics domain, and this article demonstrates how the expression of isoform-unique peptides can be observed across distinct tissues and body fluids. The Pig PeptideAtlas is a unique resource for use in animal proteome research, particularly biomarker discovery and for preliminary design of SRM assays, which are equally important for progress in research that supports farm animal production and veterinary health, as for developing pig models with relevance to human health research. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Arnold, Bill W.; Swift, Peter N.
2012-07-01
A deep borehole repository is one of the four geologic disposal system options currently under study by the U.S. DOE to support the development of a long-term strategy for geologic disposal of commercial used nuclear fuel (UNF) and high-level radioactive waste (HLW). The immediate goal of the generic deep borehole repository study is to develop the necessary modeling tools to evaluate and improve the understanding of the repository system response and processes relevant to long-term disposal of UNF and HLW in a deep borehole. A prototype performance assessment model for a generic deep borehole repository has been developed using the approach for a mined geologic repository. The preliminary results from the simplified deep borehole generic repository performance assessment indicate that soluble, non-sorbing (or weakly sorbing) fission product radionuclides, such as I-129, Se-79 and Cl-36, are the likely major dose contributors, and that the annual radiation doses to hypothetical future humans associated with those releases may be extremely small. While much work needs to be done to validate the model assumptions and parameters, these preliminary results highlight the importance of a robust seal design in assuring long-term isolation, and suggest that deep boreholes may be a viable alternative to mined repositories for disposal of both HLW and UNF. (authors)
Digital Repositories and the Question of Data Usefulness
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Downs, R. R.
2017-12-01
The advent of ISO standards for trustworthy long-term digital repositories provides both a set of principles to develop long-term data repositories and the instruments to assess them for trustworthiness. Such mandatory high-level requirements are broad enough to be achievable, to some extent, by many scientific data centers, archives, and other repositories. But the requirement that the data be useful in the future, the requirement that is usually considered to be most relevant to the value of the repository for its user communities, largely remains subject to various interpretations and misunderstanding. However, current and future users will be relying on repositories to preserve and disseminate the data and information needed to discover, understand, and utilize these resources to support their research, learning, and decision-making objectives. Therefore, further study is needed to determine the approaches that can be adopted by repositories to make data useful to future communities of users. This presentation will describe approaches for enabling scientific data and related information, such as software, to be useful for current and potential future user communities and will present the methodology chosen to make one science discipline's data useful for both current and future users. The method uses an ontology-based information model to define and capture the information necessary to make the data useful for contemporary and future users.
Framework for managing mycotoxin risks in the food industry.
Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie
2014-12-01
We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.
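The standardized acceptance sampling procedures in item (iii) can be reasoned about through the operating characteristic of a single-sampling plan. A hedged sketch under a binomial model; the plan parameters below are illustrative and not taken from the paper:

```python
from math import comb

def accept_probability(n, c, p):
    """Probability a lot is accepted under a single-sampling plan:
    test n units and accept if at most c exceed the mycotoxin limit,
    given a true per-unit exceedance rate p (binomial model)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))
```

Evaluating accept_probability over a range of p traces the plan's operating characteristic curve, which links the testing protocol to the contamination risk actually accepted.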
Totaro, Sara; Cotogno, Giulio; Rasmussen, Kirsten; Pianella, Francesca; Roncaglia, Marco; Olsson, Heidi; Riego Sintes, Juan M; Crutzen, Hugues P
2016-11-01
The European Commission has established a Nanomaterials Repository that hosts industrially manufactured nanomaterials that are distributed world-wide for safety testing of nanomaterials. These materials were first tested in the OECD Testing Programme and have since been tested in several EU-funded research projects. The JRC Repository of Nanomaterials has thus developed into serving the global scientific community active in nanoEHS (regulatory) research. The unique Repository facility is a state-of-the-art installation that allows customised sub-sampling under the safest possible conditions, with traceable final sample vials distributed world-wide for research purposes. This paper describes the design of the Repository to perform a semi-automated sub-sampling procedure, offering a high degree of flexibility and precision in the preparation of NM vials for customers while guaranteeing the safety of the operators and environmental protection. The JRC nanomaterials are representative of part of the world NM market. Their wide use world-wide facilitates the generation of comparable and reliable experimental results and datasets in (regulatory) research by the scientific community, ultimately supporting the further development of the OECD regulatory test guidelines. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Scenario Analysis for the Safety Assessment of Nuclear Waste Repositories: A Critical Review.
Tosoni, Edoardo; Salo, Ahti; Zio, Enrico
2018-04-01
A major challenge in scenario analysis for the safety assessment of nuclear waste repositories pertains to the comprehensiveness of the set of scenarios selected for assessing the safety of the repository. Motivated by this challenge, we discuss the aspects of scenario analysis relevant to comprehensiveness. Specifically, we note that (1) it is necessary to make it clear why scenarios usually focus on a restricted set of features, events, and processes; (2) there is not yet consensus on the interpretation of comprehensiveness for guiding the generation of scenarios; and (3) there is a need for sound approaches to the treatment of epistemic uncertainties. © 2017 Society for Risk Analysis.
SemanticOrganizer: A Customizable Semantic Repository for Distributed NASA Project Teams
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Berrios, Daniel C.; Carvalho, Robert E.; Hall, David R.; Rich, Stephen J.; Sturken, Ian B.; Swanson, Keith J.; Wolfe, Shawn R.
2004-01-01
SemanticOrganizer is a collaborative knowledge management system designed to support distributed NASA projects, including diverse teams of scientists, engineers, and accident investigators. The system provides a customizable, semantically structured information repository that stores work products relevant to multiple projects of differing types. SemanticOrganizer is one of the earliest and largest semantic web applications deployed at NASA to date, and has been used in diverse contexts ranging from the investigation of Space Shuttle Columbia's accident to the search for life on other planets. Although the underlying repository employs a single unified ontology, access control and ontology customization mechanisms make the repository contents appear different for each project team. This paper describes SemanticOrganizer, its customization facilities, and a sampling of its applications. The paper also summarizes some key lessons learned from building and fielding a successful semantic web application across a wide-ranging set of domains with diverse users.
The Geant4 physics validation repository
NASA Astrophysics Data System (ADS)
Wenzel, H.; Yarba, J.; Dotti, A.
2015-12-01
The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API and a web application. The functionality of these components and the technology choices we made are also described.
Retrieving relevant time-course experiments: a study on Arabidopsis microarrays.
Şener, Duygu Dede; Oğul, Hasan
2016-06-01
Understanding time-course regulation of genes in response to a stimulus is a major concern in current systems biology. The problem is usually approached by computational methods to model the gene behaviour or its networked interactions with the others by a set of latent parameters. The model parameters can be estimated through a meta-analysis of available data obtained from other relevant experiments. The key question here is how to find the relevant experiments which are potentially useful in analysing current data. In this study, the authors address this problem in the context of time-course gene expression experiments from an information retrieval perspective. To this end, they introduce a computational framework that takes a time-course experiment as a query and reports a list of relevant experiments retrieved from a given repository. These retrieved experiments can then be used to associate the environmental factors of query experiment with the findings previously reported. The model is tested using a set of time-course Arabidopsis microarrays. The experimental results show that relevant experiments can be successfully retrieved based on content similarity.
Collaborative Information Retrieval Method among Personal Repositories
NASA Astrophysics Data System (ADS)
Kamei, Koji; Yukawa, Takashi; Yoshida, Sen; Kuwabara, Kazuhiro
In this paper, we describe a collaborative information retrieval method among personal repositories and an implementation of the method on a personal agent framework. We propose a framework for personal agents that aims to enable the sharing and exchange of information resources that are distributed unevenly among individuals. The kernel of the personal agent framework is an RDF (Resource Description Framework)-based information repository for storing, retrieving and manipulating privately collected information, such as documents the user read and/or wrote, email he/she exchanged, web pages he/she browsed, etc. The repository also collects annotations to information resources that describe relationships among information resources, and records of interaction between the user and information resources. Since the information resources in a personal repository and their structure are personalized, information retrieval from other users' repositories is an important application of the personal agent. A vector space model with a personalized concept-base is employed as the information retrieval mechanism in a personal repository. Since a personalized concept-base is constructed from the information resources in a personal repository, it reflects its user's knowledge and interests. On the other hand, this leads to a problem when querying other users' personal repositories: simply transferring query requests does not produce desirable results. To solve this problem, we propose a query equalization scheme based on a relevance feedback method for collaborative information retrieval between personalized concept-bases. In this paper, we describe an implementation of the collaborative information retrieval method and its user interface on the personal agent framework.
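The query equalization scheme above is described as based on relevance feedback. The classic Rocchio update gives the flavor of such feedback; this is a textbook sketch, not the paper's actual concept-base transformation:

```python
def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Move a query vector toward the centroid of relevant documents and
    away from the centroid of non-relevant ones (classic Rocchio update)."""
    dim = len(query)

    def centroid(docs):
        # Component-wise mean of a list of equal-length vectors.
        if not docs:
            return [0.0] * dim
        return [sum(doc[i] for doc in docs) / len(docs) for i in range(dim)]

    rel, nonrel = centroid(relevant), centroid(nonrelevant)
    return [alpha * q + beta * r - gamma * n
            for q, r, n in zip(query, rel, nonrel)]
```

With no feedback documents, the query is returned unchanged (up to the alpha weight); feedback shifts it toward the vocabulary of the target repository.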
Adapting a Clinical Data Repository to ICD-10-CM through the use of a Terminology Repository
Cimino, James J.; Remennick, Lyubov
2014-01-01
Clinical data repositories frequently contain patient diagnoses coded with the International Classification of Diseases, Ninth Revision (ICD-9-CM). These repositories now need to accommodate data coded with the Tenth Revision (ICD-10-CM). Database users wish to retrieve relevant data regardless of the system by which they are coded. We demonstrate how a terminology repository (the Research Entities Dictionary or RED) serves as an ontology relating terms of both ICD versions to each other to support seamless version-independent retrieval from the Biomedical Translational Research Information System (BTRIS) at the National Institutes of Health. We make use of the Center for Medicare and Medicaid Services’ General Equivalence Mappings (GEMs) to reduce the modeling effort required to determine whether ICD-10-CM terms should be added to the RED as new concepts or as synonyms of existing concepts. A divide-and-conquer approach is used to develop integration heuristics that offer a satisfactory interim solution and facilitate additional refinement of the integration as time and resources allow. PMID:25954344
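The divide-and-conquer triage described above might be sketched as follows. The codes, mapping table, and helper function are invented for illustration and are not the BTRIS implementation:

```python
def triage_icd10(icd10_code, gems, red_concepts):
    """If every ICD-9-CM code that the GEMs map an ICD-10-CM code to
    resolves to the same existing RED concept, propose the ICD-10-CM term
    as a synonym of that concept; otherwise flag it for manual modeling
    as a possible new concept."""
    targets = gems.get(icd10_code, set())
    if targets and all(code in red_concepts for code in targets):
        concepts = {red_concepts[code] for code in targets}
        if len(concepts) == 1:
            return ("synonym", concepts.pop())
    return ("new_concept", None)
```

Unmapped or ambiguously mapped codes fall through to the "new concept" bucket, which is where the remaining modeling effort is spent.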
Enriching text with images and colored light
NASA Astrophysics Data System (ADS)
Sekulovski, Dragan; Geleijnse, Gijs; Kater, Bram; Korst, Jan; Pauws, Steffen; Clout, Ramon
2008-01-01
We present an unsupervised method to enrich textual applications with relevant images and colors. The images are collected by querying large image repositories, and the colors are subsequently computed using image processing. A prototype system based on this method is presented in which the method is applied to song lyrics. In combination with a lyrics synchronization algorithm, the system produces a rich multimedia experience. In order to identify terms within the text that may be associated with images and colors, we select noun phrases using a part-of-speech tagger. Large image repositories are queried with these terms. Per-term representative colors are extracted from the collected images, using either a histogram-based or a mean-shift-based algorithm. The representative color extraction exploits the non-uniform distribution of the colors found in the large repositories. The images that are ranked best by the search engine are displayed on a screen, while the extracted representative colors are rendered on controllable lighting devices in the living room. We evaluate our method by comparing the computed colors to standard color representations of a set of English color terms. A second evaluation focuses on the distance in color between a queried term in English and its translation in a foreign language. Based on results from three sets of terms, a measure of the suitability of a term for color extraction based on KL divergence is proposed. Finally, we compare the performance of the algorithm using either the automatically indexed repository of Google Images or the manually annotated Flickr.com. Based on the results of these experiments, we conclude that the presented method can compute the relevant color for a term using a large image repository and image processing.
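The KL-divergence-based suitability measure is not specified in detail above. One plausible reading, assumed here, is that a term is suitable for color extraction when its color histogram deviates strongly from a uniform (colorless) distribution:

```python
import math

def kl_divergence(p, q, eps=1e-9):
    """KL divergence D(p || q) between two discrete color histograms;
    eps guards against zero bins."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def suitability(hist):
    """Assumed measure: divergence of a term's color histogram from the
    uniform histogram; peaked (distinctively colored) terms score higher."""
    uniform = [1.0 / len(hist)] * len(hist)
    return kl_divergence(hist, uniform)
```

Under this reading, "crimson" would yield a sharply peaked histogram and a high score, while an abstract term with diffuse image colors would score near zero.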
Bio-repository of post-clinical test samples at the national cancer center hospital (NCCH) in Tokyo.
Furuta, Koh; Yokozawa, Karin; Takada, Takako; Kato, Hoichi
2009-08-01
We established the Bio-repository at the National Cancer Center Hospital in October 2002. The main purpose of this article is to show the importance and usefulness of a bio-repository of post-clinical test samples, not only for translational cancer research but also for routine clinical oncology, by introducing our experience of setting up such a facility. Our basic concept of a post-clinical test sample is not as left-over waste, but rather as frozen evidence of a patient's pathological condition at a particular point in time. We can decode most, if not all, of the laboratory data from a post-clinical test sample. As a result, the bio-repository is able to provide not only the samples but potentially all related laboratory data upon request. The areas of sample coverage are the following: sera after routine blood tests; sera after cross-match tests for transfusion; serum or plasma submitted by the physician at a clinically important time period for the patient; and samples collected by individual investigators. The formats of stored samples are plasma or serum, dried blood spot (DBS) and buffy coat. So far, 150 218 plasmas or sera, 35 253 DBS and 536 buffy coats have been registered in our bio-repository system. We provide samples to various concerned parties under strict legal and ethical agreements. Although the number of utilized samples was initially limited, requests for samples are now increasing steadily from both research and clinical sources. Further efforts to increase the benefits of the repository are intended.
Building Scientific Data's list of recommended data repositories
NASA Astrophysics Data System (ADS)
Hufton, A. L.; Khodiyar, V.; Hrynaszkiewicz, I.
2016-12-01
When Scientific Data launched in 2014 we provided our authors with a list of recommended data repositories to help them identify data hosting options that were likely to meet the journal's requirements. This list has grown in size and scope, and is now a central resource for authors across the Nature-titled journals. It has also been used in the development of data deposition policies and recommended repository lists across Springer Nature and at other publishers. Each new addition to the list is assessed according to a series of criteria that emphasize the stability of the resource, its commitment to principles of open science and its implementation of relevant community standards and reporting guidelines. A preference is expressed for repositories that issue digital object identifiers (DOIs) through the DataCite system and that share data under the Creative Commons CC0 waiver. Scientific Data currently lists fourteen repositories that focus on specific areas within the Earth and environmental sciences, as well as the broad-scope repositories, Dryad and figshare. Readers can browse and filter datasets published at the journal by the host repository using ISA-explorer, a demo tool built by the ISA-tools team at Oxford University [1]. We believe that well-maintained lists like this one help publishers build a network of trust with community data repositories and provide an important complement to more comprehensive data repository indices and more formal certification efforts. In parallel, Scientific Data has also improved its policies to better support submissions from authors using institutional and project-specific repositories, without requiring each to apply for listing individually.
Online resources:
Journal homepage: http://www.nature.com/scientificdata
Data repository criteria: http://www.nature.com/sdata/policies/data-policies#repo-criteria
Recommended data repositories: http://www.nature.com/sdata/policies/repositories
Archived copies of the list: https://dx.doi.org/10.6084/m9.figshare.1434640.v6
Reference: [1] Gonzalez-Beltran, A. ISA-explorer: A demo tool for discovering and exploring Scientific Data's ISA-tab metadata. Scientific Data Updates http://blogs.nature.com/scientificdata/2015/12/17/isa-explorer/ (2015).
Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.
Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor
2016-01-01
In Quantum Chemistry, many tasks recur frequently, e.g. geometry optimizations, benchmarking series etc. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used only for particular scientific experiments. Hence, their modification is not straightforward, which makes them almost impossible to share. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined, research-domain-specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.
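One minimal way to picture the atomic-vs-meta-workflow distinction the abstract formalizes is as nested data structures. The class names, fields, and example operations here are illustrative assumptions, not the SHIWA repository's actual data model:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AtomicWorkflow:
    """A workflow delivering one well-defined, domain-specific function,
    as publishable on its own in a workflow repository."""
    name: str
    inputs: List[str]
    outputs: List[str]

@dataclass
class MetaWorkflow:
    """A meta-workflow orchestrating embedded atomic workflows in order."""
    name: str
    steps: List[AtomicWorkflow] = field(default_factory=list)

# Two recurring quantum-chemistry operations combined into a meta-workflow:
# the optimization's output feeds the benchmarking step's input.
opt = AtomicWorkflow("geometry_optimization", ["structure"], ["optimized_structure"])
bench = AtomicWorkflow("benchmark_series", ["optimized_structure"], ["energies"])
meta = MetaWorkflow("opt_then_benchmark", steps=[opt, bench])
print([s.name for s in meta.steps])
```

Because each atomic workflow declares its inputs and outputs explicitly, a repository can check that adjacent steps of a meta-workflow are compatible before sharing or executing it, which is the kind of formal specification the paper argues for.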
An International Review of the Development and Implementation of Shared Print Storage
ERIC Educational Resources Information Center
Genoni, Paul
2013-01-01
This article undertakes a review of the literature related to shared print storage and national repositories from 1980-2013. There is a separate overview of the relevant Australian literature. The coverage includes both relevant journal literature and major reports. In the process the article traces the developments in the theory and practice of…
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2011-12-01
Services that preserve and enable future access to scientific data are necessary to ensure that the data that are being collected today will be available for use by future generations of scientists. Many data centers, archives, and other digital repositories are working to improve their ability to serve as long-term stewards of scientific data. Trust in sustainable data management and preservation capabilities of digital repositories can influence decisions to use these services to deposit or obtain scientific data. Building on the Open Archival Information System (OAIS) Reference Model developed by the Consultative Committee for Space Data Systems (CCSDS) and adopted by the International Organization for Standardization as ISO 14721:2003, new standards are being developed to improve long-term data management processes and documentation. The Draft Information Standard ISO/DIS 16363, "Space data and information transfer systems - Audit and certification of trustworthy digital repositories" offers the potential to evaluate digital repositories objectively in terms of their trustworthiness as long-term stewards of digital resources. In conjunction with this, the CCSDS and ISO are developing another draft standard for the auditing and certification process, ISO/DIS 16919, "Space data and information transfer systems - Requirements for bodies providing audit and certification of candidate trustworthy digital repositories". Six test audits were conducted of scientific data centers and archives in Europe and the United States to test the use of these draft standards and identify potential improvements for the standards and for the participating digital repositories. We present a case study of the test audit conducted on the NASA Socioeconomic Data and Applications Center (SEDAC) and describe the preparation, the audit process, recommendations received, and next steps to obtain certification as a trustworthy digital repository, after approval of the ISO/DIS standards.
National Aeronautics and Space Administration Biological Specimen Repository
NASA Technical Reports Server (NTRS)
McMonigal, Kathleen A.; Pietrzyk, Robert A.; Johnson, Mary Anne
2008-01-01
The National Aeronautics and Space Administration Biological Specimen Repository (Repository) is a storage bank that is used to maintain biological specimens over extended periods of time and under well-controlled conditions. Samples from the International Space Station (ISS), including blood and urine, will be collected, processed and archived during the preflight, inflight and postflight phases of ISS missions. This investigation has been developed to archive biosamples for use as a resource for future space flight related research. The ISS provides a platform to investigate the effects of microgravity on human physiology prior to lunar and exploration class missions. The storage of crewmember samples from many different ISS flights in a single repository will be a valuable resource with which researchers can study space flight related changes and investigate physiological markers. The development of the Repository will allow for the collection, processing, storage, maintenance, and ethical distribution of biosamples to meet goals of scientific and programmatic relevance to the space program. Archiving of the biosamples will provide future research opportunities, including investigating patterns of physiological changes, analysis of components unknown at this time, or analyses performed by new methodologies.
NASA Astrophysics Data System (ADS)
Mao, N. H.; Ramirez, A. L.
1980-10-01
Developments in measurement technology are presented which are relevant to studies of deep geological repositories for nuclear waste disposal during all phases of development, i.e., site selection, site characterization, construction, operation, and decommissioning. Emphasis was placed on geophysics and geotechnics, with special attention to those techniques applicable to bedded salt. The techniques are grouped into sections as follows: tectonic environment, state of stress, subsurface structures, fractures, stress changes, deformation, thermal properties, fluid transport properties, and other approaches. Several areas that merit further research and development are identified: in situ thermal measurement techniques, fracture detection and characterization, in situ stress measurements, and creep behavior. The available instrumentation should generally be improved to provide better resolution and accuracy, enhanced instrument survivability, and reliability over extended time periods in a hostile environment.
The visualization and availability of experimental research data at Elsevier
NASA Astrophysics Data System (ADS)
Keall, Bethan
2014-05-01
In the digital age, the visualization and availability of experimental research data is an increasingly prominent aspect of the research process and of the scientific output that researchers generate. We expect that the importance of data will continue to grow, driven by technological advancements, requirements from funding bodies to make research data available, and a developing research data infrastructure that is supported by data repositories, science publishers, and other stakeholders. Elsevier is actively contributing to these efforts, for example by setting up bidirectional links between online articles on ScienceDirect and relevant data sets on trusted data repositories. A key aspect of Elsevier's "Article of the Future" program, these links enrich the online article and make it easier for researchers to find relevant data and articles and help place data in the right context for re-use. Recently, we have set up such links with some of the leading data repositories in Earth Sciences, including the British Geological Survey, Integrated Earth Data Applications, the UK Natural Environment Research Council, and the Oak Ridge National Laboratory DAAC. Building on these links, Elsevier has also developed a number of data integration and visualization tools, such as an interactive map viewer that displays the locations of relevant data from PANGAEA next to articles on ScienceDirect. In this presentation we will give an overview of these and other capabilities of the Article of the Future, focusing on how they help advance communication of research in the digital age.
Review of DOE Waste Package Program. Semiannual report, October 1984-March 1985. Volume 8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, M.S.
1985-12-01
A large number of technical reports on waste package component performance were reviewed over the last year in support of the NRC's review of the Department of Energy's (DOE's) Environmental Assessment reports. The intent was to assess in some detail the quantity and quality of the DOE data and their relevance to the high-level waste repository site selection process. A representative selection of the reviews is presented for the salt, basalt, and tuff repository projects. Areas for future research have been outlined. 141 refs.
Methods and apparatus for distributed resource discovery using examples
NASA Technical Reports Server (NTRS)
Chang, Yuan-Chi (Inventor); Li, Chung-Sheng (Inventor); Smith, John Richard (Inventor); Hill, Matthew L. (Inventor); Bergman, Lawrence David (Inventor); Castelli, Vittorio (Inventor)
2005-01-01
Distributed resource discovery is an essential step for information retrieval and/or providing information services. This step is usually used for determining the location of an information or data repository which has relevant information. The most fundamental challenge is the usual lack of semantic interoperability of the requested resource. In accordance with the invention, a method is disclosed where distributed repositories achieve semantic interoperability through the exchange of examples and, optionally, classifiers. The outcome of the inventive method can be used to determine whether common labels are referring to the same semantic meaning.
Gómez, Alberto; Nieto-Díaz, Manuel; Del Águila, Ángela; Arias, Enrique
2018-05-01
Transparency in science is increasingly a hot topic. Scientists are required to show not only results but also evidence of how they have achieved these results. In experimental studies of spinal cord injury, there are a number of standardized tests, such as the Basso-Beattie-Bresnahan locomotor rating scale for rats and Basso Mouse Scale for mice, which researchers use to study the pathophysiology of spinal cord injury and to evaluate the effects of experimental therapies. Although the standardized data from the Basso-Beattie-Bresnahan locomotor rating scale and the Basso Mouse Scale are particularly suited for storage and sharing in databases, systems of data acquisition and repositories are still lacking. To the best of our knowledge, both tests are usually conducted manually, with the data being recorded on a paper form, which may be documented with video recordings, before the data is transferred to a spreadsheet for analysis. The data thus obtained is used to compute global scores, which is the information that usually appears in publications, with a wealth of information being omitted. This information may be relevant to understand locomotion deficits or recovery, or even important aspects of the treatment effects. Therefore, this paper presents a mobile application to record and share Basso Mouse Scale tests, meeting the following criteria: i) user-friendly; ii) few hardware requirements (only a smartphone or tablet with a camera running under Android Operating System); and iii) based on open source software such as SQLite, XML, Java, Android Studio and Android SDK. The BAMOS app can be downloaded and installed from the Google Market repository and the app code is available at the GitHub repository. The BAMOS app demonstrates that mobile technology constitutes an opportunity to develop tools for aiding spinal cord injury scientists in recording and sharing experimental data. Copyright © 2018 Elsevier Ltd. All rights reserved.
A Comparison of Subject and Institutional Repositories in Self-Archiving Practices
ERIC Educational Resources Information Center
Xia, Jingfeng
2008-01-01
The disciplinary culture theory presumes that if a scholar has been familiar with self-archiving through an existing subject-based repository, this scholar will be more enthusiastic about contributing his/her research to an institutional repository than one who has not had the experience. To test the theory, this article examines self-archiving…
Chen, Henry W; Du, Jingcheng; Song, Hsing-Yi; Liu, Xiangyu; Jiang, Guoqian
2018-01-01
Background: Today, there is an increasing need to centralize and standardize electronic health data within clinical research as the volume of data continues to balloon. Domain-specific common data elements (CDEs) are emerging as a standard approach to clinical research data capturing and reporting. Recent efforts to standardize clinical study CDEs have been of great benefit in facilitating data integration and data sharing. The importance of the temporal dimension of clinical research studies has been well recognized; however, very few studies in the biomedical research community have focused on the formal representation of temporal constraints and temporal relationships within clinical research data. In particular, temporal information can be extremely powerful in enabling high-quality cancer research. Objective: The objective of the study was to develop and evaluate an ontological approach to represent the temporal aspects of cancer study CDEs. Methods: We used CDEs recorded in the National Cancer Institute (NCI) Cancer Data Standards Repository (caDSR) and created a CDE parser to extract time-relevant CDEs from the caDSR. Using the Web Ontology Language (OWL)-based Time Event Ontology (TEO), we manually derived representative patterns to semantically model the temporal components of the CDEs, using an observation set of randomly selected time-related CDEs (n=600) to create a set of TEO ontological representation patterns. To evaluate TEO's ability to represent the temporal components of the CDEs, this set of representation patterns was tested against two test sets of randomly selected time-related CDEs (n=425). Results: 94.2% (801/850) of the CDEs in the test sets could be represented by the TEO representation patterns. Conclusions: TEO is a good ontological model for representing the temporal components of the CDEs recorded in caDSR. Our representative model can harness the Semantic Web reasoning and inferencing functionalities and presents a means for temporal CDEs to be machine-readable, streamlining meaningful searches. PMID:29472179
Unwin, Ian; Jansen-van der Vliet, Martine; Westenbrink, Susanne; Presser, Karl; Infanger, Esther; Porubska, Janka; Roe, Mark; Finglas, Paul
2016-02-15
The EuroFIR Document and Data Repositories are being developed as accessible collections of source documents, including grey literature, and the food composition data reported in them. These Repositories will contain source information available to food composition database compilers when selecting their nutritional data. The Document Repository was implemented as searchable bibliographic records in the Europe PubMed Central database, which links to the documents online. The Data Repository will contain original data from source documents in the Document Repository. Testing confirmed the FoodCASE food database management system as a suitable tool for the input, documentation and quality assessment of Data Repository information. Data management requirements for the input and documentation of reported analytical results were established, including record identification and method documentation specifications. Document access and data preparation using the Repositories will provide information resources for compilers, eliminating duplicated work and supporting unambiguous referencing of data contributing to their compiled data. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Sun, Alexander Y.; Lu, Jiemin; Islam, Akand
2017-05-01
Geologic repositories are extensively used for disposing of byproducts in the mineral and energy industries. The safety and reliability of these repositories are a primary concern to environmental regulators and the public. Time-lapse oscillatory pumping testing (OPT) has been introduced recently as a pressure-based technique for detecting potential leakage in geologic repositories. By routinely conducting OPT at a number of pulsing frequencies, an operator may identify potential repository anomalies in the frequency domain, alleviating the ambiguity caused by reservoir noise and improving the signal-to-noise ratio. Building on previous theoretical and field studies, this work performed a series of laboratory experiments to validate the concept of time-lapse OPT using a custom-made stainless steel tank under relatively high pressures. The experimental configuration simulates a miniature geologic storage repository consisting of three layers (i.e., injection zone, caprock, and above-zone aquifer). Results show that leakage in the injection zone led to deviations in the power spectrum of the observed pressure data, the amplitude of which increases with decreasing pulsing frequency. The experimental results are further analyzed by developing a 3D flow model, with which the model parameters are estimated through frequency-domain inversion.
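The frequency-domain diagnostic described here, reading off the amplitude of the pressure response at the pulsing frequency and watching for leak-induced deviations, can be sketched as below. This is a toy illustration with assumed synthetic signals, not the authors' inversion workflow; a real leak signature would be extracted from noisy tank or field pressure records:

```python
import numpy as np

def amplitude_at(signal, fs, f0):
    """Amplitude of the component of `signal` at frequency f0 (Hz),
    estimated from the discrete Fourier transform; fs is the sampling
    rate in Hz."""
    n = len(signal)
    spec = 2.0 * np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return spec[np.argmin(np.abs(freqs - f0))]

fs, f0 = 100.0, 2.0                        # sampling rate and pulsing frequency (Hz)
t = np.arange(0, 10, 1.0 / fs)             # 10 s pressure record
baseline = np.sin(2 * np.pi * f0 * t)      # intact-repository pressure response
leaky = 0.7 * np.sin(2 * np.pi * f0 * t)   # attenuated response once a leak opens
print(amplitude_at(baseline, fs, f0), amplitude_at(leaky, fs, f0))  # ~1.0 vs ~0.7
```

Comparing the spectral amplitude at each pulsing frequency across repeated (time-lapse) tests is what isolates a leak signature from broadband reservoir noise, since the noise spreads its energy over many frequency bins while the pumping signal concentrates in one.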
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manteufel, R.D.; Ahola, M.P.; Turner, D.R.
A literature review has been conducted to determine the state of knowledge available in the modeling of coupled thermal (T), hydrologic (H), mechanical (M), and chemical (C) processes relevant to the design and/or performance of the proposed high-level waste (HLW) repository at Yucca Mountain, Nevada. The review focuses on identifying coupling mechanisms between individual processes and assessing their importance (i.e., whether the coupling is important, potentially important, or negligible). The significance of considering THMC-coupled processes lies in whether or not the processes impact the design and/or performance objectives of the repository. A review such as the one reported here is useful in identifying which coupled effects will be important, and hence which will need to be investigated by the US Nuclear Regulatory Commission in order to assess the assumptions, data, analyses, and conclusions in the design and performance assessment of a geologic repository. Although this work stems from regulatory interest in the design of the geologic repository, it should be emphasized that the repository design implicitly considers all of the repository performance objectives, including those associated with the time after permanent closure. The scope of this review goes beyond previous assessments in that it attempts (with the current state of knowledge) to determine which couplings are important, and identifies which computer codes are currently available to model coupled processes.
Automatic indexing and retrieval of encounter-specific evidence for point-of-care support.
O'Sullivan, Dympna M; Wilk, Szymon A; Michalowski, Wojtek J; Farion, Ken J
2010-08-01
Evidence-based medicine relies on repositories of empirical research evidence that can be used to support clinical decision making for improved patient care. However, retrieving evidence from such repositories at local sites presents many challenges. This paper describes a methodological framework for automatically indexing and retrieving empirical research evidence in the form of the systematic reviews and associated studies from The Cochrane Library, where retrieved documents are specific to a patient-physician encounter and thus can be used to support evidence-based decision making at the point of care. Such an encounter is defined by three pertinent groups of concepts - diagnosis, treatment, and patient, and the framework relies on these three groups to steer indexing and retrieval of reviews and associated studies. An evaluation of the indexing and retrieval components of the proposed framework was performed using documents relevant for the pediatric asthma domain. Precision and recall values for automatic indexing of systematic reviews and associated studies were 0.93 and 0.87, and 0.81 and 0.56, respectively. Moreover, precision and recall for the retrieval of relevant systematic reviews and associated studies were 0.89 and 0.81, and 0.92 and 0.89, respectively. With minor modifications, the proposed methodological framework can be customized for other evidence repositories. Copyright 2010 Elsevier Inc. All rights reserved.
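The precision and recall figures quoted in this abstract follow the standard information-retrieval definitions, which a short sketch makes concrete; the document identifiers below are hypothetical, not actual Cochrane review IDs:

```python
def precision_recall(retrieved, relevant):
    """Standard IR metrics: precision is the fraction of retrieved
    documents that are relevant; recall is the fraction of relevant
    documents that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical encounter: 4 reviews retrieved, 4 truly relevant, 3 in common.
p, r = precision_recall({"cd001", "cd002", "cd003", "cd004"},
                        {"cd001", "cd002", "cd003", "cd009"})
print(p, r)  # 0.75 0.75
```

Reporting both metrics matters for point-of-care retrieval: high precision keeps clinicians from wading through irrelevant reviews during an encounter, while high recall ensures the pertinent evidence is not missed.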
NASA Astrophysics Data System (ADS)
Pohlmann, K. F.; Zhu, J.; Ye, M.; Carroll, R. W.; Chapman, J. B.; Russell, C. E.; Shafer, D. S.
2006-12-01
Yucca Mountain (YM), Nevada has been recommended as a deep geological repository for the disposal of spent fuel and high-level radioactive waste. If YM is licensed as a repository by the Nuclear Regulatory Commission, it will be important to identify the potential for radionuclides to migrate from underground nuclear testing areas located on the Nevada Test Site (NTS) to the hydraulically downgradient repository area to ensure that monitoring does not incorrectly attribute repository failure to radionuclides originating from other sources. In this study, we use the Death Valley Regional Flow System (DVRFS) model developed by the U.S. Geological Survey to investigate potential groundwater migration pathways and associated travel times from the NTS to the proposed YM repository area. Using results from the calibrated DVRFS model and the particle tracking post-processing package MODPATH we modeled three-dimensional groundwater advective pathways in the NTS and YM region. Our study focuses on evaluating the potential for groundwater pathways between the NTS and YM withdrawal area and whether travel times for advective flow along these pathways coincide with the prospective monitoring time frame at the proposed repository. We include uncertainty in effective porosity as this is a critical variable in the determination of time for radionuclides to travel from the NTS region to the YM withdrawal area. Uncertainty in porosity is quantified through evaluation of existing site data and expert judgment and is incorporated in the model through Monte Carlo simulation. Since porosity information is limited for this region, the uncertainty is quite large and this is reflected in the results as a large range in simulated groundwater travel times.
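The Monte Carlo treatment of effective-porosity uncertainty can be sketched as follows. The advective travel-time relation t = L·n_e/q (distance times effective porosity over Darcy flux) and all numeric ranges are illustrative assumptions, not values from the calibrated DVRFS model:

```python
import random

def travel_time_samples(distance_m, darcy_flux_m_yr, porosity_range, n=10000, seed=1):
    """Monte Carlo sketch of advective travel time t = L * n_e / q,
    where effective porosity n_e is drawn uniformly from its
    uncertainty range (lo, hi) and q is the Darcy flux in m/yr."""
    rng = random.Random(seed)
    lo, hi = porosity_range
    return [distance_m * rng.uniform(lo, hi) / darcy_flux_m_yr for _ in range(n)]

# Illustrative numbers only: 40 km pathway, 2 m/yr flux, porosity 0.01-0.30.
times = travel_time_samples(40000.0, 2.0, (0.01, 0.30))
print(min(times), max(times))  # travel times span roughly 200 to 6000 years
```

Even this toy version shows why the abstract reports "a large range in simulated groundwater travel times": with porosity uncertain over more than an order of magnitude, the travel-time distribution inherits that spread directly.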
National Pesticide Standard Repository
EPA's National Pesticide Standards Repository collects and maintains an inventory of analytical “standards” of registered pesticides in the United States, as well as some that are not currently registered for food and product testing and monitoring.
Evaluation of Five Sedimentary Rocks Other Than Salt for Geologic Repository Siting Purposes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croff, A.G.; Lomenick, T.F.; Lowrie, R.S.
The US Department of Energy (DOE), in order to increase the diversity of rock types under consideration by the geologic disposal program, initiated the Sedimentary Rock Program (SERP), whose immediate objective is to evaluate five types of sedimentary rock - sandstone, chalk, carbonate rocks (limestone and dolostone), anhydrock, and shale - to determine their potential for siting a geologic repository. The evaluation of these five rock types, together with the ongoing salt studies, effectively results in the consideration of all types of relatively impermeable sedimentary rock for repository purposes. The results of this evaluation are expressed in terms of a ranking of the five rock types with respect to their potential to serve as a geologic repository host rock. This comparative evaluation was conducted on a non-site-specific basis, by use of generic information together with rock evaluation criteria (RECs) derived from the DOE siting guidelines for geologic repositories (CFR 1984). An information base relevant to rock evaluation using these RECs was developed in hydrology, geochemistry, rock characteristics (rock occurrences, thermal response, rock mechanics), natural resources, and rock dissolution. Evaluation against postclosure and preclosure RECs yielded a ranking of the five subject rocks with respect to their potential as repository host rocks. Shale was determined to be the most preferred of the five rock types, with sandstone a distant second, the carbonate rocks and anhydrock a more distant third, and chalk a relatively close fourth.
Safeguarding structural data repositories against bad apples
Minor, Wladek; Dauter, Zbigniew; Helliwell, John R.; Jaskolski, Mariusz; Wlodawer, Alexander
2016-01-01
Structural biology research generates large amounts of data, some deposited in public databases/repositories, but a substantial remainder never becoming available to the scientific community. Additionally, some of the deposited data contain less or more serious errors that may bias the results of data mining. Thorough analysis and discussion of these problems is needed in order to ameliorate this situation. This note is an attempt to propose some solutions and encourage both further discussion and action on the part of the relevant organizations, in particular the Protein Data Bank and various bodies of the International Union of Crystallography. PMID:26840827
A hybrid organic-inorganic perovskite dataset
NASA Astrophysics Data System (ADS)
Kim, Chiho; Huan, Tran Doan; Krishnan, Sridevi; Ramprasad, Rampi
2017-05-01
Hybrid organic-inorganic perovskites (HOIPs) have been attracting a great deal of attention due to their versatility of electronic properties and fabrication methods. We prepare a dataset of 1,346 HOIPs, which features 16 organic cations, 3 group-IV cations and 4 halide anions. Using a combination of an atomic structure search method and density functional theory calculations, the optimized structures, the bandgap, the dielectric constant, and the relative energies of the HOIPs are uniformly prepared and validated by comparing with relevant experimental and/or theoretical data. We make the dataset available at Dryad Digital Repository, NoMaD Repository, and Khazana Repository (http://khazana.uconn.edu/), hoping that it could be useful for future data-mining efforts that can explore possible structure-property relationships and phenomenological models. Progressive extension of the dataset is expected as new organic cations become appropriate within the HOIP framework, and as additional properties are calculated for the new compounds found.
Audit and Certification Process for Science Data Digital Repositories
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.
2011-12-01
Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies and procedures required to accomplish effective data preservation. Subsequently, a repository should be evaluated on whether or not it is effective in its data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified, and the relationship of these standards to the Open Archival Information System (OAIS) reference model will be shown. Six test audits have been conducted, with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.
On-line remote monitoring of radioactive waste repositories
NASA Astrophysics Data System (ADS)
Calì, Claudio; Cosentino, Luigi; Litrico, Pietro; Pappalardo, Alfio; Scirè, Carlotta; Scirè, Sergio; Vecchio, Gianfranco; Finocchiaro, Paolo; Alfieri, Severino; Mariani, Annamaria
2014-12-01
A low-cost array of modular sensors for online monitoring of radioactive waste was developed at INFN-LNS. We implemented a new kind of gamma counter, based on Silicon PhotoMultipliers and scintillating fibers, that behaves like a cheap scintillating Geiger-Muller counter. It can be placed in shape of a fine grid around each single waste drum in a repository. Front-end electronics and an FPGA-based counting system were developed to handle the field data, also implementing data transmission, a graphical user interface and a data storage system. A test of four sensors in a real radwaste storage site was performed with promising results. Following the tests an agreement was signed between INFN and Sogin for the joint development and installation of a prototype DMNR (Detector Mesh for Nuclear Repository) system inside the Garigliano radwaste repository in Sessa Aurunca (CE, Italy). Such a development is currently under way, with the installation foreseen within 2014.
Characterization of Heat-treated Clay Minerals in the Context of Nuclear Waste Disposal
NASA Astrophysics Data System (ADS)
Matteo, E. N.; Wang, Y.; Kruichak, J. N.; Mills, M. M.
2015-12-01
Clay minerals are likely candidates to aid in nuclear waste isolation due to their low permeability, favorable swelling properties, and high cation sorption capacities. Establishing the thermal limit for clay minerals in a nuclear waste repository is a potentially important component of repository design, as flexibility of the heat load within the repository can have a major impact on the selection of repository design. For example, the thermal limit plays a critical role in the time that waste packages would need to cool before being transferred to the repository. Understanding the chemical and physical changes, if any, that occur in clay minerals at various temperatures above the current thermal limit (of 100 °C) can provide decision-makers with information critical to evaluating the potential trade-offs of increasing the thermal limit within the repository. Most critical is gaining understanding of how varying thermal conditions in the repository will impact radionuclide sorption and transport in clay materials either as engineered barriers or as disposal media. A variety of repository-relevant clay minerals (illite, mixed layer illite/smectite, and montmorillonite) were heated at temperatures between 100 and 1000 °C. These samples were characterized to determine surface area, mineralogical alteration, and cation exchange capacity (CEC). Our results show that for conditions up to 500 °C, no significant change occurs, so long as the clay mineral remains mineralogically intact. At temperatures above 500 °C, transformation of the layered silicates into silica phases leads to alteration that impacts important clay characteristics. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND Number: SAND2015-6524 A
Test Plan: WIPP bin-scale CH TRU waste tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molecke, M.A.
1990-08-01
This WIPP Bin-Scale CH TRU Waste Test program described herein will provide relevant composition and kinetic rate data on gas generation and consumption resulting from TRU waste degradation, as impacted by synergistic interactions due to multiple degradation modes, waste form preparation, long-term repository environmental effects, engineered barrier materials, and, possibly, engineered modifications to be developed. Similar data on waste-brine leachate compositions and potentially hazardous volatile organic compounds released by the wastes will also be provided. The quantitative data output from these tests and associated technical expertise are required by the WIPP Performance Assessment (PA) program studies, and for the scientific benefit of the overall WIPP project. This Test Plan describes the necessary scientific and technical aspects, justifications, and rationale for successfully initiating and conducting the WIPP Bin-Scale CH TRU Waste Test program. This Test Plan is the controlling scientific design definition and overall requirements document for this WIPP in situ test, as defined by Sandia National Laboratories (SNL), scientific advisor to the US Department of Energy, WIPP Project Office (DOE/WPO). 55 refs., 16 figs., 19 tabs.
Native American Art and Culture: Documentary Resources.
ERIC Educational Resources Information Center
Lawrence, Deirdre
1992-01-01
Presents a brief overview of the evolution of documentary material of Native American cultures and problems confronted by researchers in locating relevant information. Bibliographic sources for research are discussed and a directory of major repositories of Native American art documentation is provided. (EA)
High Integrity Can Design Interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaber, E.L.
1998-08-01
The National Spent Nuclear Fuel Program is chartered with facilitating the disposition of DOE-owned spent nuclear fuel to allow disposal at a geologic repository. This is done through coordination with the repository program and by assisting DOE Site owners of SNF with needed information, standardized requirements, packaging approaches, etc. The High Integrity Can (HIC) will be manufactured to provide a substitute or barrier enhancement for normal fuel geometry and cladding. The can would be nested inside the DOE standardized canister which is designed to interface with the repository waste package. The HIC approach may provide the following benefits over typical canning approaches for DOE SNF. (a) It allows ready calculation and management of criticality issues for miscellaneous. (b) It segments and further isolates damaged or otherwise problem materials from normal SNF in the repository package. (c) It provides a very long term corrosion barrier. (d) It provides an extra internal pressure barrier for particulates, gaseous fission products, hydrogen, and water vapor. (e) It delays any potential release of fission products to the repository environment. (f) It maintains an additional level of fuel geometry control during design basis accidents, rock-fall, and seismic events. (g) When seal welded, it could provide the additional containment required for shipments involving plutonium content in excess of 20 Ci (10 CFR 71.63(b)) if integrated with an appropriate cask design. Long term corrosion protection is central to the HIC concept. The material selected for the HIC (Hastelloy C-22) has undergone extensive testing for repository service. The most severe theoretical interactions between iron, repository water containing chlorides, and other repository construction materials have been tested. These expected chemical species have not been shown capable of corroding the selected HIC material. Therefore, the HIC should provide a significant barrier to DOE SNF dispersal long after most commercial SNF has degraded and begun moving into the repository environment.
DataMed - an open source discovery index for finding biomedical datasets.
Chen, Xiaoling; Gururaj, Anupama E; Ozyurt, Burak; Liu, Ruiling; Soysal, Ergin; Cohen, Trevor; Tiryaki, Firat; Li, Yueling; Zong, Nansu; Jiang, Min; Rogith, Deevakar; Salimi, Mandana; Kim, Hyeon-Eui; Rocca-Serra, Philippe; Gonzalez-Beltran, Alejandra; Farcas, Claudiu; Johnson, Todd; Margolis, Ron; Alter, George; Sansone, Susanna-Assunta; Fore, Ian M; Ohno-Machado, Lucila; Grethe, Jeffrey S; Xu, Hua
2018-01-13
Finding relevant datasets is important for promoting data reuse in the biomedical domain, but it is challenging given the volume and complexity of biomedical data. Here we describe the development of an open source biomedical data discovery system called DataMed, with the goal of promoting the building of additional data indexes in the biomedical domain. DataMed, which can efficiently index and search diverse types of biomedical datasets across repositories, is developed through the National Institutes of Health-funded biomedical and healthCAre Data Discovery Index Ecosystem (bioCADDIE) consortium. It consists of 2 main components: (1) a data ingestion pipeline that collects and transforms original metadata information to a unified metadata model, called DatA Tag Suite (DATS), and (2) a search engine that finds relevant datasets based on user-entered queries. In addition to describing its architecture and techniques, we evaluated individual components within DataMed, including the accuracy of the ingestion pipeline, the prevalence of the DATS model across repositories, and the overall performance of the dataset retrieval engine. Our manual review shows that the ingestion pipeline could achieve an accuracy of 90% and core elements of DATS had varied frequency across repositories. On a manually curated benchmark dataset, the DataMed search engine achieved an inferred average precision of 0.2033 and a precision at 10 (P@10, the number of relevant results in the top 10 search results) of 0.6022, by implementing advanced natural language processing and terminology services. Currently, we have made the DataMed system publicly available as an open source package for the biomedical community. © The Author 2018. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
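The retrieval metrics quoted in this abstract can be made concrete with a small sketch. Note the hedge: the exhaustively-judged form of average precision is shown below, whereas bioCADDIE reported an "inferred" variant estimated from sampled relevance judgments, and the dataset IDs here are invented.

```python
def precision_at_k(relevant, ranked, k=10):
    """Fraction of the top-k ranked results that are relevant."""
    return sum(1 for d in ranked[:k] if d in relevant) / k

def average_precision(relevant, ranked):
    """Mean of the precision values at each rank where a relevant
    document is retrieved (the exhaustively-judged form of AP)."""
    hits, total = 0, 0.0
    for i, d in enumerate(ranked, start=1):
        if d in relevant:
            hits += 1
            total += hits / i
    return total / len(relevant) if relevant else 0.0

relevant = {"ds1", "ds4", "ds9"}
ranked = ["ds1", "ds2", "ds4", "ds5", "ds9",
          "ds7", "ds8", "ds3", "ds6", "ds0"]
print(precision_at_k(relevant, ranked, k=10))  # 0.3
print(average_precision(relevant, ranked))     # hits at ranks 1, 3, 5
```

Averaging these per-query values over a benchmark query set yields the system-level figures reported above.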
Safeguarding Structural Data Repositories against Bad Apples.
Minor, Wladek; Dauter, Zbigniew; Helliwell, John R; Jaskolski, Mariusz; Wlodawer, Alexander
2016-02-02
Structural biology research generates large amounts of data, some deposited in public databases or repositories, but a substantial remainder never becomes available to the scientific community. In addition, some of the deposited data contain errors, of varying severity, that may bias the results of data mining. Thorough analysis and discussion of these problems is needed to ameliorate this situation. This perspective is an attempt to propose some solutions and encourage both further discussion and action on the part of the relevant organizations, in particular the PDB and various bodies of the International Union of Crystallography. Copyright © 2016 Elsevier Ltd. All rights reserved.
Geoengineering properties of potential repository units at Yucca Mountain, southern Nevada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tillerson, J.R.; Nimick, F.B.
1984-12-01
The Nevada Nuclear Waste Storage Investigations (NNWSI) Project is currently evaluating volcanic tuffs at the Yucca Mountain site, located on and adjacent to the Nevada Test Site, for possible use as a host rock for a radioactive waste repository. The behavior of tuff as an engineering material must be understood to design, license, construct, and operate a repository. Geoengineering evaluations and measurements are being made to develop confidence in both the analysis techniques for thermal, mechanical, and hydrothermal effects and the supporting data base of rock properties. The analysis techniques and the data base are currently used for repository design, waste package design, and performance assessment analyses. This report documents the data base of geoengineering properties used in the analyses that aided the selection of the waste emplacement horizon and in analyses synopsized in the Environmental Assessment Report prepared for the Yucca Mountain site. The strategy used for the development of the data base relies primarily on data obtained in laboratory tests that are then confirmed in field tests. Average thermal and mechanical properties (and their anticipated variations) are presented. Based upon these data, analyses completed to date, and previous excavation experience in tuff, it is anticipated that existing mining technology can be used to develop stable underground openings and that repository operations can be carried out safely.
Datasets2Tools, repository and search engine for bioinformatics datasets, tools and canned analyses
Torre, Denis; Krawczuk, Patrycja; Jagodnik, Kathleen M.; Lachmann, Alexander; Wang, Zichen; Wang, Lily; Kuleshov, Maxim V.; Ma’ayan, Avi
2018-01-01
Biomedical data repositories such as the Gene Expression Omnibus (GEO) enable the search and discovery of relevant biomedical digital data objects. Similarly, resources such as OMICtools, index bioinformatics tools that can extract knowledge from these digital data objects. However, systematic access to pre-generated ‘canned’ analyses applied by bioinformatics tools to biomedical digital data objects is currently not available. Datasets2Tools is a repository indexing 31,473 canned bioinformatics analyses applied to 6,431 datasets. The Datasets2Tools repository also contains the indexing of 4,901 published bioinformatics software tools, and all the analyzed datasets. Datasets2Tools enables users to rapidly find datasets, tools, and canned analyses through an intuitive web interface, a Google Chrome extension, and an API. Furthermore, Datasets2Tools provides a platform for contributing canned analyses, datasets, and tools, as well as evaluating these digital objects according to their compliance with the findable, accessible, interoperable, and reusable (FAIR) principles. By incorporating community engagement, Datasets2Tools promotes sharing of digital resources to stimulate the extraction of knowledge from biomedical research data. Datasets2Tools is freely available from: http://amp.pharm.mssm.edu/datasets2tools. PMID:29485625
PDB-wide identification of biological assemblies from conserved quaternary structure geometry.
Dey, Sucharita; Ritchie, David W; Levy, Emmanuel D
2018-01-01
Protein structures are key to understanding biomolecular mechanisms and diseases, yet their interpretation is hampered by limited knowledge of their biologically relevant quaternary structure (QS). A critical challenge in inferring QS information from crystallographic data is distinguishing biological interfaces from fortuitous crystal-packing contacts. Here, we tackled this problem by developing strategies for aligning and comparing QS states across both homologs and data repositories. QS conservation across homologs proved remarkably strong at predicting biological relevance and is implemented in two methods, QSalign and anti-QSalign, for annotating homo-oligomers and monomers, respectively. QS conservation across repositories is implemented in QSbio (http://www.QSbio.org), which approaches the accuracy of manual curation and allowed us to predict >100,000 QS states across the Protein Data Bank. Based on this high-quality data set, we analyzed pairs of structurally conserved interfaces, and this analysis revealed a striking plasticity whereby evolutionarily distant interfaces maintain similar interaction geometries through widely divergent chemical properties.
75 FR 71133 - National Institute of Mental Health; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-22
... Emphasis Panel; Competitive Revision for Stem Cell Repository Relevant to Mental Disorders. Date: December... Domestic Assistance Program Nos. 93.242, Mental Health Research Grants; 93.281, Scientist Development Award, Scientist Development Award for Clinicians, and Research Scientist Award; 93.282, Mental Health National...
20 CFR 418.3125 - What are redeterminations?
Code of Federal Regulations, 2010 CFR
2010-04-01
... from data exchanges with Federal and State agencies that may affect whether you should receive a full... random sample of cases for quality assurance purposes. For each collection of sample cases, all factors... with primary repositories of information relevant to each individual factor (e.g., we may contact...
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.; de Sherbinin, A. M.
2017-12-01
Growing recognition of the importance of sharing scientific data more widely and openly has refocused attention on the state of data repositories, including both discipline- or topic-oriented data centers and institutional repositories. Data creators often have several alternatives for depositing and disseminating their natural, social, health, or engineering science data. In selecting a repository for their data, data creators and other stakeholders such as their funding agencies may wish to consider the user community or communities served, the type and quality of data products already offered, and the degree of data stewardship and associated services provided. Some data repositories serve general communities, e.g., those in their host institution or region, whereas others tailor their services to particular scientific disciplines or topical areas. Some repositories are selective when acquiring data and conduct extensive curation and reviews to ensure that data products meet quality standards. Many repositories have secured credentials and established a track record for providing trustworthy, high quality data and services. The NASA Socioeconomic Data and Applications Center (SEDAC) serves users interested in human-environment interactions, including researchers, students, and applied users from diverse sectors. SEDAC is selective when choosing data for dissemination, conducting several reviews of data products and services prior to release. SEDAC works with data producers to continually improve the quality of its open data products and services. As a Distributed Active Archive Center (DAAC) of the NASA Earth Observing System Data and Information System, SEDAC is committed to improving the accessibility, interoperability, and usability of its data in conjunction with data available from other DAACs, as well as other relevant data sources. SEDAC is certified as a Regular Member of the International Council for Science World Data System (ICSU-WDS).
A biosphere assessment of high-level radioactive waste disposal in Sweden.
Kautsky, Ulrik; Lindborg, Tobias; Valentin, Jack
2015-04-01
Licence applications to build a repository for the disposal of Swedish spent nuclear fuel have been lodged, underpinned by myriad reports and several broader reviews. This paper sketches out the technical and administrative aspects and highlights a recent review of the biosphere effects of a potential release from the repository. A comprehensive database and an understanding of major fluxes and pools of water and organic matter in the landscape let one envisage the future by looking at older parts of the site. Thus, today's biosphere is used as a natural analogue of possible future landscapes. It is concluded that the planned repository can meet the safety criteria and will have no detectable radiological impact on plants and animals. This paper also briefly describes biosphere work undertaken after the review. The multidisciplinary approach used is relevant in a much wider context and may prove beneficial across many environmental contexts. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Deck, John; Gaither, Michelle R; Ewing, Rodney; Bird, Christopher E; Davies, Neil; Meyer, Christopher; Riginos, Cynthia; Toonen, Robert J; Crandall, Eric D
2017-08-01
The Genomic Observatories Metadatabase (GeOMe, http://www.geome-db.org/) is an open access repository for geographic and ecological metadata associated with biosamples and genetic data. Whereas public databases have served as vital repositories for nucleotide sequences, they do not accession all the metadata required for ecological or evolutionary analyses. GeOMe fills this need, providing a user-friendly, web-based interface for both data contributors and data recipients. The interface allows data contributors to create a customized yet standard-compliant spreadsheet that captures the temporal and geospatial context of each biosample. These metadata are then validated and permanently linked to archived genetic data stored in the National Center for Biotechnology Information's (NCBI's) Sequence Read Archive (SRA) via unique persistent identifiers. By linking ecologically and evolutionarily relevant metadata with publicly archived sequence data in a structured manner, GeOMe sets a gold standard for data management in biodiversity science.
García-de-León-Chocano, Ricardo; Muñoz-Soler, Verónica; Sáez, Carlos; García-de-León-González, Ricardo; García-Gómez, Juan M
2016-04-01
This is the second in a series of two papers regarding the construction of data quality (DQ) assured repositories, based on population data from Electronic Health Records (EHR), for the reuse of information on infant feeding from birth until the age of two. This second paper describes the application of the computational process of constructing the first quality-assured repository for the reuse of information on infant feeding in the perinatal period, with the aim of studying relevant questions from the Baby Friendly Hospital Initiative (BFHI) and monitoring its deployment in our hospital. The construction of the repository was carried out using 13 semi-automated procedures to assess, recover or discard clinical data. The initial information consisted of perinatal forms from EHR related to 2048 births (Facts of Study, FoS) between 2009 and 2011, with a total of 433,308 observations of 223 variables. DQ was measured before and after the procedures using metrics related to eight quality dimensions: predictive value, correctness, duplication, consistency, completeness, contextualization, temporal-stability, and spatial-stability. Once the predictive variables were selected and DQ was assured, the final repository consisted of 1925 births, 107,529 observations and 73 quality-assured variables. The amount of discarded observations mainly corresponds to observations of non-predictive variables (52.90%) and the impact of the de-duplication process (20.58%) with respect to the total input data. Seven out of thirteen procedures achieved 100% of valid births, observations and variables. Moreover, 89% of births and ~98% of observations were consistent according to the experts' criteria. A multidisciplinary approach along with the quantification of DQ has allowed us to construct the first repository about infant feeding in the perinatal period based on EHR population data. Copyright © 2016 Elsevier Ltd. All rights reserved.
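Two of the data-quality dimensions mentioned above, completeness and duplication, can be illustrated with a minimal sketch. The record fields and values below are hypothetical, not the paper's actual 13 procedures or variables.

```python
def completeness(records, variables):
    """Share of (record, variable) cells that are filled."""
    total = len(records) * len(variables)
    filled = sum(1 for r in records for v in variables
                 if r.get(v) not in (None, ""))
    return filled / total if total else 0.0

def duplication_rate(records, key_vars):
    """Share of records whose key-variable tuple repeats an
    earlier record (candidates for de-duplication)."""
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(v) for v in key_vars)
        if key in seen:
            dupes += 1
        else:
            seen.add(key)
    return dupes / len(records) if records else 0.0

records = [
    {"id": 1, "feeding": "breast", "weight": 3.2},
    {"id": 1, "feeding": "breast", "weight": 3.2},  # duplicate entry
    {"id": 2, "feeding": None,     "weight": 3.5},  # missing value
]
print(completeness(records, ["feeding", "weight"]))  # 5/6 ≈ 0.833
print(duplication_rate(records, ["id"]))             # 1/3 ≈ 0.333
```

Measuring such metrics before and after each cleaning procedure, as the paper does across eight dimensions, makes the effect of every step auditable.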
10 CFR 60.51 - License amendment for permanent closure.
Code of Federal Regulations, 2014 CFR
2014-01-01
... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...
10 CFR 60.51 - License amendment for permanent closure.
Code of Federal Regulations, 2013 CFR
2013-01-01
... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...
10 CFR 60.51 - License amendment for permanent closure.
Code of Federal Regulations, 2010 CFR
2010-01-01
... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...
10 CFR 60.51 - License amendment for permanent closure.
Code of Federal Regulations, 2011 CFR
2011-01-01
... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...
10 CFR 60.51 - License amendment for permanent closure.
Code of Federal Regulations, 2012 CFR
2012-01-01
... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...
10 CFR 60.133 - Additional design criteria for the underground facility.
Code of Federal Regulations, 2012 CFR
2012-01-01
... specific site conditions identified through in situ monitoring, testing, or excavation. (c) Retrieval of waste. The underground facility shall be designed to permit retrieval of waste in accordance with the... RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository...
10 CFR 60.133 - Additional design criteria for the underground facility.
Code of Federal Regulations, 2013 CFR
2013-01-01
... specific site conditions identified through in situ monitoring, testing, or excavation. (c) Retrieval of waste. The underground facility shall be designed to permit retrieval of waste in accordance with the... RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository...
10 CFR 60.133 - Additional design criteria for the underground facility.
Code of Federal Regulations, 2011 CFR
2011-01-01
... specific site conditions identified through in situ monitoring, testing, or excavation. (c) Retrieval of waste. The underground facility shall be designed to permit retrieval of waste in accordance with the... RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository...
10 CFR 60.133 - Additional design criteria for the underground facility.
Code of Federal Regulations, 2014 CFR
2014-01-01
... specific site conditions identified through in situ monitoring, testing, or excavation. (c) Retrieval of waste. The underground facility shall be designed to permit retrieval of waste in accordance with the... RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository...
[Self-archiving of biomedical papers in open access repositories].
Abad-García, M Francisca; Melero, Remedios; Abadal, Ernest; González-Teruel, Aurora
2010-04-01
Open-access literature is digital, online, free of charge, and free of most copyright and licensing restrictions. Self-archiving, the deposit of scholarly outputs in institutional repositories (the open-access green route), is increasingly present in the activities of the scientific community. Besides the benefits of open access for the visibility and dissemination of science, funding agencies increasingly require the deposit of papers and other documents in repositories. In the biomedical environment this is even more relevant, given the impact scientific literature can have on public health. However, to make self-archiving feasible, authors should be aware of its meaning and the terms under which they are allowed to archive their works. To that end, tools like Sherpa/RoMEO or DULCINEA (both directories of the copyright licences of scientific journals) let authors find out what rights they retain when they publish a paper and whether self-archiving is permitted. PubMed Central and its British and Canadian counterparts are the main thematic repositories for the biomedical fields. In Spain there is no repository of a similar nature, but most universities and the CSIC have already created their own institutional repositories. The increased visibility of research results, and their consequently greater and earlier citation, is among the most frequently cited advantages of open access; the removal of economic barriers to information access is a further benefit that breaks down borders between groups.
A strategy to establish Food Safety Model Repositories.
Plaza-Rodríguez, C; Thoens, C; Falenski, A; Weiser, A A; Appel, B; Kaesbohrer, A; Filter, M
2015-07-02
Transferring the knowledge of predictive microbiology into real world food manufacturing applications is still a major challenge for the whole food safety modelling community. To facilitate this process, a strategy for creating open, community-driven, web-based predictive microbial model repositories is proposed. These collaborative model resources could significantly improve the transfer of knowledge from research into commercial and governmental applications and also increase efficiency, transparency and usability of predictive models. To demonstrate the feasibility, predictive models of Salmonella in beef previously published in the scientific literature were re-implemented using an open source software tool called PMM-Lab. The models were made publicly available in a Food Safety Model Repository within the OpenML for Predictive Modelling in Food community project. Three different approaches were used to create new models in the model repositories: (1) all information relevant for model re-implementation is available in a scientific publication, (2) model parameters can be imported from tabular parameter collections and (3) models have to be generated from experimental data or primary model parameters. All three approaches were demonstrated in the paper. The sample Food Safety Model Repository is available via: http://sourceforge.net/projects/microbialmodelingexchange/files/models and the PMM-Lab software can be downloaded from http://sourceforge.net/projects/pmmlab/. This work also illustrates that a standardized information exchange format for predictive microbial models, as the key component of this strategy, could be established by adoption of resources from the Systems Biology domain. Copyright © 2015. Published by Elsevier B.V.
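A "primary model" of the kind re-implemented in PMM-Lab typically describes the log increase in cell count over time. As a hedged illustration, the sketch below uses the generic modified Gompertz equation (Zwietering et al.), a widely used primary growth model; the parameter values are invented and are not taken from the Salmonella models in the paper.

```python
import math

def gompertz_log_growth(t, A, mu_max, lag):
    """Modified Gompertz primary growth model:
    y(t) = A * exp(-exp(mu_max*e/A * (lag - t) + 1))

    Returns the log-increase y(t) = ln(N(t)/N0).
    A      -- asymptotic log-increase (dimensionless)
    mu_max -- maximum specific growth rate (1/h)
    lag    -- lag time (h)
    """
    return A * math.exp(-math.exp(mu_max * math.e / A * (lag - t) + 1.0))

# Hypothetical parameters for illustration only.
A, mu_max, lag = 8.0, 0.5, 4.0
for t in (0, 4, 12, 48):
    print(f"t={t:>2} h  y={gompertz_log_growth(t, A, mu_max, lag):.3f}")
```

Storing such a model in a repository then amounts to archiving the equation form plus the fitted (A, mu_max, lag) triple with its experimental conditions, which is what a standardized exchange format needs to capture.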
Mining functionally relevant gene sets for analyzing physiologically novel clinical expression data.
Turcan, Sevin; Vetter, Douglas E; Maron, Jill L; Wei, Xintao; Slonim, Donna K
2011-01-01
Gene set analyses have become a standard approach for increasing the sensitivity of transcriptomic studies. However, analytical methods incorporating gene sets require the availability of pre-defined gene sets relevant to the underlying physiology being studied. For novel physiological problems, relevant gene sets may be unavailable or existing gene set databases may bias the results towards only the best-studied of the relevant biological processes. We describe a successful attempt to mine novel functional gene sets for translational projects where the underlying physiology is not necessarily well characterized in existing annotation databases. We choose targeted training data from public expression data repositories and define new criteria for selecting biclusters to serve as candidate gene sets. Many of the discovered gene sets show little or no enrichment for informative Gene Ontology terms or other functional annotation. However, we observe that such gene sets show coherent differential expression in new clinical test data sets, even if derived from different species, tissues, and disease states. We demonstrate the efficacy of this method on a human metabolic data set, where we discover novel, uncharacterized gene sets that are diagnostic of diabetes, and on additional data sets related to neuronal processes and human development. Our results suggest that our approach may be an efficient way to generate a collection of gene sets relevant to the analysis of data for novel clinical applications where existing functional annotation is relatively incomplete.
77 FR 65177 - Swap Data Repositories: Interpretative Statement Regarding the Confidentiality and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-25
... participation in standard-setting bodies to develop international standards relevant to the swap markets. Cloud Strategix, LLC (``Cloud Strategix''), representing the data hosting and cloud computing industry, in... Roundtable, June 6, 2012; (iii) Cloud Strategix, LLC, June 5, 2012; and (iv) the Depository Trust & Clearing...
75 FR 13099 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-18
....gov . Follow the instructions for submitting comments. * Mail: Federal Docket Management System Office... Liaison Officer, Department of Defense. A0600-20 DCS, G-1 SYSTEM NAME: Sexual Assault Data Management... of relevant data to the Sexual Assault Data Management System (SADMS) and a centralized repository of...
re3data.org - a global registry of research data repositories
NASA Astrophysics Data System (ADS)
Pampel, Heinz; Vierkant, Paul; Elger, Kirsten; Bertelmann, Roland; Witt, Michael; Schirmbacher, Peter; Rücknagel, Jessika; Kindling, Maxi; Scholze, Frank; Ulrich, Robert
2016-04-01
re3data.org - the registry of research data repositories - lists over 1,400 research data repositories from all over the world, making it the largest and most comprehensive online catalog of research data repositories on the web. The registry is a valuable tool for researchers, funding organizations, publishers and libraries. re3data.org provides detailed information about research data repositories, and its distinctive icons help researchers easily identify relevant repositories for accessing and depositing data sets [1]. Funding agencies like the European Commission [2] and research institutions like the University of Bielefeld [3] already recommend the use of re3data.org in their guidelines and policies. Several publishers and journals, such as Copernicus Publications, PeerJ, and Nature's Scientific Data, recommend re3data.org in their editorial policies as a tool for the easy identification of appropriate data repositories to store research data. Project partners in re3data.org are the Library and Information Services department (LIS) of the GFZ German Research Centre for Geosciences, the Computer and Media Service at the Humboldt-Universität zu Berlin, the Purdue University Libraries and the KIT Library at the Karlsruhe Institute of Technology (KIT). After merging with the U.S.-based DataBib in 2014, re3data.org continues from 2016 on as a service of DataCite, the international organization for the registration of Digital Object Identifiers (DOIs) for research data, which aims to improve their citation. The poster describes the current status and future plans of re3data.org. [1] Pampel H, et al. (2013) Making Research Data Repositories Visible: The re3data.org Registry. PLoS ONE 8(11): e78080. doi:10.1371/journal.pone.0078080. [2] European Commission (2015): Guidelines on Open Access to Scientific Publications and Research Data in Horizon 2020.
Available: http://ec.europa.eu/research/participants/data/ref/h2020/grants_manual/hi/oa_pilot/h2020-hi-oa-pilot-guide_en.pdf Accessed 11 January 2016. [3] Bielefeld University (2013): Resolution on Research Data Management. Available: http://data.uni-bielefeld.de/en/resolution Accessed 11 January 2016.
Institutional Repositories: The Experience of Master's and Baccalaureate Institutions
ERIC Educational Resources Information Center
Markey, Karen; St. Jean, Beth; Soo, Young Rieh; Yakel, Elizabeth; Kim, Jihyun
2008-01-01
In 2006, MIRACLE Project investigators surveyed library directors at all U.S. academic institutions about their activities in planning, pilot testing, and implementing institutional repositories on their campuses. Of 446 respondents, 289 (64.8 percent) were from master's and baccalaureate institutions (M&BIs) where few operational…
Gorgolewski, Krzysztof J; Varoquaux, Gael; Rivera, Gabriel; Schwartz, Yannick; Sochat, Vanessa V; Ghosh, Satrajit S; Maumet, Camille; Nichols, Thomas E; Poline, Jean-Baptiste; Yarkoni, Tal; Margulies, Daniel S; Poldrack, Russell A
2016-01-01
NeuroVault.org is dedicated to storing outputs of analyses in the form of statistical maps, parcellations and atlases, a unique strategy that contrasts with most neuroimaging repositories that store raw acquisition data or stereotaxic coordinates. Such maps are indispensable for performing meta-analyses, validating novel methodology, and deciding on precise outlines for regions of interest (ROIs). NeuroVault is open to maps derived from both healthy and clinical populations, as well as from various imaging modalities (sMRI, fMRI, EEG, MEG, PET, etc.). The repository uses modern web technologies such as interactive web-based visualization, cognitive decoding, and comparison with other maps to provide researchers with efficient, intuitive tools to improve the understanding of their results. Each dataset and map is assigned a permanent Universal Resource Locator (URL), and all of the data is accessible through a REST Application Programming Interface (API). Additionally, the repository supports the NIDM-Results standard and has the ability to parse outputs from popular FSL and SPM software packages to automatically extract relevant metadata. This ease of use, modern web-integration, and pioneering functionality holds promise to improve the workflow for making inferences about and sharing whole-brain statistical maps. Copyright © 2015 Elsevier Inc. All rights reserved.
Yang, Changbing; Samper, Javier; Molinero, Jorge; Bonilla, Mercedes
2007-08-15
Dissolved oxygen (DO) left in the voids of buffer and backfill materials of a deep geological high level radioactive waste (HLW) repository could cause canister corrosion. Available data from laboratory and in situ experiments indicate that microbes play a substantial role in controlling redox conditions near a HLW repository. This paper presents the application of a coupled hydro-bio-geochemical model to evaluate geochemical and microbial consumption of DO in bentonite porewater after backfilling of a HLW repository designed according to the Swedish reference concept. In addition to geochemical reactions, the model accounts for dissolved organic carbon (DOC) respiration and methane oxidation. Parameters for microbial processes were derived from calibration of the REX in situ experiment carried out at the Aspö underground laboratory. The role of geochemical and microbial processes in consuming DO is evaluated for several scenarios. Numerical results show that both geochemical and microbial processes are relevant for DO consumption. However, the time needed to consume the DO trapped in the bentonite buffer decreases dramatically from several hundreds of years when only geochemical processes are considered to a few weeks when both geochemical reactions and microbially-mediated DOC respiration and methane oxidation are taken into account simultaneously.
Reactive transport studies at the Raymond Field Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freifeld, B.; Karasaki, K.; Solbau, R.
1995-12-01
To ensure the safety of a nuclear waste repository, an understanding of the transport of radionuclides from the repository near field to the biosphere is necessary. At the Raymond Field Site, in Raymond, California, tracer tests are being conducted to test characterization methods for fractured media and to evaluate the equipment and tracers that will be used for Yucca Mountain's fracture characterization. Recent tracer tests at Raymond have used reactive cations to demonstrate transport with sorption. A convective-dispersive model was used to simulate a two-well recirculating test with reasonable results. However, when the same model was used to simulate a radially convergent tracer test, the model poorly predicted the actual test data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitschkowetz, N.; Vickers, D.L.
This report provides a summary of the Computer-aided Acquisition and Logistic Support (CALS) Test Network (CTN) Laboratory Acceptance Test (LAT) and User Application Test (UAT) activities undertaken to evaluate the CALS capabilities being implemented as part of the Department of Defense (DOD) engineering repositories. Although the individual testing activities provided detailed reports for each repository, a synthesis of the results, conclusions, and recommendations is offered to provide a more concise presentation of the issues and the strategies, as viewed from the CTN perspective.
French Geological Repository Project for High Level and Long-Lived Waste: Scientific Programme
DOE Office of Scientific and Technical Information (OSTI.GOV)
Landais, P.; Lebon, P.; Ouzounian, G.
2008-07-01
The feasibility study presented in the Dossier 2005 Argile set out to evaluate the conditions for building, operating and managing a reversible disposal facility. The research was directed at demonstrating a potential for confining long-lived radioactive waste in a deep clay formation by establishing the feasibility of the disposal principle. The results were sufficiently convincing that a Planning Act was passed on 28 June, 2006. A decision in principle has been taken to dispose of intermediate- and high-level long-lived radioactive waste in a geological repository. An application file for a license to construct a disposal facility is requested by the end of 2014, and its commissioning is planned for 2025. Based on previous results as well as on recommendations made by various Dossier 2005 evaluators, a new scientific programme for 2006-2015 has been defined. It gives details of what will be covered over the 2006-2015 period. Particular emphasis is placed on consolidating scientific data, increasing understanding of certain mechanisms and following a scientific and technical integration approach that combines scientific developments and engineering advances. The scientific work envisaged beyond 2006 has the benefit of a unique context: direct access to the geological medium over long timescales. It naturally extends the research carried out to date, and incorporates additional investigations of the geological medium and the preparation of demonstration work, especially through full-scale tests. The results will aim at improving the representation of repository evolutions over time, extracting the relevant parameters for monitoring during the reversibility phases, reducing the parametric uncertainties and enhancing the robustness of models for performance calculations and safety analyses. The structure and main orientation of the ongoing scientific programme are presented. (author)
Attribute Weighting Based K-Nearest Neighbor Using Gain Ratio
NASA Astrophysics Data System (ADS)
Nababan, A. A.; Sitompul, O. S.; Tulus
2018-04-01
K-Nearest Neighbor (KNN) is a good classifier, but several studies have shown that the accuracy of KNN is still lower than that of other methods. One cause of this low accuracy is that each attribute has the same influence on the classification process, so less relevant attributes can lead to misclassification of the class assignment for new data. In this research, we propose Attribute Weighting Based K-Nearest Neighbor Using Gain Ratio, in which the Gain Ratio measures the correlation between each attribute and the class and serves as the basis for weighting each attribute of the dataset. The resulting accuracy is compared to that of the original KNN method using 10-fold cross-validation on several datasets from the UCI Machine Learning Repository and the KEEL-Dataset Repository: abalone, glass identification, haberman, hayes-roth and water quality status. Based on the test results, the proposed method was able to increase the classification accuracy of KNN; the largest accuracy gain, 12.73%, was obtained on the hayes-roth dataset, and the smallest, 0.07%, on the abalone dataset. Across all datasets, accuracy increased by 5.33% on average.
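The weighting scheme described in this abstract can be sketched as follows: compute the gain ratio of each attribute against the class labels, then use those values as per-attribute weights inside the KNN distance. This is a minimal illustration, not the authors' implementation; the toy dataset and the assumption of already-discrete attribute values are illustrative choices.

```python
# Sketch of gain-ratio-weighted KNN. The dataset and the use of discrete
# attribute values (no discretization step) are illustrative assumptions.
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(values, labels):
    """Information gain of a discrete attribute divided by its split info."""
    n = len(labels)
    cond = 0.0
    split_info = 0.0
    for v in set(values):
        subset = [l for x, l in zip(values, labels) if x == v]
        p = len(subset) / n
        cond += p * entropy(subset)
        split_info -= p * math.log2(p)
    gain = entropy(labels) - cond
    return gain / split_info if split_info > 0 else 0.0

def predict(train_X, train_y, weights, query, k=3):
    """Majority vote over the k neighbors under a weighted Euclidean distance."""
    dists = sorted(
        (math.sqrt(sum(w * (a - b) ** 2 for w, a, b in zip(weights, row, query))), label)
        for row, label in zip(train_X, train_y)
    )
    return Counter(label for _, label in dists[:k]).most_common(1)[0][0]

# Tiny dataset: attribute 0 determines the class, attribute 1 is noise.
X = [(0, 5), (0, 9), (1, 5), (1, 9), (0, 7), (1, 7)]
y = ["a", "a", "b", "b", "a", "b"]
weights = [gain_ratio([row[i] for row in X], y) for i in range(2)]
print(weights)                    # attribute 0 weight is 1.0, the noise attribute ~0
print(predict(X, y, weights, (0, 9)))  # prints "a"
```

With equal weights the noisy second attribute could pull in wrong-class neighbors; the gain-ratio weights suppress it, which is the effect the abstract reports.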
Thiele, H.; Glandorf, J.; Koerting, G.; Reidegeld, K.; Blüggel, M.; Meyer, H.; Stephan, C.
2007-01-01
In today's proteomics research, various techniques, instruments, and bioinformatics tools are necessary to manage the large amount of heterogeneous data, with automatic quality control, to produce reliable and comparable results. A data-processing pipeline is therefore mandatory for data validation and comparison in a data-warehousing system. The proteome bioinformatics platform ProteinScape has been proven to cover these needs. The reprocessing of HUPO BPP participants' MS data was done within ProteinScape, and the reprocessed information was transferred into the global data repository PRIDE. ProteinScape as a data-warehousing system covers two main aspects: archiving relevant data of the proteomics workflow, and information extraction functionality (protein identification, quantification and generation of biological knowledge). As a strategy for automatic data validation, different protein search engines are integrated. Result analysis is performed using a decoy database search strategy, which allows measurement of the false-positive identification rate. Peptide identifications across different workflows, different MS techniques, and different search engines are merged to obtain a quality-controlled protein list. The proteomics identifications database (PRIDE), as a public data repository, is an archiving system where data are finally stored and no longer changed by further processing steps. Data submission to PRIDE is open to proteomics laboratories generating protein and peptide identifications. An export tool has been developed for transferring all relevant HUPO BPP data from ProteinScape into PRIDE using the PRIDE.xml format. The EU-funded ProDac project will coordinate the development of software tools covering international standards for the representation of proteomics data.
The implementation of data submission pipelines and systematic data collection in public standards–compliant repositories will cover all aspects, from the generation of MS data in each laboratory to the conversion of all the annotating information and identifications to a standardized format. Such datasets can be used in the course of publishing in scientific journals.
Using Linked Open Data and Semantic Integration to Search Across Geoscience Repositories
NASA Astrophysics Data System (ADS)
Mickle, A.; Raymond, L. M.; Shepherd, A.; Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Cheatham, M.; Fils, D.; Hitzler, P.; Janowicz, K.; Jones, M.; Krisnadhi, A.; Lehnert, K. A.; Narock, T.; Schildhauer, M.; Wiebe, P. H.
2014-12-01
The MBLWHOI Library is a partner in the OceanLink project, an NSF EarthCube Building Block applying semantic technologies to enable knowledge discovery, sharing and integration. OceanLink is testing ontology design patterns that link together two data repositories, Rolling Deck to Repository (R2R) and the Biological and Chemical Oceanography Data Management Office (BCO-DMO); the MBLWHOI Library Institutional Repository (IR) Woods Hole Open Access Server (WHOAS); National Science Foundation (NSF) funded awards; and American Geophysical Union (AGU) conference presentations. The Library is collaborating with scientific users, data managers, DSpace engineers, experts in ontology design patterns, and user interface developers to make WHOAS, a DSpace repository, linked-open-data enabled. The goal is to allow searching across repositories without any of the information providers having to change how they manage their collections. The tools developed for DSpace will be made available to the community of users. There are 257 registered DSpace repositories in the United States and over 1,700 worldwide. Outcomes include: integration of DSpace with the OpenRDF Sesame triple store to provide a SPARQL endpoint for the storage and query of RDF representations of DSpace resources; mapping of DSpace resources to the OceanLink ontology; and a DSpace "data" add-on to provide a resolvable linked open data representation of DSpace resources.
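A SPARQL endpoint like the one described above is queried over HTTP and returns results in the W3C SPARQL 1.1 JSON format. The sketch below shows the general shape of such a client; the endpoint URL, the Dublin Core predicate, and the sample record are hypothetical, not taken from the OceanLink project.

```python
# Minimal sketch of querying a SPARQL endpoint and flattening the standard
# SPARQL 1.1 JSON results. Endpoint URL and sample data are hypothetical.
import json
import urllib.parse
import urllib.request

QUERY = """
SELECT ?item ?title WHERE {
  ?item <http://purl.org/dc/elements/1.1/title> ?title .
} LIMIT 10
"""

def fetch_results(endpoint, query):
    """POST a SPARQL query and return the parsed results JSON (network call)."""
    req = urllib.request.Request(
        endpoint,
        data=urllib.parse.urlencode({"query": query}).encode(),
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def flatten(results_json):
    """Turn SPARQL JSON bindings into plain {variable: value} dicts."""
    return [
        {var: cell["value"] for var, cell in binding.items()}
        for binding in results_json["results"]["bindings"]
    ]

# Canned response in the standard results layout (no network needed):
sample = {
    "head": {"vars": ["item", "title"]},
    "results": {"bindings": [
        {"item": {"type": "uri", "value": "https://example.org/whoas/item/1"},
         "title": {"type": "literal", "value": "Example WHOAS record"}},
    ]},
}
print(flatten(sample))
```

Because the results format is standardized, the same `flatten` helper works against any conforming endpoint, which is what makes cross-repository search feasible without changes on the providers' side.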
Ontology-Based Annotation of Learning Object Content
ERIC Educational Resources Information Center
Gasevic, Dragan; Jovanovic, Jelena; Devedzic, Vladan
2007-01-01
The paper proposes a framework for building ontology-aware learning object (LO) content. Previously ontologies were exclusively employed for enriching LOs' metadata. Although such an approach is useful, as it improves retrieval of relevant LOs from LO repositories, it does not enable one to reuse components of a LO, nor to incorporate an explicit…
Mining Very High Resolution INSAR Data Based On Complex-GMRF Cues And Relevance Feedback
NASA Astrophysics Data System (ADS)
Singh, Jagmal; Popescu, Anca; Soccorsi, Matteo; Datcu, Mihai
2012-01-01
With the increase in the number of remote sensing satellites, the number of image-data scenes in our repositories is also increasing, and a large proportion of these scenes are never retrieved and used. Automatic retrieval of desired image data using query by image content, to fully utilize the huge repository volume, is therefore becoming of great interest. Because different users are interested in scenes containing different kinds of objects and structures, it is important to analyze image information mining (IIM) methods so that a user can more easily select a method depending on his or her requirements. We concentrate our study on high-resolution SAR images, and we propose to use InSAR observations instead of single look complex (SLC) images alone for mining scenes containing coherent objects such as high-rise buildings. For objects with less coherence, such as areas with vegetation cover, SLC images exhibit better performance. We demonstrate an IIM performance comparison using complex Gauss-Markov Random Fields as texture descriptors for image patches and SVM relevance feedback.
NASA Astrophysics Data System (ADS)
Gautschi, Andreas
2017-09-01
In Switzerland, the Opalinus Clay - a Jurassic (Aalenian) claystone formation - has been proposed as the first-priority host rock for a deep geological repository for both low- and intermediate-level and high-level radioactive wastes. An extensive site and host rock investigation programme has been carried out during the past 30 years in Northern Switzerland, comprising extensive 2D and 3D seismic surveys, a series of deep boreholes within and around potential geological siting regions, experiments in the international Mont Terri Rock Laboratory, compilations of data from Opalinus Clay in railway and motorway tunnels and comparisons with similar rocks. The hydrogeological properties of the Opalinus Clay that are relevant from the viewpoint of long-term safety are described and illustrated. The main conclusions are supported by multiple lines of evidence, demonstrating consistency of conclusions based on hydraulic properties, porewater chemistry, distribution of natural tracers across the Opalinus Clay as well as small- and large-scale diffusion models and the derived conceptual understanding of solute transport.
NASA Astrophysics Data System (ADS)
Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.
2017-12-01
The Environmental Data Initiative (EDI) is an outgrowth of more than 30 years of information management experience and technology from LTER Network data practitioners. EDI builds upon the PASTA data repository software used by the LTER Network Information System and manages more than 42,000 data packages containing tabular data, imagery, and other formats. Development of the repository was a community process beginning in 2009 that included numerous working groups for generating use cases, system requirements, and testing of completed software, thereby creating a vested interest in its success and transparency in design. All software is available for review on GitHub, and refinements and new features are ongoing. Documentation is also available on Read the Docs, including a comprehensive description of all web-service API methods. PASTA is metadata driven and uses the Ecological Metadata Language (EML) standard for describing environmental and ecological data; a simplified Dublin Core document is also available for each data package. Data are aggregated into packages consisting of metadata and other related content described by an OAI-ORE document. Once archived, each data package becomes immutable and permanent; updates are possible through the addition of new revisions. Components of each data package are accessible through a unique identifier, while the entire data package receives a DOI that is registered in DataCite. Preservation occurs through a combination of DataONE synchronization/replication and a series of local and remote backup strategies, including daily uploads to AWS Glacier storage. Checksums are computed for all data at initial upload, with random verification occurring on a continuous basis, thus ensuring the integrity of the data. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML before they are archived; data packages that fail any test are not admitted to the repository.
These tests are a measure of data fitness, which ultimately increases confidence in data reuse and synthesis. The EDI data repository is recognized by multiple organizations, including EarthCube's Council of Data Facilities, the United States Geological Survey, FAIRsharing.org, and re3data.org, and is a PLOS- and Nature-recommended data repository.
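The checksum practice described above, computing a digest at upload and re-verifying it later during spot checks, can be sketched as follows. SHA-256 is an assumption for illustration; the abstract does not name the hash algorithm PASTA uses, and the file layout here is invented.

```python
# Sketch of upload-time checksumming with later verification.
# SHA-256 and the sample file contents are illustrative assumptions.
import hashlib
import os
import tempfile

def file_checksum(path, chunk_size=65536):
    """Stream a file through SHA-256 so large data files never load whole."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, recorded_digest):
    """A spot check compares the digest recorded at upload to a fresh one."""
    return file_checksum(path) == recorded_digest

# Example: write a small "data file", record its digest, then verify it.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"site,year,value\nA,2017,1.5\n")
digest = file_checksum(path)
print(verify(path, digest))   # True
os.remove(path)
```

Any later bit flip or truncation changes the digest, so a failed `verify` flags the package for restoration from a replica, which is the role continuous random verification plays in the repository described above.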
Deep Boreholes Seals Subjected to High P,T conditions - Proposed Experimental Studies
NASA Astrophysics Data System (ADS)
Caporuscio, F.
2015-12-01
Deep borehole experimental work will constrain the P,T conditions that "seal" material will experience in deep borehole crystalline rock repositories. The rocks of interest to this study include mafic (amphibolite) and silicic (granitic gneiss) end members. The experiments will systematically add components to capture discrete changes in both water and EBS component chemistries. Experiments in the system wall rock-clay-concrete-groundwater will evaluate interactions among components, including mineral phase stability, metal corrosion rates and thermal limits. Based on engineered barrier studies, the experimental investigations will move forward with three focuses. First, evaluation of the interaction between "seal" materials and repository wall rock (crystalline) under fluid-saturated conditions in long-term (i.e., six-month) experiments, which reproduce the thermal pulse event of a repository. Second, experiments will be performed to determine the stability of zeolite minerals (analcime-wairakite ss) under repository conditions. Both sets of experiments are critically important for understanding the mineral paragenesis (zeolite and/or clay transformations) associated with "seals" in contact with wall rock at elevated temperatures. Third, mineral growth at the metal interface is a principal control on the survivability (i.e., corrosion) of waste canisters in a repository. The objective of this planned experimental work is to evaluate physicochemical processes for "seal" components and materials relevant to deep borehole disposal. These evaluations will encompass multi-laboratory efforts for the development of seals concepts and the application of Thermal-Mechanical-Chemical (TMC) modeling to assess barrier material interactions with subsurface fluids and other barrier materials, their stability at high temperatures, and the implications of these processes for the evaluation of thermal limits.
Li, Yanjie; Polak, Urszula; Clark, Amanda D; Bhalla, Angela D; Chen, Yu-Yun; Li, Jixue; Farmer, Jennifer; Seyer, Lauren; Lynch, David; Butler, Jill S; Napierala, Marek
2016-08-01
Friedreich's ataxia (FRDA) represents a rare neurodegenerative disease caused by expansion of GAA trinucleotide repeats in the first intron of the FXN gene. The number of GAA repeats in FRDA patients varies from approximately 60 to <1000 and is tightly correlated with age of onset and severity of the disease symptoms. The heterogeneity of Friedreich's ataxia stresses the need for a large cohort of patient samples to conduct studies addressing the mechanism of disease pathogenesis or evaluate novel therapeutic candidates. Herein, we report the establishment and characterization of an FRDA fibroblast repository, which currently includes 50 primary cell lines derived from FRDA patients and seven lines from mutation carriers. These cells are also a source for generating induced pluripotent stem cell (iPSC) lines by reprogramming, as well as disease-relevant neuronal, cardiac, and pancreatic cells that can then be differentiated from the iPSCs. All FRDA and carrier lines are derived using a standard operating procedure and characterized to confirm mutation status, as well as expression of FXN mRNA and protein. Consideration and significance of creating disease-focused cell line and tissue repositories, especially in the context of rare and heterogeneous disorders, are presented. Although the economic aspect of creating and maintaining such repositories is important, the benefits of easy access to a collection of well-characterized cell lines for the purpose of drug discovery or disease mechanism studies overshadow the associated costs. Importantly, all FRDA fibroblast cell lines collected in our repository are available to the scientific community.
The U.S. Department of Energy was studying the feasibility of locating a high-level radioactive waste repository in basalt at the Hanford site in south-central Washington. This is a saturated site where ground water transport of radionuclides away from a repository is the mechani...
Uranium (VI) solubility in carbonate-free ERDA-6 brine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucchini, Jean-francois; Khaing, Hnin; Reed, Donald T
2010-01-01
When present, uranium is usually an element of importance in a nuclear waste repository. In the Waste Isolation Pilot Plant (WIPP), uranium is the most prevalent actinide component by mass, with about 647 metric tons to be placed in the repository. Therefore, the chemistry of uranium, and especially its solubility under WIPP conditions, needs to be well determined. Long-term experiments were performed to measure the solubility of uranium (VI) in carbonate-free ERDA-6 brine, a simulated WIPP brine, at pC(H+) values between 8 and 12.5. These data, obtained from the over-saturation approach, were the first repository-relevant data for the VI actinide oxidation state. The solubility trends observed pointed towards low uranium solubility in WIPP brines and a lack of amphotericity. At the expected pC(H+) in the WIPP (~9.5), measured uranium solubility approached 10^-7 M. The objective of these experiments was to establish a baseline solubility to further investigate the effects of carbonate complexation on uranium solubility in WIPP brines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
WANG,YIFENG; XU,HUIFANG
Correctly identifying the possible alteration products and accurately predicting their occurrence in a repository-relevant environment are key to the source-term calculation in a repository performance assessment. Uraninite in uranium deposits has long been used as a natural analog for spent fuel in a repository because of their chemical and structural similarity. In this paper, an SEM/AEM investigation has been conducted on a partially altered uraninite sample from the Shinkolobwe uranium ore deposit in Congo. The mineral formation sequences were identified as uraninite → uranyl hydrates → uranyl silicates → Ca-uranyl silicates, or uraninite → uranyl silicates → Ca-uranyl silicates. Reaction-path calculations were conducted for the oxidative dissolution of spent fuel in a representative Yucca Mountain groundwater. The predicted sequence is in general consistent with the SEM observations. The calculations also show that uranium carbonate minerals are unlikely to become major solubility-controlling mineral phases in a Yucca Mountain environment. Some discrepancies between model predictions and field observations are observed; these may result from poorly constrained thermodynamic data for uranyl silicate minerals.
Extreme ground motions and Yucca Mountain
Hanks, Thomas C.; Abrahamson, Norman A.; Baker, Jack W.; Boore, David M.; Board, Mark; Brune, James N.; Cornell, C. Allin; Whitney, John W.
2013-01-01
Yucca Mountain is the designated site of the underground repository for the United States' high-level radioactive waste (HLW), consisting of commercial and military spent nuclear fuel, HLW derived from reprocessing of uranium and plutonium, surplus plutonium, and other nuclear-weapons materials. Yucca Mountain straddles the western boundary of the Nevada Test Site, where the United States has tested nuclear devices since the 1950s, and is situated in an arid, remote, and thinly populated region of Nevada, ~100 miles northwest of Las Vegas. Yucca Mountain was originally considered as a potential underground repository of HLW because of its thick units of unsaturated rocks, with the repository horizon being not only ~300 m above the water table but also ~300 m below the Yucca Mountain crest. The fundamental rationale for a geologic (underground) repository for HLW is to securely isolate these materials from the environment and its inhabitants to the greatest extent possible and for very long periods of time. Given the present climate conditions and what is known about the current hydrologic system and conditions around and in the mountain itself, one would anticipate that the rates of infiltration, corrosion, and transport would be very low—except for the possibility that repository integrity might be compromised by low-probability disruptive events, which include earthquakes, strong ground motion, and (or) a repository-piercing volcanic intrusion/eruption. Extreme ground motions (ExGM), as we use the phrase in this report, refer to the extremely large amplitudes of earthquake ground motion that arise at extremely low probabilities of exceedance (hazard). They first came to our attention when the 1998 probabilistic seismic hazard analysis for Yucca Mountain was extended to a hazard level of 10^-8/yr (a 10^-4/yr probability for a 10^4-year repository “lifetime”).
The primary purpose of this report is to summarize the principal results of the ExGM research program as they have developed over the past 5 years; what follows will be focused on Yucca Mountain, but not restricted to it.
Breytenbach, Amelia; Lourens, Antoinette; Marsh, Susan
2013-04-26
The history of veterinary science in South Africa can only be appreciated, studied, researched and passed on to coming generations if historical sources are readily available. In most countries, material and sources with historical value are often difficult to locate, dispersed over a large area and not part of the conventional book and journal literature. The Faculty of Veterinary Science of the University of Pretoria and its library has access to a large collection of historical sources. The collection consists of photographs, photographic slides, documents, proceedings, posters, audio-visual material, postcards and other memorabilia. Other institutions in the country are also approached if relevant sources are identified in their collections. The University of Pretoria's institutional repository, UPSpace, was launched in 2006. This provided the Jotello F. Soga Library with the opportunity to fill the repository with relevant digitised collections of diverse heritage and learning resources that can contribute to the long-term preservation and accessibility of historical veterinary sources. These collections are available for use not only by historians and researchers in South Africa but also elsewhere in Africa and the rest of the world. Important historical collections such as the Arnold Theiler collection, the Jotello F. Soga collection and collections of the Onderstepoort Journal of Veterinary Research and the Journal of the South African Veterinary Association are highlighted. The benefits of an open access digital repository, the importance of collaboration across the veterinary community and other prerequisites for the sustainability of a digitisation project and the importance of metadata to enhance accessibility are covered.
10 CFR 960.3-2-2 - Nomination of sites as suitable for characterization.
Code of Federal Regulations, 2013 CFR
2013-01-01
....3-2-2 Section 960.3-2-2 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-2 Nomination of... shall be based on evaluations in accordance with the guidelines of this part, and the bases and relevant...
10 CFR 960.3-2-2 - Nomination of sites as suitable for characterization.
Code of Federal Regulations, 2012 CFR
2012-01-01
....3-2-2 Section 960.3-2-2 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-2 Nomination of... shall be based on evaluations in accordance with the guidelines of this part, and the bases and relevant...
10 CFR 960.3-2-2 - Nomination of sites as suitable for characterization.
Code of Federal Regulations, 2014 CFR
2014-01-01
....3-2-2 Section 960.3-2-2 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-2 Nomination of... shall be based on evaluations in accordance with the guidelines of this part, and the bases and relevant...
10 CFR 960.3-2-2 - Nomination of sites as suitable for characterization.
Code of Federal Regulations, 2011 CFR
2011-01-01
....3-2-2 Section 960.3-2-2 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-2 Nomination of... shall be based on evaluations in accordance with the guidelines of this part, and the bases and relevant...
Insights About Persons: Psychological Foundations of Humanistic and Affective Education.
ERIC Educational Resources Information Center
Patterson, Cecil H.
This chapter reviews selected materials in psychology that are related to the nature of man and his development and that are relevant to a humanistic system of education. Humanistic is used to indicate a concern with the learner as a whole person rather than simply as a disembodied intellect or repository of cognitive processes. In reviewing the…
SNOMED CT module-driven clinical archetype management.
Allones, J L; Taboada, M; Martinez, D; Lozano, R; Sobrido, M J
2013-06-01
To explore semantic search to improve management and user navigation in clinical archetype repositories. In order to support semantic searches across archetypes, an automated method based on SNOMED CT modularization is implemented to transform clinical archetypes into SNOMED CT extracts. Concurrently, query terms are converted into SNOMED CT concepts using the search engine Lucene. Retrieval is then carried out by matching query concepts with the corresponding SNOMED CT segments. A test collection of 16 clinical archetypes, comprising over 250 terms, and a subset of 55 clinical terms from two medical dictionaries, MediLexicon and MedlinePlus, were used to test our method. The keyword-based service supported by the OpenEHR repository offered us a benchmark to evaluate the enhancement of performance. In total, our approach reached 97.4% precision and 69.1% recall, providing a substantial improvement in recall (more than 70%) compared to the benchmark. Exploiting medical domain knowledge from ontologies such as SNOMED CT may overcome some limitations of keyword-based systems and thus improve the search experience of repository users. An automated approach based on ontology segmentation is an efficient and feasible way of supporting modeling, management and user navigation in clinical archetype repositories. Copyright © 2013 Elsevier Inc. All rights reserved.
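As a rough sketch of the retrieval idea described above (not taken from the paper; the archetype names and concept IDs below are invented for illustration), matching query concepts against per-archetype concept extracts and scoring the result with set-based precision and recall might look like:

```python
def retrieve(query_concepts, archetype_extracts):
    """Return archetypes whose concept extract shares a concept with the query."""
    return {name for name, concepts in archetype_extracts.items()
            if query_concepts & concepts}

def precision_recall(retrieved, relevant):
    """Standard set-based precision and recall."""
    tp = len(retrieved & relevant)
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical archetype extracts (concept IDs invented):
extracts = {
    "blood_pressure": {"75367002", "271649006"},
    "heart_rate": {"364075005"},
    "lab_result": {"118246004"},
}
query = {"75367002"}  # concepts derived from the user's query terms
hits = retrieve(query, extracts)
p, r = precision_recall(hits, relevant={"blood_pressure"})
```

Real SNOMED CT extracts would come from the modularization step, and Lucene would map free-text query terms to concepts; the toy sets above stand in for both.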
Implementing DSpace at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Lowe, Greta
2007-01-01
This presentation looks at the implementation of the DSpace institutional repository system at the NASA Langley Technical Library. NASA Langley Technical Library implemented DSpace software as a replacement for the Langley Technical Report Server (LTRS). DSpace was also used to develop the Langley Technical Library Digital Repository (LTLDR). LTLDR contains archival copies of core technical reports in the aeronautics area dating back to the NACA era and other specialized collections relevant to the NASA Langley community. Extensive metadata crosswalks were created to facilitate moving data from various systems and formats to DSpace. The Dublin Core metadata screens were also customized. The OpenURL standard and Ex Libris Metalib are being used in this environment to assist our customers with either discovering full-text content or with initiating a request for the item.
Software aspects of the Geant4 validation repository
NASA Astrophysics Data System (ADS)
Dotti, Andrea; Wenzel, Hans; Elvira, Daniel; Genser, Krzysztof; Yarba, Julia; Carminati, Federico; Folger, Gunter; Konstantinov, Dmitri; Pokorski, Witold; Ribon, Alberto
2017-10-01
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
Software Aspects of the Geant4 Validation Repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dotti, Andrea; Wenzel, Hans; Elvira, Daniel
2016-01-01
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
NASA Astrophysics Data System (ADS)
Huang, Wei-Hsing
2017-04-01
The clay barrier plays a major role in the isolation of radioactive wastes in an underground repository. This paper investigates the resaturation behavior of the clay barrier, with emphasis on the coupled effects of heat and moisture in the buffer material in the near-field of a repository during groundwater intrusion. A locally available clay named "Zhisin clay" and a standard bentonite material were adopted in the laboratory program. Water uptake tests were conducted on clay specimens compacted at various densities to simulate the intrusion of groundwater into the buffer material. Soil suction of the clay specimens was measured by psychrometers embedded in the specimens and by the vapor equilibrium technique conducted at varying temperatures. Using the soil water characteristic curve, an integration scheme was introduced to estimate the hydraulic conductivity of the unsaturated clay. The finite element program ABAQUS was then employed to carry out a numerical simulation of the saturation process in the near field of a repository. Results of the numerical simulation were validated against the degree-of-saturation profiles obtained from the water uptake tests on Zhisin clay. The numerical scheme was then extended to establish a model simulating the resaturation process after the closure of a repository. It was found that, owing to the variation of the suction and thermal conductivity of the clay barrier material with temperature, the calculated temperature field shows a reduction when these hydro-properties are incorporated in the calculations.
NASA Astrophysics Data System (ADS)
Sawada, Masataka; Nishimoto, Soshi; Okada, Tetsuji
2017-01-01
In high-level radioactive waste disposal repositories, there are long-term complex thermal, hydraulic, and mechanical (T-H-M) phenomena that involve the generation of heat from the waste, the infiltration of ground water, and swelling of the bentonite buffer. The ability to model such coupled phenomena is of particular importance to the repository design and assessments of its safety. We have developed a T-H-M-coupled analysis program that evaluates the long-term behavior around the repository (called "near-field"). We have also conducted centrifugal model tests that model the long-term T-H-M-coupled behavior in the near-field. In this study, we conduct H-M-coupled numerical simulations of the centrifugal near-field model tests. We compare numerical results with each other and with results obtained from the centrifugal model tests. From the comparison, we deduce that: (1) in the numerical simulation, water infiltration in the rock mass was in agreement with the experimental observation. (2) The constant-stress boundary condition in the centrifugal model tests may cause a larger expansion of the rock mass than in the in situ condition, but the mechanical boundary condition did not affect the buffer behavior in the deposition hole. (3) The numerical simulation broadly reproduced the measured bentonite pressure and the overpack displacement, but did not reproduce the decreasing trend of the bentonite pressure after 100 equivalent years. This indicates the effect of the time-dependent characteristics of the surrounding rock mass. Further investigations are needed to determine the effect of initial heterogeneity in the deposition hole and the time-dependent behavior of the surrounding rock mass.
NASA Astrophysics Data System (ADS)
Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.
2017-12-01
In the world of data repositories, there seems to be a never-ending struggle between the generation of high-quality data documentation and the ease of archiving a data product in a repository - the higher the documentation standards, the greater the effort required by the scientist, and the less likely the data will be archived. The Environmental Data Initiative (EDI) attempts to balance the rigor of data documentation against the effort required of a scientist to upload and archive data. As an outgrowth of the LTER Network Information System, EDI is funded by the US NSF Division of Environmental Biology to support the LTER, LTREB, OBFS, and MSB programs, in addition to providing an open data archive for environmental scientists without a viable archive. EDI uses the PASTA repository software, developed originally by the LTER. PASTA is metadata driven and documents data with the Ecological Metadata Language (EML), a high-fidelity standard that can describe all types of data in great detail. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML, in a process termed "metadata and data congruence"; incongruent data packages are not admitted to the repository. EDI reduces the burden of data documentation on scientists in two ways: first, EDI provides hands-on assistance in data documentation best practices, with tools in R (and under development in Python) for generating EML. These tools hide the details of EML generation and syntax behind a more natural and contextual setting for describing data. Second, EDI works closely with community information managers in defining the rules used in PASTA quality tests. Rules deemed too strict can be turned off completely or set to issue only a warning while the community learns how best to handle the situation and improve its documentation practices. Rules can also be added or refined over time to improve the overall quality of archived data.
The outcomes of the quality tests are stored as part of the data archive in PASTA and are accessible to all users of the EDI data repository. In summary, EDI's metadata support to scientists and its comprehensive set of quality tests for metadata and data congruence provide an ideal archive for environmental and ecological data.
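A minimal illustration of the "metadata and data congruence" idea, assuming a simplified world in which the metadata reduces to an ordered attribute list and the data table is a CSV fragment (the attribute names are hypothetical, and real PASTA checks are far more extensive than this):

```python
import csv
import io

def congruent(eml_attributes, csv_text):
    """Check that a data table's header row matches the attribute names
    declared in its (EML-style) metadata; return a list of problems."""
    header = next(csv.reader(io.StringIO(csv_text)))
    declared = list(eml_attributes)
    problems = []
    if header != declared:
        problems.append(f"column name/order mismatch: {header} vs {declared}")
    return problems

# Hypothetical attribute list and data fragments:
attrs = ["site", "date", "temp_c"]
ok = congruent(attrs, "site,date,temp_c\nA,2020-01-01,3.2\n")
bad = congruent(attrs, "site,temp_c,date\nA,3.2,2020-01-01\n")
```

A repository following the PASTA approach would refuse the second package (a non-empty problem list) rather than archive incongruent data.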
Information Analysis Centers in the Department of Defense. Revision
1987-07-01
Combat Data Information Center (CDIC) and the Aircraft Survivability Model Repository (ASMR) into the Survivability/Vulnerability Information Analysis...Information Center (CDIC) and the Aircraft Survivability Model Repository (ASMR). The CDIC was a central repository for combat and test data related to...and ASMR were operated under the technical monitorship of the Flight Dynamics Laboratory at Wright-Patterson AFB, Ohio and were located in Flight
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaller, A.; Skanata, D.
1995-12-31
The paper presents the site selection approach for a radioactive waste disposal facility that is under way in Croatia. This approach is based on the application of relevant terrestrial and technical criteria in the site selection process. The basic documentation used for this purpose consists of regional planning documents prepared by the Regional Planning Institute of Croatia. The basic result of the research described in the paper is a proposal of several potential areas suitable for siting a radioactive waste repository. All relevant conclusions are based on both data groups: generic data and field-measured data. Out of a dozen potential areas, four have been chosen as representative by the authors. The presented comparative analysis was made by means of the VISA II computer code, developed by V. Belton and SPV Software Products. The code was donated to the APO by the IAEA. The main objective of the paper is to initiate and facilitate further discussion on possible ways of evaluating and comparing potential areas for siting a radioactive waste repository in this country, as well as to provide additional contributions to the current site selection process in the Republic of Croatia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Francis D.; Leigh, Christi; Stein, Walter
The 5th US/German Workshop on Salt Repository Research, Design, and Operation was held in Santa Fe, New Mexico, September 8-10, 2014. The forty-seven registered participants were equally divided between the United States (US) and Germany, with one participant from The Netherlands. The agenda for the 2014 workshop was under development immediately upon finishing the 4th Workshop. Ongoing, fundamental topics such as the thermomechanical behavior of salt, plugging and sealing, the safety case, and performance assessment continue to advance the basis for disposal of heat-generating nuclear waste in salt formations. The utility of a salt underground research laboratory (URL) remains an intriguing concept engendering discussion of testing protocol. By far the most interest in this year's workshop pertained to operational safety. Given events at the Waste Isolation Pilot Plant (WIPP), this discussion took on a new sense of relevance and urgency.
Testing of candidate waste-package backfill and canister materials for basalt
NASA Astrophysics Data System (ADS)
Wood, M. I.; Anderson, W. J.; Aden, G. D.
1982-09-01
The Basalt Waste Isolation Project (BWIP) is developing a multiple-barrier waste package to contain high-level nuclear waste as part of an overall system (e.g., waste package, repository sealing system, and host rock) designed to isolate the waste in a repository located in basalt beneath the Hanford Site, Richland, Washington. The three basic components of the waste package are the waste form, the canister, and the backfill. An extensive testing program is under way to determine the chemical, physical, and mechanical properties of potential canister and backfill materials. The data derived from this testing program will be used to recommend those materials that most adequately perform the functions assigned to the canister and backfill.
Rolling Deck to Repository I: Designing a Database Infrastructure
NASA Astrophysics Data System (ADS)
Arko, R. A.; Miller, S. P.; Chandler, C. L.; Ferrini, V. L.; O'Hara, S. H.
2008-12-01
The NSF-supported academic research fleet collectively produces a large and diverse volume of scientific data, which are increasingly being shared across disciplines and contributed to regional and global syntheses. As both Internet connectivity and storage technology improve, it becomes practical for ships to routinely deliver data and documentation for a standard suite of underway instruments to a central shoreside repository. Routine delivery will facilitate data discovery and integration, quality assessment, cruise planning, compliance with funding agency and clearance requirements, and long-term data preservation. We are working collaboratively with ship operators and data managers to develop a prototype "data discovery system" for NSF-supported research vessels. Our goal is to establish infrastructure for a central shoreside repository, and to develop and test procedures for the routine delivery of standard data products and documentation to the repository. Related efforts are underway to identify tools and criteria for quality control of standard data products, and to develop standard interfaces and procedures for maintaining an underway event log. Development of a shoreside repository infrastructure will include: 1. Deployment and testing of a central catalog that holds cruise summaries and vessel profiles. A cruise summary will capture the essential details of a research expedition (operating institution, ports/dates, personnel, data inventory, etc.), as well as related documentation such as event logs and technical reports. A vessel profile will capture the essential details of a ship's installed instruments (manufacturer, model, serial number, reference location, etc.), with version control as the profile changes through time. The catalog's relational database schema will be based on the UNOLS Data Best Practices Committee's recommendations, and published as a formal XML specification.
2. Deployment and testing of a central repository that holds navigation and routine underway data. Based on discussion with ship operators and data managers at a workgroup meeting in September 2008, we anticipate that a subset of underway data could be delivered from ships to the central repository in near-realtime - enabling the integrated display of ship tracks at a public Web portal, for example - and a full data package could be delivered post-cruise by network transfer or disk shipment. Once ashore, data sets could be distributed to assembly centers such as the Shipboard Automated Meteorological and Oceanographic System (SAMOS) for routine processing, quality assessment, and synthesis efforts - as well as transmitted to national data centers such as NODC and NGDC for permanent archival. 3. Deployment and testing of a basic suite of Web services to make cruise summaries, vessel profiles, event logs, and navigation data easily available. A standard set of catalog records, maps, and navigation features will be published via the Open Archives Initiative (OAI) and Open Geospatial Consortium (OGC) protocols, which can then be harvested by partner data centers and/or embedded in client applications.
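The cruise-summary and vessel-profile records described in item 1 could be sketched as plain data classes; the field names and example values below are illustrative guesses, not the actual UNOLS schema:

```python
from dataclasses import dataclass, field

@dataclass
class VesselProfile:
    """Essential details of one installed instrument on a ship,
    versioned as the installation changes through time."""
    vessel: str
    instrument: str
    manufacturer: str
    model: str
    serial_number: str
    version: int = 1

@dataclass
class CruiseSummary:
    """Essential details of a research expedition."""
    cruise_id: str
    operator: str
    port_start: str
    port_end: str
    chief_scientist: str
    data_inventory: list = field(default_factory=list)

# Hypothetical records of the kind a shoreside catalog might hold:
profile = VesselProfile("R/V Example", "multibeam", "Acme", "EM-1", "SN-42")
summary = CruiseSummary("EX2008-01", "Example Inst.", "Seattle", "Honolulu",
                        "J. Doe", data_inventory=["navigation", "multibeam"])
```

In the actual system these would be rows in the catalog's relational schema, published as a formal XML specification rather than in-memory objects.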
Improving Scientific Metadata Interoperability And Data Discoverability using OAI-PMH
NASA Astrophysics Data System (ADS)
Devarakonda, Ranjeet; Palanisamy, Giri; Green, James M.; Wilson, Bruce E.
2010-12-01
While general-purpose search engines (such as Google or Bing) are useful for finding many things on the Internet, they are often of limited usefulness for locating Earth Science data relevant (for example) to a specific spatiotemporal extent. By contrast, tools that search repositories of structured metadata can locate relevant datasets with fairly high precision, but the search is limited to that particular repository. Federated searches (such as Z39.50) have been used, but can be slow and the comprehensiveness can be limited by downtime in any search partner. An alternative approach to improve comprehensiveness is for a repository to harvest metadata from other repositories, possibly with limits based on subject matter or access permissions. Searches through harvested metadata can be extremely responsive, and the search tool can be customized with semantic augmentation appropriate to the community of practice being served. However, there are a number of different protocols for harvesting metadata, with some challenges for ensuring that updates are propagated and for collaborations with repositories using differing metadata standards. The Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) is a standard that is seeing increased use as a means for exchanging structured metadata. OAI-PMH implementations must support Dublin Core as a metadata standard, with other metadata formats as optional. We have developed tools which enable our structured search tool (Mercury; http://mercury.ornl.gov) to consume metadata from OAI-PMH services in any of the metadata formats we support (Dublin Core, Darwin Core, FGDC CSDGM, GCMD DIF, EML, and ISO 19115/19137). We are also making ORNL DAAC metadata available through OAI-PMH for other metadata tools to utilize, such as the NASA Global Change Master Directory (GCMD).
This paper describes Mercury capabilities with multiple metadata formats, in general, and, more specifically, the results of our OAI-PMH implementations and the lessons learned. References: [1] R. Devarakonda, G. Palanisamy, B.E. Wilson, and J.M. Green, "Mercury: reusable metadata management data discovery and access system", Earth Science Informatics, vol. 3, no. 1, pp. 87-94, May 2010. [2] R. Devarakonda, G. Palanisamy, J.M. Green, B.E. Wilson, "Data sharing and retrieval using OAI-PMH", Earth Science Informatics DOI: 10.1007/s12145-010-0073-0, (2010). [3] Devarakonda, R.; Palanisamy, G.; Green, J.; Wilson, B. E. "Mercury: An Example of Effective Software Reuse for Metadata Management Data Discovery and Access", Eos Trans. AGU, 89(53), Fall Meet. Suppl., IN11A-1019 (2008).
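As a hedged sketch of the harvesting side of OAI-PMH (the endpoint URL below is hypothetical; the `ListRecords` verb and `metadataPrefix` parameter are defined by the OAI-PMH standard), one might build a request URL and pull Dublin Core titles out of a response:

```python
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

OAI_NS = "http://www.openarchives.org/OAI/2.0/"
DC_NS = "http://purl.org/dc/elements/1.1/"

def list_records_url(base_url, metadata_prefix="oai_dc", set_spec=None):
    """Build an OAI-PMH ListRecords request URL."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec
    return f"{base_url}?{urlencode(params)}"

def dc_titles(response_xml):
    """Extract Dublin Core titles from a ListRecords response."""
    root = ET.fromstring(response_xml)
    return [t.text for t in root.iter(f"{{{DC_NS}}}title")]

# Hypothetical endpoint; a harvester would fetch this URL over HTTP.
url = list_records_url("https://repository.example.org/oai")

# A minimal (truncated) ListRecords response, parsed locally:
sample = f"""<OAI-PMH xmlns="{OAI_NS}">
  <ListRecords><record><metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="{DC_NS}">
      <dc:title>Example dataset</dc:title>
    </oai_dc:dc>
  </metadata></record></ListRecords>
</OAI-PMH>"""
titles = dc_titles(sample)
```

A real harvester would also handle resumption tokens and incremental (`from`/`until`) harvesting, which this sketch omits.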
A recommendation module to help teachers build courses through the Moodle Learning Management System
NASA Astrophysics Data System (ADS)
Limongelli, Carla; Lombardi, Matteo; Marani, Alessandro; Sciarrone, Filippo; Temperini, Marco
2016-01-01
In traditional e-learning, teachers design sets of Learning Objects (LOs) and organize their sequencing; the material implementing the LOs could be either built anew or adopted from elsewhere (e.g. from standard-compliant repositories) and reused. This task is applicable also when the teacher works in a system for personalized e-learning. In this case, the burden actually increases: for instance, the LOs may need adaptation to the system, through additional metadata. This paper presents a module that gives some support to the operations of retrieving, analyzing, and importing LOs from a set of standard Learning Objects Repositories, acting as a recommending system. In particular, it is designed to support the teacher in the phases of (i) retrieval of LOs, through a keyword-based search mechanism applied to the selected repositories; (ii) analysis of the returned LOs, whose information is enriched by a concept of relevance metric, based on both the results of the searching operation and the data related to the previous use of the LOs in the courses managed by the Learning Management System; and (iii) LO importation into the course under construction.
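The relevance metric described in phase (ii) could, under one simple reading, blend the keyword-search score with the LO's prior reuse in courses on the LMS; the weighting, the LO names, and the numbers below are invented for illustration, not the paper's actual formula:

```python
def relevance(search_score, reuse, alpha=0.7):
    """Blend a keyword-search score with the LO's normalised prior
    reuse in LMS courses (both assumed to lie in [0, 1])."""
    return alpha * search_score + (1 - alpha) * reuse

# Hypothetical candidate LOs: (search score, normalised reuse count)
candidates = {"intro_video": (0.9, 0.2), "quiz_pack": (0.6, 0.8)}
ranked = sorted(candidates, key=lambda lo: relevance(*candidates[lo]),
                reverse=True)
```

With this weighting, a strong keyword match ("intro_video", 0.69) outranks a heavily reused but weaker match ("quiz_pack", 0.66); lowering `alpha` would favour reuse history instead.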
Active Exploration of Large 3D Model Repositories.
Gao, Lin; Cao, Yan-Pei; Lai, Yu-Kun; Huang, Hao-Zhi; Kobbelt, Leif; Hu, Shi-Min
2015-12-01
With broader availability of large-scale 3D model repositories, the need for efficient and effective exploration becomes more and more urgent. Existing model retrieval techniques do not scale well with the size of the database since often a large number of very similar objects are returned for a query, and the possibilities to refine the search are quite limited. We propose an interactive approach where the user feeds an active learning procedure by labeling either entire models or parts of them as "like" or "dislike" such that the system can automatically update an active set of recommended models. To provide an intuitive user interface, candidate models are presented based on their estimated relevance for the current query. From the methodological point of view, our main contribution is to exploit not only the similarity between a query and the database models but also the similarities among the database models themselves. We achieve this by an offline pre-processing stage, where global and local shape descriptors are computed for each model and a sparse distance metric is derived that can be evaluated efficiently even for very large databases. We demonstrate the effectiveness of our method by interactively exploring a repository containing over 100 K models.
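A toy version of the like/dislike update step (the similarity values and model names are made up, and the paper's sparse metric and active-learning procedure are far richer than this) might score each model by its summed similarity to liked models minus its similarity to disliked ones:

```python
def update_relevance(sim, liked, disliked):
    """Score each model using a precomputed sparse similarity metric
    (dict of dicts; missing pairs count as 0): similarity to 'like'
    labels counts positively, similarity to 'dislike' labels negatively."""
    scores = {}
    for m in sim:
        pos = sum(sim[m].get(l, 0.0) for l in liked)
        neg = sum(sim[m].get(d, 0.0) for d in disliked)
        scores[m] = pos - neg
    return scores

# Toy sparse similarity matrix over four models (values invented):
sim = {
    "chair_a": {"chair_b": 0.9, "table_a": 0.2},
    "chair_b": {"chair_a": 0.9, "table_a": 0.1},
    "table_a": {"chair_a": 0.2, "chair_b": 0.1, "table_b": 0.8},
    "table_b": {"table_a": 0.8},
}
scores = update_relevance(sim, liked={"chair_a"}, disliked={"table_b"})
best = max((m for m in scores if m != "chair_a"), key=scores.get)
```

The point of the sparse metric in the paper is that such sums stay cheap even for repositories of 100K+ models, since most pairwise entries are absent.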
Development of anomaly detection models for deep subsurface monitoring
NASA Astrophysics Data System (ADS)
Sun, A. Y.
2017-12-01
Deep subsurface repositories are used for waste disposal and carbon sequestration. Monitoring deep subsurface repositories for potential anomalies is challenging, not only because the number of sensor networks and the quality of data are often limited, but also because of the lack of labeled data needed to train and validate machine learning (ML) algorithms. Although physical simulation models may be applied to predict anomalies (or the system's nominal state, for that matter), the accuracy of such predictions may be limited by inherent conceptual and parameter uncertainties. The main objective of this study was to demonstrate the potential of data-driven models for leakage detection in carbon sequestration repositories. Monitoring data collected during an artificial CO2 release test at a carbon sequestration repository were used, which include both scalar time series (pressure) and vector time series (distributed temperature sensing). For each type of data, separate online anomaly detection algorithms were developed using the baseline experiment data (no leak) and then tested on the leak experiment data. Performance of a number of different online algorithms was compared. Results show the importance of including contextual information in the dataset to mitigate the impact of reservoir noise and reduce the false positive rate. The developed algorithms were integrated into a generic Web-based platform for real-time anomaly detection.
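As one hedged example of an online detector of the kind compared in such studies (not the authors' actual algorithms), a rolling z-score test can flag a pressure reading that departs sharply from the recent baseline:

```python
from collections import deque
import math

class OnlineZScoreDetector:
    """Flag a reading as anomalous when it deviates from the recent
    baseline by more than `threshold` standard deviations."""
    def __init__(self, window=50, threshold=4.0):
        self.buf = deque(maxlen=window)
        self.threshold = threshold

    def update(self, x):
        anomalous = False
        if len(self.buf) >= 10:  # wait for a minimal baseline
            mean = sum(self.buf) / len(self.buf)
            var = sum((v - mean) ** 2 for v in self.buf) / len(self.buf)
            std = math.sqrt(var) or 1e-9
            anomalous = abs(x - mean) / std > self.threshold
        self.buf.append(x)
        return anomalous

det = OnlineZScoreDetector()
baseline = [10.0 + 0.01 * (i % 5) for i in range(50)]  # quiet reservoir noise
flags = [det.update(v) for v in baseline]              # baseline: no alarms
leak_flag = det.update(12.0)                           # abrupt pressure change
```

Such a detector ignores contextual variables; the study's finding that contextual information is needed to suppress reservoir-noise false positives is exactly what this simple scheme lacks.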
iAnn: an event sharing platform for the life sciences.
Jimenez, Rafael C; Albar, Juan P; Bhak, Jong; Blatter, Marie-Claude; Blicher, Thomas; Brazas, Michelle D; Brooksbank, Cath; Budd, Aidan; De Las Rivas, Javier; Dreyer, Jacqueline; van Driel, Marc A; Dunn, Michael J; Fernandes, Pedro L; van Gelder, Celia W G; Hermjakob, Henning; Ioannidis, Vassilios; Judge, David P; Kahlem, Pascal; Korpelainen, Eija; Kraus, Hans-Joachim; Loveland, Jane; Mayer, Christine; McDowall, Jennifer; Moran, Federico; Mulder, Nicola; Nyronen, Tommi; Rother, Kristian; Salazar, Gustavo A; Schneider, Reinhard; Via, Allegra; Villaveces, Jose M; Yu, Ping; Schneider, Maria V; Attwood, Teresa K; Corpas, Manuel
2013-08-01
We present iAnn, an open source community-driven platform for dissemination of life science events, such as courses, conferences and workshops. iAnn allows automatic visualisation and integration of customised event reports. A central repository lies at the core of the platform: curators add submitted events, and these are subsequently accessed via web services. Thus, once an iAnn widget is incorporated into a website, it permanently shows timely relevant information as if it were native to the remote site. At the same time, announcements submitted to the repository are automatically disseminated to all portals that query the system. To facilitate the visualization of announcements, iAnn provides powerful filtering options and views, integrated in Google Maps and Google Calendar. All iAnn widgets are freely available. http://iann.pro/iannviewer manuel.corpas@tgac.ac.uk.
An infrastructure for ontology-based information systems in biomedicine: RICORDO case study.
Wimalaratne, Sarala M; Grenon, Pierre; Hoehndorf, Robert; Gkoutos, Georgios V; de Bono, Bernard
2012-02-01
The article presents an infrastructure for supporting the semantic interoperability of biomedical resources based on the management (storing and inference-based querying) of their ontology-based annotations. This infrastructure consists of: (i) a repository to store and query ontology-based annotations; (ii) a knowledge base server with an inference engine to support the storage of, and reasoning over, ontologies used in the annotation of resources; (iii) a set of applications and services allowing interaction with the integrated repository and knowledge base. The infrastructure is being prototyped, developed, and evaluated by the RICORDO project in support of the knowledge management of biomedical resources, including physiology and pharmacology models and associated clinical data. The RICORDO toolkit and its source code are freely available from http://ricordo.eu/relevant-resources. sarala@ebi.ac.uk.
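Inference-based querying of ontology annotations can be illustrated with a toy in-memory triple store; the predicate names and the single subclass-closure rule below are simplifications assumed for illustration, not RICORDO's actual API:

```python
def query_annotations(triples, ontology_term):
    """Return resources annotated with a term or any of its subclasses.

    'triples' is a list of (subject, predicate, object) tuples, a toy
    stand-in for a knowledge base plus inference engine. Only one rule
    is applied here: the transitive closure of 'subClassOf'.
    """
    terms = {ontology_term}
    changed = True
    while changed:  # expand until no new subclasses are found
        changed = False
        for s, p, o in triples:
            if p == "subClassOf" and o in terms and s not in terms:
                terms.add(s)
                changed = True
    return sorted(s for s, p, o in triples
                  if p == "annotatedWith" and o in terms)
```

A query for a broad anatomical term thus also retrieves resources annotated with its more specific subclasses, which is the benefit reasoning adds over plain keyword lookup.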
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tharrington, Arnold N.
2015-09-09
The NCCS Regression Test Harness is a software package that provides a framework for performing regression and acceptance testing on NCCS High Performance Computers. The package is written in Python; its only dependency is a Subversion repository, which stores the regression tests.
Enabling Open Research Data Discovery through a Recommender System
NASA Astrophysics Data System (ADS)
Devaraju, Anusuriya; Jayasinghe, Gaya; Klump, Jens; Hogan, Dominic
2017-04-01
Government agencies, universities, research and nonprofit organizations are increasingly publishing their datasets to promote transparency, stimulate new research and generate economic value through the development of new products or services. The datasets may be downloaded from various data portals (data repositories), which are either general or domain-specific. The Registry of Research Data Repositories (re3data.org) lists more than 2500 such data repositories from around the globe. Data portals allow keyword search and faceted navigation to facilitate discovery of research datasets. However, the volume and variety of datasets have made finding relevant datasets more difficult. Common dataset search mechanisms may be time-consuming, may produce irrelevant results, and are primarily suitable for users who are familiar with the general structure and contents of the respective database. Therefore, new approaches are needed to support research data discovery. Recommender systems offer new possibilities for users to find datasets relevant to their research interests. This study presents a recommender system developed for the CSIRO Data Access Portal (DAP, http://data.csiro.au). The datasets hosted on the portal are diverse, published by researchers from 13 business units in the organisation. The goal of the study is not to replace the current search mechanisms on the data portal, but rather to extend data discovery through exploratory search, in this case by building a recommender system. We adopted a hybrid recommendation approach comprising content-based filtering and item-item collaborative filtering. The content-based filtering computes similarities between datasets based on metadata such as title, keywords, descriptions, fields of research, location and contributors. The collaborative filtering uses user search behaviour and download patterns derived from the server logs to determine similar datasets.
The similarities above are then combined with different degrees of importance (weights) to determine overall dataset similarity. We determined the similarity weights based on a survey of 150 users of the portal. The recommender results for a given dataset are accessible programmatically via a RESTful web service. An offline evaluation involving data users demonstrates the ability of the recommender system to surface relevant and 'novel' datasets.
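The hybrid scoring described above can be sketched as follows; the bag-of-words cosine similarity and the default weights are illustrative stand-ins for the richer metadata similarity and survey-derived weights used in the study:

```python
import math
from collections import Counter

def content_similarity(meta_a, meta_b):
    """Cosine similarity over bag-of-words metadata (e.g. title + keywords)."""
    a, b = Counter(meta_a.split()), Counter(meta_b.split())
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_score(content_sim, collab_sim, w_content=0.6, w_collab=0.4):
    """Weighted combination of content-based and collaborative similarity.

    In the study the weights came from a user survey; the defaults here
    are arbitrary placeholders.
    """
    return w_content * content_sim + w_collab * collab_sim
```

Ranking all other datasets by `hybrid_score` against a seed dataset yields the recommendation list that the RESTful service would return.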
NASA Astrophysics Data System (ADS)
Lugmayr, Artur R.; Mailaparampil, Anurag; Tico, Florina; Kalli, Seppo; Creutzburg, Reiner
2003-01-01
Digital television (digiTV) is an additional multimedia environment in which metadata is one key element for the description of arbitrary content. This implies adequate structures for content description, which are provided by XML metadata schemes (e.g. MPEG-7, MPEG-21). Content and metadata management is the task of a multimedia repository, from which digiTV clients - equipped with an Internet connection - can access rich additional multimedia types over an "All-HTTP" protocol layer. Within this research work, we focus on conceptual design issues of a metadata repository for the storage of metadata, accessible from the feedback channel of a local set-top box. Our concept describes the whole heterogeneous life-cycle chain of XML metadata from the service provider to the digiTV equipment, device-independent representation of content, accessing and querying the metadata repository, management of metadata related to digiTV, and the interconnection of basic system components (HTTP front-end, relational database system, and servlet container). We present our conceptual test configuration of a metadata repository aimed at real-world deployment, carried out within the scope of the future interaction (fiTV) project at the Digital Media Institute (DMI) Tampere (www.futureinteraction.tv).
NASA Astrophysics Data System (ADS)
Biggin, C.; Ota, K.; Siittari-Kauppi, M.; Moeri, A.
2004-12-01
In the context of a repository for radioactive waste, 'matrix diffusion' describes the process by which solute, flowing in distinct flow paths, penetrates the surrounding rock matrix. Diffusion into the matrix occurs in a connected system of pores or microfractures. Matrix diffusion provides a mechanism for greatly enlarging the area of rock surface in contact with advecting radionuclides, from that of the flow path surfaces (and infills) to a much larger portion of the bulk rock, and increases the global pore volume that can retard radionuclides. In terms of a repository safety assessment, demonstration of a significant depth of diffusion-accessible pore space may result in a significant delay in the calculated release of any escaping radionuclides to the environment and a dramatic reduction in the resulting concentration released into the biosphere. For the last decade, Nagra has investigated in situ matrix diffusion at the Grimsel Test Site (GTS) in the Swiss Alps. The in situ investigations offer two distinct advantages over those performed in the lab, namely: 1. lab-based determination of porosity and diffusivity can lead to an overestimation of matrix diffusion due to stress relief when the rock is sampled (which would overestimate the retardation in the geosphere); 2. lab-based analysis usually examines small (cm-scale) samples and therefore cannot account for matrix heterogeneity over the hundreds or thousands of metres of a typical flow path. The in situ investigations described here began with the Connected Porosity project, wherein a specially developed acrylic resin was injected into the rock matrix to fill the pore space and determine the depth of connected porosity. The resin was polymerised in situ and the entire rock mass removed by overcoring. The results indicated that lab-based porosity measurements may be two to three times higher than those obtained in situ.
While the depth of accessible matrix from a water-conducting feature assumed in repository performance assessments is generally 1 to 10 cm, the results from the GTS in situ experiment suggested depths of several metres could be more appropriate. More recently, the Pore Space Geometry (PSG) experiment at the GTS has used a C-14 doped acrylic resin, combined with state-of-the-art digital beta autoradiography and fluorescence detection to examine a larger area of rock for determination of porosity and the degree of connected pore space. Analysis is currently ongoing and the key findings will be reported in this paper. Starting at the GTS in 2005, the Long-term Diffusion (LTD) project will investigate such processes over spatial and temporal scales more relevant to a repository than traditional lab-based experiments. In the framework of this experiment, long-term (10 to 50 years) in situ diffusion experiments and resin injection experiments are planned to verify current models for matrix diffusion as a radionuclide retardation process. This paper will discuss the findings of the first two experiments and their significance to repository safety assessments before discussing the strategy for the future in relation to the LTD project.
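As a rough consistency check on such depths, a characteristic diffusion penetration length can be estimated as x ≈ sqrt(4·D_a·t); the apparent diffusivity used below is an illustrative value, not a Grimsel measurement:

```python
import math

def penetration_depth_m(D_a_m2_s, t_years):
    """Characteristic diffusion penetration depth, x ~ sqrt(4 * D_a * t).

    D_a is the apparent (porosity- and sorption-corrected) diffusivity
    in m^2/s; the value passed in is purely illustrative.
    """
    t_s = t_years * 365.25 * 24 * 3600.0
    return math.sqrt(4.0 * D_a_m2_s * t_s)
```

With D_a = 1e-12 m²/s over 10,000 years this gives roughly a metre of penetration, so somewhat higher apparent diffusivities or longer times would be needed to reach the multi-metre depths suggested by the in situ results.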
Looking for Skeletons in the Data Centre `Cupboard': How Repository Certification Can Help
NASA Astrophysics Data System (ADS)
Sorvari, S.; Glaves, H.
2017-12-01
There has been a national geoscience repository at the British Geological Survey (or one of its previous incarnations) almost since its inception in 1835. This longevity has resulted in vast amounts of analogue material and, more recently, digital data, some of which has been collected by our scientists, but much more of which has been acquired either through various legislative obligations or donated from various sources. However, the role and operation of the UK National Geoscience Data Centre (NGDC) in the 21st century is very different to that of the past, with new systems and procedures dealing with predominantly digital data. A web-based ingestion portal allows users to submit their data directly to the NGDC, while online services provide discovery and access to data and derived products. Increasingly we are also required to implement an array of standards (e.g. ISO, OGC, W3C), best practices (e.g. FAIR) and legislation (e.g. the EU INSPIRE Directive), whilst at the same time needing to justify our very existence to our funding agency and hosting organisation. External pressures to demonstrate that we can be recognised as a trusted repository by researchers, funding agencies, publishers and other related entities have forced us to look at how we function, and to benchmark our operations against those of other organisations and current relevant standards, such as those laid down by the various repository certification processes. Following an assessment of the options, the WDS/DSA certification process was selected as the most appropriate route for accreditation of the NGDC as a trustworthy repository. It provided a suitable framework for reviewing current systems, procedures and best practices. Undertaking this process allowed us to identify where the NGDC already has robust systems in place and where there were gaps and deficiencies in current practices.
The WDS/DSA assessment process also helped to reinforce best practice throughout the NGDC and demonstrated that many of the recognised and required procedures and standards for recognition as a trusted repository were already in place, even if they were not always followed!
NASA Astrophysics Data System (ADS)
Iafrate, G.; Ramella, M.; Boch, T.; Bonnarel, F.; Chèreau, F.; Fernique, P.; Osuna, P.
2009-04-01
We present preliminary simple interfaces developed to enable students, teachers, amateur astronomers and the general public to access and use the wealth of astronomical data available in ground-based and space archives through the European Virtual Observatory (EuroVO). The development of these outreach interfaces is the aim of a work package of EuroVO-AIDA (Astronomical Infrastructure for Data Access), a project supported by the EU in the framework of the FP7 Infrastructure Scientific Research Repositories initiative (project RI2121104). The aim of AIDA is to create an operating infrastructure enabling and stimulating new scientific usage of astronomy digital repositories. Euro-VO AIDA is a collaboration between six European countries (PI Francoise Genova, CDS). The professional tools we adapt to the requirements of outreach activities are Aladin (CDS), Stellarium/VirGO (ESO) and VOSpec (ESA VO). Some initial requirements were set a priori in order to produce a first version of the simplified interfaces, but the plan is to test these initial simplified versions with a sample of target users and take their feedback into account in developing the final outreach interface. The core of the test program consists of use cases we designed and complemented with multilingual documentation covering both the astrophysical context and the use of the software. In the special case of students aged 14-18 and their teachers, we take our use cases to schools, running the tests in classrooms and supporting students working on PCs connected to the internet. At the current stage of the project, we are collecting user feedback. Relevant links: Euro-VO AIDA Overview http://www.euro-vo.org/pub/aida/overview.html Euro-VO AIDA WP5 http://cds.u-strasbg.fr/twikiAIDA/bin/view/EuroVOAIDA/WP5WorkProgramme
Statistical sensitivity analysis of a simple nuclear waste repository model
NASA Astrophysics Data System (ADS)
Ronen, Y.; Lucius, J. L.; Blow, E. M.
1980-06-01
This work is a preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria, and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening-design methodology is developed for later use in the overall program. The theory was applied to the test case of determining the sensitivity of the near-field temperature distribution in a single-level salt repository to modeling parameters. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct recalculation. The sensitivity coefficients found to be important for the sample problem were the thermal loading, the distance between spent fuel canisters, and the canister radius. Other important parameters were those related to salt properties at the point of interest in the repository.
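A simple (non-statistical) way to obtain such sensitivity coefficients is one-at-a-time finite differencing of a model response; this is a sketch of the general idea, not the statistical screening design developed in the paper:

```python
def sensitivity_coefficients(model, params, delta=0.01):
    """Normalized one-at-a-time sensitivities S_i = (p_i / y) * dy/dp_i.

    Estimated by a forward finite difference: each parameter is perturbed
    by a relative amount 'delta' while the others are held fixed.
    'model' maps a dict of parameters to a scalar response.
    """
    y0 = model(params)
    coeffs = {}
    for name, p in params.items():
        perturbed = dict(params)
        perturbed[name] = p * (1 + delta)
        dy = model(perturbed) - y0
        coeffs[name] = (p / y0) * dy / (p * delta)
    return coeffs
```

For a response like peak near-field temperature, the largest |S_i| identify the parameters (thermal loading, canister spacing, canister radius) that dominate the design criterion.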
Bridging the Gap between Social Acceptance and Ethical Acceptability.
Taebi, Behnam
2017-10-01
New technology brings great benefits, but it can also create new and significant risks. When evaluating those risks in policymaking, there is a tendency to focus on social acceptance. By solely focusing on social acceptance, we could, however, overlook important ethical aspects of technological risk, particularly when we evaluate technologies with transnational and intergenerational risks. I argue that good governance of risky technology requires analyzing both social acceptance and ethical acceptability. Conceptually, these two notions are mostly complementary. Social acceptance studies are not capable of sufficiently capturing all the morally relevant features of risky technologies; ethical analyses do not typically include stakeholders' opinions, and they therefore lack the relevant empirical input for a thorough ethical evaluation. Only when carried out in conjunction are these two types of analysis relevant to national and international governance of risky technology. I discuss the Rawlsian wide reflective equilibrium as a method for marrying social acceptance and ethical acceptability. Although the rationale of my argument is broadly applicable, I will examine the case of multinational nuclear waste repositories in particular. This example will show how ethical issues may be overlooked if we focus only on social acceptance, and will provide a test case for demonstrating how the wide reflective equilibrium can help to bridge the proverbial acceptance-acceptability gap. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit
The modeling efforts in support of the field test planning conducted at LBNL leverage recent developments of tools for modeling coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. These modeling capabilities will be suitable for assisting in the design of the field experiment, especially for multiphase flow processes coupled with mechanical deformation at high temperature. In this report, we first examine previous generic repository modeling results, focusing on the first 20 years, to investigate the expected evolution of the different processes that could be monitored in a full-scale heater experiment. We then present new results from ongoing modeling of the Thermal Simulation for Drift Emplacement (TSDE) experiment, a heater experiment on the in-drift emplacement concept at the Asse Mine, Germany, and provide an update on the ongoing model developments for modeling brine migration. LBNL also supported field test planning activities via contributions to and technical review of framework documents and test plans, as well as participation in workshops associated with field test planning.
Enthalpies of formation of polyhalite: A mineral relevant to salt repository
Guo, Xiaofeng; Xu, Hongwu
2017-06-02
Polyhalite is an important mineral coexisting with halite in salt repositories for nuclear waste disposal, such as the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. The thermal stability of this mineral is key to evaluating the long-term integrity of a salt repository, as water may be released by the thermal decomposition of polyhalite. Previous studies on the structural evolution of polyhalite at elevated temperatures laid the basis for detailed calorimetric measurements. Using high-temperature oxide-melt drop-solution calorimetry at 975 K with sodium molybdate as the solvent, we have determined the standard enthalpies of formation from constituent sulfates (ΔH°f,sul), oxides (ΔH°f,ox) and elements (ΔH°f,ele) of a polyhalite sample with the composition K2Ca2Mg(SO4)4·1.95H2O from the Salado formation at the WIPP site. The results are: ΔH°f,sul = -152.5 ± 5.3 kJ/mol, ΔH°f,ox = -1926.1 ± 10.5 kJ/mol, and ΔH°f,ele = -6301.2 ± 9.9 kJ/mol. Furthermore, based on estimated formation entropies of polyhalite, its standard Gibbs free energy of formation has been derived to be in the range of -5715.3 ± 9.9 kJ/mol to -5739.3 ± 9.9 kJ/mol. These determined thermodynamic properties provide fundamental parameters for modeling the stability of polyhalite in salt repositories.
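The Gibbs energy derivation follows ΔG°f = ΔH°f − T·ΔS°f at 298.15 K; as a consistency check, a formation entropy near −1965 J/(mol·K) (an assumed value consistent with the paper's estimated range, not a reported measurement) reproduces one end of the reported ΔG°f range:

```python
def gibbs_of_formation_kJ(dH_f_kJ, dS_f_J_per_molK, T=298.15):
    """Standard Gibbs energy of formation, dG = dH - T*dS.

    dH in kJ/mol, dS in J/(mol K); dS is converted to kJ before combining.
    """
    return dH_f_kJ - T * dS_f_J_per_molK / 1000.0
```

With ΔH°f,ele = −6301.2 kJ/mol and an assumed ΔS°f = −1965.1 J/(mol·K), this evaluates to about −5715.3 kJ/mol, matching the upper end of the reported range.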
A Distributed Multi-Agent System for Collaborative Information Management and Learning
NASA Technical Reports Server (NTRS)
Chen, James R.; Wolfe, Shawn R.; Wragg, Stephen D.; Koga, Dennis (Technical Monitor)
2000-01-01
In this paper, we present DIAMS, a system of distributed, collaborative agents that help users access, manage, share and exchange information. A DIAMS personal agent helps its owner find the information most relevant to current needs. It provides tools and utilities for users to manage their information repositories with dynamic organization and virtual views. Flexible hierarchical display is integrated with indexed query search to support effective information access. Automatic indexing methods are employed to support user queries and communication between agents. The contents of a repository are kept in object-oriented storage to facilitate information sharing. Collaboration between users is aided by easy sharing utilities as well as automated information exchange. Matchmaker agents are designed to establish connections between users with similar interests and expertise. DIAMS agents provide the services users need to share and learn information from one another on the World Wide Web.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saleh, Lydia Ilaiza, E-mail: lydiailaiza@gmail.com; Ryong, Kim Tae
The whole life cycle of repository development and decommissioning requires the relevant bodies to have a financial system ensuring sufficient funds over the full life cycle (periods of many decades). The financing mechanism and management system must therefore reflect the national position, the institutional and legislative environment, technical capabilities, and the waste's origin, ownership, characteristics and inventories. The main objective of the studies is to focus on cost considerations, alternative funding managements and mechanisms, and the technical and non-technical factors that may affect repository life-cycle costs. In conclusion, this paper offers recommendations that national planners, regulatory bodies, engineers, and managers can apply in forming a financial management plan for the decommissioning of a nuclear installation.
CPTAC Assay Portal: a repository of targeted proteomic assays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiteaker, Jeffrey R.; Halusa, Goran; Hoofnagle, Andrew N.
2014-06-27
To address these issues, the Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as a public repository of well-characterized, quantitative, MS-based targeted proteomic assays. The purpose of the CPTAC Assay Portal is to facilitate widespread adoption of targeted MS assays by disseminating SOPs, reagents, and assay characterization data for highly characterized assays. A primary aim of the NCI-supported portal is to bring together clinicians or biologists and analytical chemists to answer hypothesis-driven questions using targeted MS-based assays. Assay content is easily accessed through queries and filters, enabling investigators to find assays for proteins relevant to their areas of interest. Detailed characterization data are available for each assay, enabling researchers to evaluate assay performance before launching the assay in their own laboratory.
A Preliminary Performance Assessment for Salt Disposal of High-Level Nuclear Waste - 12173
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Clayton, Daniel; Jove-Colon, Carlos
2012-07-01
A salt repository is one of the four geologic media currently under study by the U.S. DOE Office of Nuclear Energy to support the development of a long-term strategy for geologic disposal of commercial used nuclear fuel (UNF) and high-level radioactive waste (HLW). The immediate goal of the generic salt repository study is to develop the modeling tools needed to evaluate and improve understanding of the repository system response and the processes relevant to long-term disposal of UNF and HLW in a salt formation. The current phase of this study considers representative geologic settings and features adopted from previous studies of salt repository sites. For the reference scenario, the brine flow rates in the repository and underlying interbeds are very low, and transport of radionuclides in the transport pathways is dominated by diffusion and greatly retarded by sorption on the interbed filling materials. I-129 is the dominant annual dose contributor at the hypothetical accessible environment, but the calculated mean annual dose is negligibly small. For the human intrusion (or disturbed) scenario, the mean mass release rate and mean annual dose histories are very different from those for the reference scenario. Actinides including Pu-239, Pu-242 and Np-237 are the major annual dose contributors, and the calculated peak mean annual dose is acceptably low. A performance assessment model for a generic salt repository has been developed incorporating, where applicable, representative geologic settings and features adopted from literature data for salt repository sites. The conceptual model and scenario for radionuclide release and transport from a salt repository were developed using literature data. The salt GDS model was developed in a probabilistic analysis framework. The preliminary performance analysis demonstrating model capability assumes an isothermal, ambient-temperature condition in the near field.
The capability demonstration emphasizes key attributes of a salt repository that are potentially important to the long-term safe disposal of UNF and HLW. The analysis presents and discusses results showing repository responses to different radionuclide release scenarios (undisturbed and human intrusion). For the reference (nominal, undisturbed) scenario, the brine flow rates in the repository and underlying interbeds are very low, and transport of radionuclides in the transport pathways is dominated by diffusion and greatly retarded by sorption on the interbed filling materials. I-129 (non-sorbing, of unlimited solubility, and with a very long half-life) is the dominant annual dose contributor at the hypothetical accessible environment, but the calculated mean annual dose is so small that there is no meaningful consequence for repository performance. For the human intrusion (or disturbed) scenario analysis, the mean mass release rate and mean annual dose histories are very different from those for the reference scenario analysis. Compared to the reference scenario, the relative annual dose contributions by soluble, non-sorbing fission products, particularly I-129, are much lower than those by actinides, including Pu-239, Pu-242 and Np-237. The lower relative mean annual dose contributions by the fission product radionuclides are due to their lower total inventory available for release (i.e., from up to five affected waste packages); the higher mean annual doses from the actinides are the outcome of the direct release of these radionuclides into the overlying aquifer, which has high water flow rates, resulting in an early arrival of higher concentrations of the radionuclides at the biosphere drinking water well prior to their significant decay. The salt GDS model analysis has also identified the following recommendations and knowledge gaps to improve and enhance confidence in future repository performance analyses.
- Repository thermal loading by UNF and HLW, and the effect on engineered barrier and near-field performance.
- Closure and consolidation of salt rocks by creep deformation under the influence of thermal perturbation, and the effect on engineered barrier and near-field performance.
- Brine migration and radionuclide transport under the influence of thermal perturbation in a generic salt repository environment, and the effect on engineered barrier, near-field and far-field performance.
- Near-field geochemistry and radionuclide mobility in a generic salt repository environment (high ionic strength brines, elevated temperatures and chemically reducing conditions).
- Degradation of engineered barrier components (waste package, waste canister, waste forms, etc.) in a generic salt repository environment (high ionic strength brines, elevated temperatures and chemically reducing conditions).
- Waste stream types and inventory estimates, particularly for reprocessing high-level waste. (authors)
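The sorption retardation invoked in these analyses is commonly modeled with a linear-isotherm retardation factor, R = 1 + ρ_b·K_d/ε; the parameter values in the sketch below are illustrative, not salt GDS inputs:

```python
def retardation_factor(bulk_density_kg_m3, Kd_m3_kg, porosity):
    """Linear-isotherm retardation factor: R = 1 + rho_b * Kd / eps.

    A sorbing nuclide's effective transport velocity is the pore-water
    velocity divided by R, so large Kd (actinides) delays arrival, while
    Kd = 0 (e.g. I-129) gives R = 1 and no retardation.
    """
    return 1.0 + bulk_density_kg_m3 * Kd_m3_kg / porosity
```

For example, an illustrative Kd of 0.001 m³/kg in a medium of bulk density 2000 kg/m³ and porosity 0.1 yields R = 21, i.e. a 21-fold slowdown relative to a non-sorbing tracer.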
Integrating repositories with fuel cycles: The airport authority model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forsberg, C.
2012-07-01
The organization of the fuel cycle is a legacy of World War II and the Cold War. Fuel cycle facilities were developed and deployed without consideration of the waste management implications. This led to the fuel cycle model of a geological repository site with a single owner, a single function (disposal), and no other facilities on site. Recent studies indicate large economic, safety, repository performance, nonproliferation, and institutional incentives to collocate and integrate all back-end facilities. Site functions could include geological disposal of spent nuclear fuel (SNF) with the option for future retrievability, disposal of other wastes, reprocessing with fuel fabrication, radioisotope production, other facilities that generate significant radioactive wastes, SNF inspection (navy and commercial), and related services such as SNF safeguards equipment testing and training. This implies a site with multiple facilities, with different owners sharing some facilities and using common facilities (the repository and SNF receiving), which requires a different repository site institutional structure. We propose the development of repository site authorities modeled after airport authorities. Airport authorities manage airports with government-owned runways, collocated or shared public and private airline terminals, commercial and federal military facilities, aircraft maintenance bases, and related operations, all enabled by, and benefiting from, the high-value runway asset and access to it via taxiways. With a repository site authority, the high-value asset is the repository. The SNF and HLW receiving and storage facilities (equivalent to the airport terminal) serve the repository, any future reprocessing plants, and others needing access to SNF and other wastes. Non-public, special-built roadways and on-site rail lines (equivalent to taxiways) connect facilities.
Airport authorities are typically chartered by state governments and managed by commissions with members appointed by the state governor, county governments, and city governments. This structure (1) enables state and local governments to work together to maximize job and tax benefits to local communities and the state, (2) provides a mechanism to address local concerns such as airport noise, and (3) creates an institutional structure with large incentives to maximize the value of the common asset, the runway. A repository site authority would have a similar structure and be the local interface to any national waste management authority. (authors)« less
Staudt, C; Semiochkina, N; Kaiser, J C; Pröhl, G
2013-01-01
Biosphere models are used to evaluate the exposure of populations to radionuclides from a deep geological repository. Since the time frame for assessments of long-term disposal safety is 1 million years, potential future climate changes need to be accounted for. Potential future climate conditions were defined for northern Germany according to model results from the BIOCLIM project. Nine present-day reference climate regions were defined to cover those future climate conditions. A biosphere model was developed according to the BIOMASS methodology of the IAEA, and model parameters were adjusted to the conditions at the reference climate regions. The model includes exposure pathways common to those reference climate regions in a stylized biosphere and relevant to the exposure of a hypothetical self-sustaining population at the site of potential radionuclide contamination from a deep geological repository. The end points of the model are Biosphere Dose Conversion Factors (BDCF) for a range of radionuclides and scenarios, normalized to a constant radionuclide concentration in near-surface groundwater. Model results suggest increased exposure in dry climate regions, with a strong influence of drinking water consumption rates and the amount of irrigation water used in agriculture. Copyright © 2012 Elsevier Ltd. All rights reserved.
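A BDCF is applied by multiplying it with the groundwater concentration and summing over radionuclides; the nuclide names and values below are placeholders for illustration, not results of the study:

```python
def total_annual_dose_Sv(conc_Bq_m3, bdcf_Sv_per_yr_per_Bq_m3):
    """Annual dose = sum over nuclides of C_i * BDCF_i.

    conc_Bq_m3: groundwater concentration per nuclide (Bq/m^3).
    bdcf: BDCF per nuclide (Sv/yr per Bq/m^3). Both dicts share keys.
    """
    return sum(conc_Bq_m3[n] * bdcf_Sv_per_yr_per_Bq_m3[n]
               for n in conc_Bq_m3)
```

Because the BDCFs are normalized to a unit groundwater concentration, the geosphere transport model only needs to supply concentrations at the near-surface aquifer to obtain doses for any scenario.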
10 CFR 60.44 - Changes, tests, and experiments.
Code of Federal Regulations, 2011 CFR
2011-01-01
§ 60.44 Changes, tests, and experiments (10 CFR Part 60, Disposal of High-Level Radioactive Wastes in Geologic Repositories; Licenses, License Issuance and Amendment). (a)(1) Following ... experiments not described in the application, without prior Commission approval, provided the change, test, or ...
10 CFR 60.44 - Changes, tests, and experiments.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 2 2012-01-01 2012-01-01 false Changes, tests, and experiments. 60.44 Section 60.44... REPOSITORIES Licenses License Issuance and Amendment § 60.44 Changes, tests, and experiments. (a)(1) Following... experiments not described in the application, without prior Commission approval, provided the change, test, or...
10 CFR 60.44 - Changes, tests, and experiments.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 2 2013-01-01 2013-01-01 false Changes, tests, and experiments. 60.44 Section 60.44... REPOSITORIES Licenses License Issuance and Amendment § 60.44 Changes, tests, and experiments. (a)(1) Following... experiments not described in the application, without prior Commission approval, provided the change, test, or...
10 CFR 60.44 - Changes, tests, and experiments.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 2 2014-01-01 2014-01-01 false Changes, tests, and experiments. 60.44 Section 60.44... REPOSITORIES Licenses License Issuance and Amendment § 60.44 Changes, tests, and experiments. (a)(1) Following... experiments not described in the application, without prior Commission approval, provided the change, test, or...
Geomechanical Considerations for the Deep Borehole Field Test
NASA Astrophysics Data System (ADS)
Park, B. Y.
2015-12-01
Deep borehole disposal of high-level radioactive waste is under consideration as a potential alternative to shallower mined repositories. The disposal concept consists of drilling a borehole into crystalline basement rocks to a depth of 5 km, emplacement of canisters containing solid waste in the lower 2 km, and plugging and sealing the upper 3 km of the borehole. Crystalline rocks such as granites are particularly attractive for borehole emplacement because of their low permeability and porosity at depth, and high mechanical strength to resist borehole deformation. In addition, high overburden pressures contribute to sealing of some of the fractures that provide transport pathways. We present geomechanical considerations during construction (e.g., borehole breakouts, disturbed rock zone development, and creep closure), relevant to both the smaller-diameter characterization borehole (8.5") and the larger-diameter field test borehole (17"). Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
NASA Astrophysics Data System (ADS)
Brouwer, Albert; Brown, David; Tomuta, Elena
2017-04-01
To detect nuclear explosions, waveform data from over 240 SHI stations world-wide flows into the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), located in Vienna, Austria. A complex pipeline of software applications processes this data in numerous ways to form event hypotheses. The software codebase comprises over 2 million lines of code, reflects decades of development, and is subject to frequent enhancement and revision. Since processing must run continuously and reliably, software changes are subjected to thorough testing before being put into production. To overcome the limitations and cost of manual testing, the Continuous Automated Testing System (CATS) has been created. CATS provides an isolated replica of the IDC processing environment, and is able to build and test different versions of the pipeline software directly from code repositories that are placed under strict configuration control. Test jobs are scheduled automatically when code repository commits are made. Regressions are reported. We present the CATS design choices and test methods. Particular attention is paid to how the system accommodates the individual testing of strongly interacting software components that lack test instrumentation.
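The commit-triggered scheduling and regression reporting described above can be sketched roughly as follows. This is an illustrative toy, not the actual CATS implementation; the `CommitTester` class and `run_tests` callback are invented names for the example.

```python
# Toy sketch of commit-triggered testing with regression detection
# (illustrative only; not the CATS codebase).

def detect_regressions(baseline, current):
    """Return names of tests that passed in the baseline but fail now."""
    return sorted(name for name, passed in current.items()
                  if not passed and baseline.get(name, False))

class CommitTester:
    def __init__(self, run_tests):
        # run_tests: callable mapping a commit id to {test_name: passed}
        self.run_tests = run_tests
        self.baseline = {}
        self.seen = set()

    def on_commit(self, commit_id):
        """Schedule a test run for an unseen commit and report regressions."""
        if commit_id in self.seen:
            return []                      # already tested; nothing to do
        self.seen.add(commit_id)
        results = self.run_tests(commit_id)
        regressions = detect_regressions(self.baseline, results)
        if not regressions:                # only advance the baseline on clean runs
            self.baseline = results
        return regressions
```

In a real system the `run_tests` callback would check out the commit from the configuration-controlled repository and execute the pipeline in the isolated replica environment; here it is simply a function the caller supplies.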
Next-Generation Search Engines for Information Retrieval
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devarakonda, Ranjeet; Hook, Leslie A; Palanisamy, Giri
In recent years, there have been significant advancements in the areas of scientific data management and retrieval techniques, particularly in terms of standards and protocols for archiving data and metadata. Scientific data is rich and spread across different places. In order to integrate these pieces together, a data archive and associated metadata should be generated. Data should be stored in a retrievable format and, more importantly, in a format that will continue to be accessible as technology changes, such as XML. While general-purpose search engines (such as Google or Bing) are useful for finding many things on the Internet, they are often of limited usefulness for locating Earth Science data relevant (for example) to a specific spatiotemporal extent. By contrast, tools that search repositories of structured metadata can locate relevant datasets with fairly high precision, but the search is limited to that particular repository. Federated searches (such as Z39.50) have been used, but can be slow, and their comprehensiveness can be limited by downtime in any search partner. An alternative approach to improve comprehensiveness is for a repository to harvest metadata from other repositories, possibly with limits based on subject matter or access permissions. Searches through harvested metadata can be extremely responsive, and the search tool can be customized with semantic augmentation appropriate to the community of practice being served. One such system is Mercury, a metadata harvesting, data discovery, and access system built for researchers to search for, share, and obtain spatiotemporal data used across a range of climate and ecological sciences. Mercury is an open-source toolset with a backend built on Java; its search capability is supported by popular open-source search libraries such as SOLR and LUCENE.
Mercury harvests the structured metadata and key data from several data-providing servers around the world and builds a centralized index. The harvested files are indexed consistently through the SOLR search API, so that Mercury can provide simple, fielded, spatial, and temporal searches across projects spanning land, atmosphere, and ocean ecology. Mercury also provides data sharing capabilities using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH). In this paper we discuss best practices for archiving data and metadata, new searching techniques, efficient ways of data retrieval, and information display.
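The harvest-then-index pattern described in this abstract can be sketched in miniature: parse a metadata response and build a fielded inverted index from it. The XML below is a simplified stand-in for a real OAI-PMH ListRecords response (real responses wrap Dublin Core in an `oai_dc:dc` container), and the endpoint, titles, and subjects are invented for illustration.

```python
# Minimal sketch of metadata harvesting plus fielded indexing
# (simplified record structure; not Mercury's actual schema).
import xml.etree.ElementTree as ET
from collections import defaultdict

DC = "{http://purl.org/dc/elements/1.1/}"

SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
 <ListRecords>
  <record><metadata>
   <dc xmlns="http://purl.org/dc/elements/1.1/">
    <title>Ocean carbon flux</title><subject>ocean</subject>
   </dc>
  </metadata></record>
  <record><metadata>
   <dc xmlns="http://purl.org/dc/elements/1.1/">
    <title>Boreal soil respiration</title><subject>land</subject>
   </dc>
  </metadata></record>
 </ListRecords>
</OAI-PMH>"""

def harvest(xml_text):
    """Yield (title, subject) pairs from a simplified ListRecords response."""
    root = ET.fromstring(xml_text)
    for rec in root.iter(DC + "dc"):
        yield (rec.findtext(DC + "title", default=""),
               rec.findtext(DC + "subject", default=""))

def build_index(records):
    """Fielded inverted index: (field, term) -> sorted list of doc ids."""
    index = defaultdict(list)
    for doc_id, (title, subject) in enumerate(records):
        for field, text in (("title", title), ("subject", subject)):
            for term in text.lower().split():
                index[(field, term)].append(doc_id)
    return index
```

A production system such as the one described would hand the harvested documents to SOLR rather than an in-memory dictionary, but the division of labor (harvest, normalize, index by field) is the same.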
Colloid formation during waste form reaction: Implications for nuclear waste disposal
Bates, J. K.; Bradley, J.; Teetsov, A.; Bradley, C. R.; Buchholtz ten Brink, Marilyn R.
1992-01-01
Insoluble plutonium- and americium-bearing colloidal particles formed during simulated weathering of a high-level nuclear waste glass. Nearly 100 percent of the total plutonium and americium in test ground water was concentrated in these submicrometer particles. These results indicate that models of actinide mobility and repository integrity, which assume complete solubility of actinides in ground water, underestimate the potential for radionuclide release into the environment. A colloid-trapping mechanism may be necessary for a waste repository to meet long-term performance specifications.
NASA Astrophysics Data System (ADS)
Sun, A. Y.; Islam, A.; Lu, J.
2017-12-01
Time-lapse oscillatory pumping testing (OPT) has recently been introduced as a pressure-based monitoring technique for detecting potential leakage in geologic repositories. By routinely conducting OPT at a number of pulsing frequencies, a site operator may identify potential anomalies in the frequency domain, alleviating the ambiguity caused by reservoir noise and improving the signal-to-noise ratio. Building on previous theoretical and field studies, this work performed a series of laboratory experiments to validate the concept of time-lapse OPT using a custom-made stainless steel tank under relatively high pressures (~120 psi). The experimental configuration simulates a miniature geologic storage repository consisting of three layers (i.e., injection zone, caprock, and above-zone aquifer). Results show that leakage in the injection zone led to deviations in the power spectrum of observed pressure data, whose amplitude increased with decreasing pulsing frequency. The experimental results were further analyzed with a 3D flow model, whose parameters were estimated through frequency-domain inversion.
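The core frequency-domain comparison can be illustrated with a short sketch: extract the spectral amplitude of the pressure response at the pulsing frequency and compare a baseline survey against a later one. The signal values and the 20% attenuation are invented for the example; this is not the authors' inversion code.

```python
# Sketch of time-lapse comparison of pressure-response amplitude at the
# pulsing frequency (synthetic signals; illustrative only).
import math

def amplitude_at(signal, freq, dt):
    """Discrete Fourier amplitude of a real signal at one frequency (Hz)."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * k * dt) for k, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * k * dt) for k, s in enumerate(signal))
    return 2 * math.hypot(re, im) / n

def leakage_indicator(baseline, survey, freq, dt):
    """Relative change in response amplitude at the pulsing frequency."""
    a0 = amplitude_at(baseline, freq, dt)
    a1 = amplitude_at(survey, freq, dt)
    return (a1 - a0) / a0

# Synthetic pressure responses: the later "leaky" survey is attenuated by 20%.
baseline = [math.cos(2 * math.pi * 0.05 * k) for k in range(200)]
survey = [0.8 * math.cos(2 * math.pi * 0.05 * k) for k in range(200)]
change = leakage_indicator(baseline, survey, freq=0.05, dt=1.0)  # ~ -0.20
```

In the study's setting the deviation would be read from the full power spectrum across several pulsing frequencies rather than a single bin, but the time-lapse comparison is the same in spirit.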
Chan, Wing Cheuk; Jackson, Gary; Wright, Craig Shawe; Orr-Walker, Brandon; Drury, Paul L; Boswell, D Ross; Lee, Mildred Ai Wei; Papa, Dean; Jackson, Rod
2014-01-01
Objectives To determine the diabetes screening levels and known glycaemic status of all individuals by age, gender and ethnicity within a defined geographic location in a timely and consistent way to potentially facilitate systematic disease prevention and management. Design Retrospective observational study. Setting Auckland region of New Zealand. Participants 1 475 347 people who had utilised publicly funded health services in New Zealand and were domiciled in the Auckland region of New Zealand in 2010. The health service utilisation population was individually linked to a comprehensive regional laboratory repository dating back to 2004. Outcome measures The two outcome measures were glycaemia-related blood testing coverage (glycated haemoglobin (HbA1c), fasting and random glucose and glucose tolerance tests), and the proportions and number of people with known dysglycaemia in 2010 using modified American Diabetes Association (ADA) and WHO criteria. Results Within the health service utilisation population, 792 560 people had had at least one glucose or HbA1c blood test in the previous 5.5 years. Overall, 81% of males (n=198 086) and 87% of females (n=128 982) in the recommended age groups for diabetes screening had a blood test to assess their glycaemic status. The estimated age-standardised prevalence of dysglycaemia was highest in people of Pacific Island ethnicity at 11.4% (95% CI 11.2% to 11.5%) for males and 11.6% (11.4% to 11.8%) for females, followed closely by people of Indian ethnicity at 10.8% (10.6% to 11.1%) and 9.3% (9.1% to 9.6%), respectively. Among the indigenous Maori population, the prevalence was 8.2% (7.9% to 8.4%) and 7% (6.8% to 7.2%), while for ‘Others’ (mainly Europeans) it was 3% (3% to 3.1%) and 2.2% (2.1% to 2.2%), respectively.
Conclusions We have demonstrated that the data linkage between a laboratory repository and national administrative datasets has the potential to provide systematic and consistent individual-level clinical information that is relevant to medical auditing for a large, geographically defined population. PMID:24776708
Chan, Wing Cheuk; Jackson, Gary; Wright, Craig Shawe; Orr-Walker, Brandon; Drury, Paul L; Boswell, D Ross; Lee, Mildred Ai Wei; Papa, Dean; Jackson, Rod
2014-04-28
To determine the diabetes screening levels and known glycaemic status of all individuals by age, gender and ethnicity within a defined geographic location in a timely and consistent way to potentially facilitate systematic disease prevention and management. Retrospective observational study. Auckland region of New Zealand. 1 475 347 people who had utilised publicly funded health services in New Zealand and were domiciled in the Auckland region of New Zealand in 2010. The health service utilisation population was individually linked to a comprehensive regional laboratory repository dating back to 2004. The two outcome measures were glycaemia-related blood testing coverage (glycated haemoglobin (HbA1c), fasting and random glucose and glucose tolerance tests), and the proportions and number of people with known dysglycaemia in 2010 using modified American Diabetes Association (ADA) and WHO criteria. Within the health service utilisation population, 792 560 people had had at least one glucose or HbA1c blood test in the previous 5.5 years. Overall, 81% of males (n=198 086) and 87% of females (n=128 982) in the recommended age groups for diabetes screening had a blood test to assess their glycaemic status. The estimated age-standardised prevalence of dysglycaemia was highest in people of Pacific Island ethnicity at 11.4% (95% CI 11.2% to 11.5%) for males and 11.6% (11.4% to 11.8%) for females, followed closely by people of Indian ethnicity at 10.8% (10.6% to 11.1%) and 9.3% (9.1% to 9.6%), respectively. Among the indigenous Maori population, the prevalence was 8.2% (7.9% to 8.4%) and 7% (6.8% to 7.2%), while for 'Others' (mainly Europeans) it was 3% (3% to 3.1%) and 2.2% (2.1% to 2.2%), respectively.
We have demonstrated that the data linkage between a laboratory repository and national administrative datasets has the potential to provide systematic and consistent individual-level clinical information that is relevant to medical auditing for a large, geographically defined population.
2011-03-28
particular topic of interest. Paper-based documents require the availability of a physical instance of a document, involving the transport of documents...repository of documents via the World Wide Web and search engines offer support in locating documents that are likely to contain relevant information. The... Web, with news agencies, newspapers, various organizations, and individuals as sources. Clearly the analysis, interpretation, and integration of
2016-01-01
medical service (such as obstetric delivery), or a specific technology (such as robotic surgery apparatus), is not available at the MTF. All 13...operating theaters and lack of robotic surgery capabilities at WBAMC. We found three relevant agreements in the MEDCOM central repository...civilian facilities offer was superior to what was available (and justifiable) at the MTF—notably, robotic surgery capability. Not only do these
OntoCR: A CEN/ISO-13606 clinical repository based on ontologies.
Lozano-Rubí, Raimundo; Muñoz Carrero, Adolfo; Serrano Balazote, Pablo; Pastor, Xavier
2016-04-01
To design a new semantically interoperable clinical repository, based on ontologies, conforming to CEN/ISO 13606 standard. The approach followed is to extend OntoCRF, a framework for the development of clinical repositories based on ontologies. The meta-model of OntoCRF has been extended by incorporating an OWL model integrating CEN/ISO 13606, ISO 21090 and SNOMED CT structure. This approach has demonstrated a complete evaluation cycle involving the creation of the meta-model in OWL format, the creation of a simple test application, and the communication of standardized extracts to another organization. Using a CEN/ISO 13606 based system, an indefinite number of archetypes can be merged (and reused) to build new applications. Our approach, based on the use of ontologies, maintains data storage independent of content specification. With this approach, relational technology can be used for storage, maintaining extensibility capabilities. The present work demonstrates that it is possible to build a native CEN/ISO 13606 repository for the storage of clinical data. We have demonstrated semantic interoperability of clinical information using CEN/ISO 13606 extracts. Copyright © 2016 Elsevier Inc. All rights reserved.
Zheng, Liange; Samper, Javier; Montenegro, Luis
2011-09-25
The performance assessment of a geological repository for radioactive waste requires quantifying the geochemical evolution of the bentonite engineered barrier. This barrier will be exposed to coupled thermal (T), hydrodynamic (H), mechanical (M) and chemical (C) processes. This paper presents a coupled THC model of the FEBEX (Full-scale Engineered Barrier EXperiment) in situ test which accounts for bentonite swelling and chemical and thermal osmosis. Model results attest to the relevance of thermal osmosis and bentonite swelling for the geochemical evolution of the bentonite barrier, while chemical osmosis is found to be almost irrelevant. The model has been tested with data collected after the dismantling of heater 1 of the in situ test. The model reproduces reasonably well the measured temperature, relative humidity, water content and inferred geochemical data. However, it fails to mimic the solute concentrations at the heater-bentonite and bentonite-granite interfaces because the model does not account for the volume change of bentonite, the CO2(g) degassing and the transport of vapor from the bentonite into the granite. The inferred HCO3- and pH data cannot be explained solely by solute transport, calcite dissolution and protonation/deprotonation by surface complexation, suggesting that such data may be affected also by other reactions. Published by Elsevier B.V.
10 CFR 60.142 - Design testing.
Code of Federal Regulations, 2010 CFR
2010-01-01
... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2010-01-01 2010-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.142 - Design testing.
Code of Federal Regulations, 2013 CFR
2013-01-01
... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2013-01-01 2013-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.142 - Design testing.
Code of Federal Regulations, 2012 CFR
2012-01-01
... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2012-01-01 2012-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.142 - Design testing.
Code of Federal Regulations, 2014 CFR
2014-01-01
... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2014-01-01 2014-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.142 - Design testing.
Code of Federal Regulations, 2011 CFR
2011-01-01
... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2011-01-01 2011-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Liange; Rutqvist, Jonny; Xu, Hao
The focus of research within the Spent Fuel and Waste Science and Technology (SFWST) (formerly called Used Fuel Disposal) Campaign is on repository-induced interactions that may affect the key safety characteristics of EBS bentonite and an argillaceous rock. These include thermal-hydrological-mechanical-chemical (THMC) process interactions that occur as a result of repository construction and waste emplacement. Some of the key questions addressed in this report include the development of fracturing in the excavation damaged zone (EDZ) and THMC effects on the near-field argillaceous rock and buffer materials and their petrophysical characteristics, particularly the impacts of the temperature rise caused by waste heat. This report documents the following research activities. Section 2 presents THM model developments and validation, including modeling of underground heater experiments at Mont Terri and Bure underground research laboratories (URLs). The heater experiments modeled are the Mont Terri FE (Full-scale Emplacement) Experiment, conducted as part of the Mont Terri Project, and the TED heater test conducted in Callovo-Oxfordian claystone (COx) at the Meuse/Haute-Marne (MHM) underground research laboratory in France. The modeling of the TED heater test is one of the Tasks of the DEvelopment of COupled Models and their VAlidation against EXperiments (DECOVALEX)-2019 project. Section 3 presents the development and application of thermal-hydrological-mechanical-chemical (THMC) modeling to evaluate EBS bentonite and argillite rock responses under different temperatures (100 °C and 200 °C). Model results are presented to help to understand the impact of high temperatures on the properties and behavior of bentonite and argillite rock. Eventually the process model will support a robust GDSA model for repository performance assessments.
Section 4 presents coupled THMC modeling for an in situ test conducted at Grimsel underground laboratory in Switzerland in the Full-Scale Engineered Barrier Experiment Dismantling Project (FEBEX-DP). The data collected in the test after almost two decades of heating and two dismantling events provide a unique opportunity to validate coupled THMC models and enhance our understanding of coupled THMC processes in EBS bentonite. Section 5 presents a planned large in-situ test, “HotBENT,” at the Grimsel Test Site, Switzerland. In this test, a bentonite-backfilled EBS in granite will be heated up to 200 °C, under conditions in which the most relevant features of future emplacement conditions can be adequately reproduced. Lawrence Berkeley National Laboratory (LBNL) has participated actively in the project from the beginning and conducted scoping calculations in FY17 to facilitate the final design of the experiment. Section 6 presents LBNL’s activities for modeling gas migration in clay related to Task A of the international DECOVALEX-2019 project. This is an international collaborative activity in which DOE and LBNL gain access to unique laboratory and field data on gas migration, which are studied with numerical modeling to better understand the processes and to improve numerical models that could eventually be applied in the performance assessment for nuclear waste disposal in clay host rocks and bentonite backfill. Section 7 summarizes the main research accomplishments for FY17 and proposes future work activities.
COMODI: an ontology to characterise differences in versions of computational models in biology.
Scharm, Martin; Waltemath, Dagmar; Mendes, Pedro; Wolkenhauer, Olaf
2016-07-11
Open model repositories provide ready-to-reuse computational models of biological systems. Models within those repositories evolve over time, leading to different model versions. Taken together, the underlying changes reflect a model's provenance and thus can give valuable insights into the studied biology. Currently, however, changes cannot be semantically interpreted. To improve this situation, we developed an ontology of terms describing changes in models. The ontology can be used by scientists and within software to characterise model updates at the level of single changes. When studying or reusing a model, these annotations help with determining the relevance of a change in a given context. We manually studied changes in selected models from BioModels and the Physiome Model Repository. Using the BiVeS tool for difference detection, we then performed an automatic analysis of changes in all models published in these repositories. The resulting set of concepts led us to define candidate terms for the ontology. In a final step, we aggregated and classified these terms and built the first version of the ontology. We present COMODI, an ontology needed because COmputational MOdels DIffer. It empowers users and software to describe changes in a model on the semantic level. COMODI also enables software to implement user-specific filter options for the display of model changes. Finally, COMODI is a step towards predicting how a change in a model influences the simulation results. COMODI, coupled with our algorithm for difference detection, ensures the transparency of a model's evolution, and it enhances the traceability of updates and error corrections. COMODI is encoded in OWL. It is openly available at http://comodi.sems.uni-rostock.de/ .
Long-Term Information Management (LTIM) of Safeguards Data at Repositories: Phase II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haddal, Risa N.
One of the challenges of implementing safeguards for geological repositories will be the long-term preservation of safeguards-related data for 100 years or more. While most countries considering the construction and operation of such facilities agree that safeguards information should be preserved, there are gaps with respect to standardized requirements, guidelines, timescales, and approaches. This study analyzes those gaps and explores research to clarify stakeholder needs, identify current policies, approaches, best practices and international standards, and explores existing safeguards information management infrastructure. The study also attempts to clarify what a safeguards data classification system might look like, how long data should be retained, and how information should be exchanged between stakeholders at different phases of a repository’s life cycle. The analysis produced a variety of recommendations on what information to preserve, how to preserve it, where to store it, retention options and how to exchange information in the long term. Key findings include the use of the globally recognized international records management standard, ISO15489, for guidance on the development of information management systems, and the development of a Key Information File (KIF). The KIF could be used to identify only the most relevant, high-level safeguards information and the history of decision making about the repository. The study also suggests implementing on-site and off-site records storage in digital and physical form; developing a safeguards data classification system; long-term records retention with periodic reviews every 5 to 10 years during each phase of the repository life cycle; and establishing transition procedures well in advance so that data shepherds and records officers can transfer information with incoming facility managers effectively and efficiently. These and other recommendations are further analyzed in this study.
Caballero, Carla; Mistry, Sejal; Vero, Joe; Torres, Elizabeth B
2018-01-01
The variability inherently present in biophysical data is partly contributed by disparate sampling resolutions across instrumentations. This poses a potential problem for statistical inference using pooled data in open access repositories. Such repositories combine data collected from multiple research sites using variable sampling resolutions. One example is the Autism Brain Imaging Data Exchange repository containing thousands of imaging and demographic records from participants in the spectrum of autism and age-matched neurotypical controls. Further, statistical analyses of groups from different diagnoses and demographics may be challenging, owing to the disparate number of participants across different clinical subgroups. In this paper, we examine the noise signatures of head motion data extracted from resting state fMRI data harnessed under different sampling resolutions. We characterize the quality of the noise in the variability of the raw linear and angular speeds for different clinical phenotypes in relation to age-matched controls. Further, we use bootstrapping methods to ensure compatible group sizes for statistical comparison and report the ranges of physical involuntary head excursions of these groups. We conclude that different sampling rates do affect the quality of noise in the variability of head motion data and, consequently, the type of random process appropriate to characterize the time series data. Further, given a qualitative range of noise, from pink to brown noise, it is possible to characterize different clinical subtypes and distinguish them in relation to ranges of neurotypical controls. These results may be relevant to the pre-processing stages of resting-state fMRI analysis pipelines, where head motion is among the criteria for cleaning imaging data of motion artifacts. PMID:29556179
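The bootstrapping step mentioned above, equalizing group sizes for statistical comparison, can be sketched in a few lines. The group names and values here are made up for illustration; the point is resampling each subgroup with replacement to a common size before comparing summary statistics.

```python
# Minimal bootstrap sketch for comparing groups of very different sizes
# (invented data; illustrative only).
import random
from statistics import median

def bootstrap_medians(samples, size, n_boot=500, seed=0):
    """Distribution of medians over n_boot resamples of `size` draws each."""
    rng = random.Random(seed)
    return [median(rng.choices(samples, k=size)) for _ in range(n_boot)]

groups = {
    "controls": [0.2 + 0.01 * i for i in range(400)],  # large group
    "subgroup": [0.9 + 0.01 * i for i in range(35)],   # small group
}
common = min(len(v) for v in groups.values())          # common resample size
boot = {name: bootstrap_medians(vals, common) for name, vals in groups.items()}
```

Resampling both groups at the smaller group's size puts their statistics on an equal footing, at the cost of discarding some precision from the larger group; the resulting bootstrap distributions can then be compared directly.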
Störmer, M; Arroyo, A; Brachert, J; Carrero, H; Devine, D; Epstein, J S; Gabriel, C; Gelber, C; Goodrich, R; Hanschmann, K-M; Heath, D G; Jacobs, M R; Keil, S; de Korte, D; Lambrecht, B; Lee, C-K; Marcelis, J; Marschner, S; McDonald, C; McGuane, S; McKee, M; Müller, T H; Muthivhi, T; Pettersson, A; Radziwon, P; Ramirez-Arcos, S; Reesink, H W; Rojo, J; Rood, I; Schmidt, M; Schneider, C K; Seifried, E; Sicker, U; Wendel, S; Wood, E M; Yomtovian, R A; Montag, T
2012-01-01
Bacterial contamination of platelet concentrates (PCs) still remains a significant problem in transfusion with potential important clinical consequences, including death. The International Society of Blood Transfusion Working Party on Transfusion-Transmitted Infectious Diseases, Subgroup on Bacteria, organised an international study on Transfusion-Relevant Bacteria References to be used as a tool for development, validation and comparison of both bacterial screening and pathogen reduction methods. Four Bacteria References (Staphylococcus epidermidis PEI-B-06, Streptococcus pyogenes PEI-B-20, Klebsiella pneumoniae PEI-B-08 and Escherichia coli PEI-B-19) were selected regarding their ability to proliferate to high counts in PCs and distributed anonymised to 14 laboratories in 10 countries for identification, enumeration and bacterial proliferation in PCs after low spiking (0·3 and 0·03 CFU/ml), to simulate contamination occurring during blood donation. Bacteria References were correctly identified in 98% of all 52 identifications. S. pyogenes and E. coli grew in PCs in 11 out of 12 laboratories, and K. pneumoniae and S. epidermidis replicated in all participating laboratories. The results of bacterial counts were very consistent between laboratories: the 95% confidence intervals were for S. epidermidis: 1·19-1·32 × 10(7) CFU/ml, S. pyogenes: 0·58-0·69 × 10(7) CFU/ml, K. pneumoniae: 18·71-20·26 × 10(7) CFU/ml and E. coli: 1·78-2·10 × 10(7) CFU/ml. The study was undertaken as a proof of principle with the aim to demonstrate (i) the quality, stability and suitability of the bacterial strains for low-titre spiking of blood components, (ii) the property of donor-independent proliferation in PCs, and (iii) their suitability for worldwide shipping of deep frozen, blinded pathogenic bacteria. These aims were successfully fulfilled. 
The WHO Expert Committee on Biological Standardisation has approved the adoption of these four bacterial strains as the first Repository for Transfusion-Relevant Bacteria Reference Strains and, additionally, endorsed as a project the addition of six further bacterial strain preparations suitable for the control of platelet contamination as the next step in enlarging the repository. © 2011 The Author(s). Vox Sanguinis © 2011 International Society of Blood Transfusion.
Citing a Data Repository: A Case Study of the Protein Data Bank
Huang, Yi-Hung; Rose, Peter W.; Hsu, Chun-Nan
2015-01-01
The Protein Data Bank (PDB) is the worldwide repository of 3D structures of proteins, nucleic acids and complex assemblies. The PDB’s large corpus of data (> 100,000 structures) and related citations provide a well-organized and extensive test set for developing and understanding data citation and access metrics. In this paper, we present a systematic investigation of how authors cite PDB as a data repository. We describe a novel metric, based on information cascades constructed by exploring the citation network, that measures influence between competing works, and we apply it to analyze different data citation practices for PDB. Based on this new metric, we found that the original publication of RCSB PDB in the year 2000 continues to attract most citations even though many follow-up updates were published. None of these follow-up publications by members of the wwPDB organization can compete with the original publication in terms of citations and influence. Meanwhile, authors increasingly choose to use URLs of PDB in the text instead of citing PDB papers, disrupting the growth of literature citations. A comparison of data usage statistics and paper citations shows that PDB Web access is highly correlated with URL mentions in the text. The results reveal trends in how authors cite a biomedical data repository and may provide useful insight into how to measure the impact of a data repository. PMID:26317409
Theory and Modelling Resources Cookbook
NASA Astrophysics Data System (ADS)
Gray, Norman
This cookbook is intended to assemble references to resources likely to be of interest to theorists and modellers. It's not a collection of standard recipes, but instead a repository of brief introductions to facilities. It includes references to sources of authoritative information, including those Starlink documents most likely to be of interest to theorists. Although the topics are chosen for their relevance to theoretical work, a good proportion of the information should be of interest to all of the astronomical computing community.
An evaluation of the process and initial impact of disseminating a nursing e-thesis.
Macduff, Colin
2009-05-01
This paper is a report of a study conducted to evaluate product, process and outcome aspects of the dissemination of a nursing PhD thesis via an open-access electronic institutional repository. Despite the growth of university institutional repositories, which make theses easily accessible via the world wide web, nursing has been very slow to evaluate related processes and outcomes. Drawing on Stake's evaluation research methods, a case study design was adopted. The case is described using a four-phase structure within which key aspects of process and impact are reflexively analysed. In the conceptualization/re-conceptualization phase, fundamental questions about the purpose, format and imagined readership for a published nursing PhD were considered. In the preparation phase, seven key practical processes were identified that are likely to be relevant to most e-theses. In the dissemination phase, email invitations were primarily used to invite engagement. The evaluation phase involved quantitative indicators of initial impact, such as page viewing and download statistics, as well as qualitative feedback on processes and product. Analysis of process and impact elements of e-thesis dissemination is likely to have more than intrinsic value. The advent of e-theses housed in web-based institutional repositories has the potential to transform thesis access and use. It also offers potential to transform the nature and scope of thesis production and dissemination. Nursing scholars can exploit and evaluate such opportunities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meike, A.; Stroes-Gascoyne, S.
2000-08-01
A workshop on Microbial Activities at Yucca Mountain (May 1995, Lafayette, CA) was held to compile information on all pertinent aspects of microbial activity for application to a potential repository at Yucca Mountain. The findings of this workshop set off a number of efforts intended eventually to incorporate the impacts of microbial behavior into performance assessment models. One effort was to expand an existing modeling approach to include the distinctive characteristics of a repository at Yucca Mountain (e.g., unsaturated conditions and a significant thermal load). At the same time, a number of experimental studies were initiated, along with a compilation of relevant literature, to more thoroughly study the physical, chemical and biological parameters that would affect microbial activity under Yucca Mountain-like conditions. This literature search (completed in 1996) is the subject of the present document. The collected literature can be divided into four categories: (1) abiotic factors, (2) community dynamics and in-situ considerations, (3) nutrient considerations and (4) transport of radionuclides. The complete bibliography represents a considerable resource, but is too large to be discussed in one document. Therefore, the present report focuses on the first category, abiotic factors, and discusses these factors in order to facilitate the development of a model for Yucca Mountain.
The SKI repository performance assessment project Site-94
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andersson, J.; Dverstorp, B.; Sjoeblom, R.
1995-12-01
SITE-94 is a research project conducted as a performance assessment of a hypothetical repository for spent nuclear fuel, but with real pre-excavation data from a real site. The geosphere, the engineered barriers and the processes for radionuclide release and transport comprise an integrated, interdependent system, which is described by a process influence diagram (PID) that reflects how different Features, Events or Processes (FEPs) inside the system interact. Site evaluation is used to derive information on transport paths in the geosphere and on geosphere interaction with the engineered barriers. A three-dimensional geological structure model of the site, as well as alternative conceptual models consistent with the existing hydrological field data, have been analyzed. Groundwater chemistry is evaluated, and a model for the origin of the different waters, fairly consistent with the flow model, has been developed. The geological structure model is also used for analyzing the mechanical stability of the site. Several phenomena of relevance for copper corrosion in a repository environment have been investigated. For Reference Case conditions and regardless of flow variability, output is dominated by I-129, which, for a single canister, may give rise to drinking-water well doses on the order of 10^-6 Sv/yr. Finally, it appears that the procedures involved in the development of influence diagrams may be a promising tool for quality assurance of performance assessments.
3-D printing provides a novel approach for standardization and reproducibility of freezing devices
Hu, E; Childress, William; Tiersch, Terrence R.
2017-01-01
Cryopreservation has become an important and accepted tool for long-term germplasm conservation of animals and plants. To protect genetic resources, repositories have been developed with national and international cooperation. For a repository to be effective, the genetic material submitted must be of good quality and comparable to other submissions. However, for a variety of reasons, including constraints in knowledge and available resources, cryopreservation methods for aquatic species vary widely across user groups, which reduces reproducibility and weakens quality control. Herein we describe a standardizable freezing device produced using 3-dimensional (3-D) printing and introduce the concept of network sharing to achieve aggregate high-throughput cryopreservation for aquatic species. The objectives were to: 1) adapt widely available polystyrene foam products that would be inexpensive, portable, and provide adequate work space; 2) develop a design suitable for 3-D printing that could provide multiple configurations and be inexpensive and easy to use; and 3) evaluate various configurations to attain freezing rates suitable for various common cryopreservation containers. Through this approach, identical components can be accessed globally, and we demonstrated that 3-D printers can be used to fabricate parts for standardizable freezing devices yielding relevant and reproducible cooling rates across users. With standardized devices for freezing, methods and samples can harmonize into an aggregated high-throughput pathway not currently available for aquatic species repository development. PMID:28465185
Data2Paper: A stakeholder-driven solution to data publication and citation challenges
NASA Astrophysics Data System (ADS)
Murphy, Fiona; Jefferies, Neil; Ingraham, Thomas; Murray, Hollydawn; Ranganathan, Anusha
2017-04-01
Data, and especially open data, are valuable to the community but can also be valuable to the researcher. Data papers are a clear and open way to publicize and contextualize your data in a way that is citable and aids both reproducibility and efficiency in scholarly endeavour. However, this is not yet a format that is well understood or widespread amongst the mainstream research community. As part of the Jisc Data Spring Initiative, a team of stakeholders (publishers, data repository managers, coders) has been developing a simple 'one-click' process whereby data, metadata and methods details are transferred from a data repository (via a SWORD-based API and a cloud-based helper app built on the Fedora/Hydra platform) to a relevant publisher platform for publication as a data paper. Relying on automated processes (ORCIDs to authenticate and pre-populate article templates, and the DOI infrastructure to support provenance and citation), the app seeks to drive the deposit of data in repositories and encourage the growth of data papers by removing redundant metadata entry and streamlining publisher submissions into a single consistent workflow. This poster will explain the underlying rationale and evidence gathering, development, partnerships, governance and other progress that the project has achieved so far. It will outline some key learning opportunities, challenges and drivers, and explore the next steps.
Linking Big and Small Data Across the Social, Engineering, and Earth Sciences
NASA Astrophysics Data System (ADS)
Chen, R. S.; de Sherbinin, A. M.; Levy, M. A.; Downs, R. R.
2014-12-01
The challenges of sustainable development cut across the social, health, ecological, engineering, and Earth sciences, across a wide range of spatial and temporal scales, and across the spectrum from basic to applied research and decision making. The rapidly increasing availability of data and information in digital form from a variety of data repositories, networks, and other sources provides new opportunities to link and integrate both traditional data holdings as well as emerging "big data" resources in ways that enable interdisciplinary research and facilitate the use of objective scientific data and information in society. Taking advantage of these opportunities not only requires improved technical and scientific data interoperability across disciplines, scales, and data types, but also concerted efforts to bridge gaps and barriers between key communities, institutions, and networks. Given the long time perspectives required in planning sustainable approaches to development, it is also imperative to address user requirements for long-term data continuity and stewardship by trustworthy repositories. We report here on lessons learned by CIESIN working on a range of sustainable development issues to integrate data across multiple repositories and networks. This includes CIESIN's roles in developing policy-relevant climate and environmental indicators, soil data for African agriculture, and exposure and risk measures for hazards, disease, and conflict, as well as CIESIN's participation in a range of national and international initiatives related both to sustainable development and to open data access, interoperability, and stewardship.
Simmons, Ardyth M.; Stuckless, John S.; with a Foreword by Abraham Van Luik, U.S. Department of Energy
2010-01-01
Natural analogues are defined for this report as naturally occurring or anthropogenic systems in which processes similar to those expected to occur in a nuclear waste repository are thought to have taken place over time periods of decades to millennia and on spatial scales as much as tens of kilometers. Analogues provide an important temporal and spatial dimension that cannot be tested by laboratory or field-scale experiments. Analogues provide one of the multiple lines of evidence intended to increase confidence in the safe geologic disposal of high-level radioactive waste. Although the work in this report was completed specifically for Yucca Mountain, Nevada, as the proposed geologic repository for high-level radioactive waste under the U.S. Nuclear Waste Policy Act, the applicability of the science, analyses, and interpretations is not limited to a specific site. Natural and anthropogenic analogues have provided and can continue to provide value in understanding features and processes of importance across a wide variety of topics in addressing the challenges of geologic isolation of radioactive waste and also as a contribution to scientific investigations unrelated to waste disposal. Isolation of radioactive waste at a mined geologic repository would be through a combination of natural features and engineered barriers. In this report we examine analogues to many of the various components of the Yucca Mountain system, including the preservation of materials in unsaturated environments, flow of water through unsaturated volcanic tuff, seepage into repository drifts, repository drift stability, stability and alteration of waste forms and components of the engineered barrier system, and transport of radionuclides through unsaturated and saturated rock zones.
Mont Terri Underground Rock Laboratory, Switzerland-Research Program And Key Results
NASA Astrophysics Data System (ADS)
Nussbaum, C. O.; Bossart, P. J.
2012-12-01
Argillaceous formations generally act as aquitards because of their low hydraulic conductivities. This property, together with the large retention capacity of clays for cationic contaminants and the potential for self-sealing, has brought clay formations into focus as potential host rocks for the geological disposal of radioactive waste. Excavated in the Opalinus Clay formation, the Mont Terri underground rock laboratory in the Jura Mountains of NW Switzerland is an important international test site for researching clay formations. Research is carried out in the underground facility, which is located adjacent to the security gallery of the Mont Terri motorway tunnel. Fifteen partners from European countries, USA, Canada and Japan participate in the project. The objectives of the research program are to analyze the hydrogeological, geochemical and rock mechanical properties of the Opalinus Clay, to determine the changes induced by the excavation of galleries and by heating of the rock formation, to test sealing and container emplacement techniques and to evaluate and improve suitable investigation techniques. For the safety of deep geological disposal, it is of key importance to understand the processes occurring in the undisturbed argillaceous environment, as well as the processes in a disturbed system, during the operation of the repository. The objectives are related to: 1. Understanding processes and mechanisms in undisturbed clays and 2. Experiments related to repository-induced perturbations. Experiments of the first group are dedicated to: i) Improvement of drilling and excavation technologies and sampling methods; ii) Estimation of hydrogeological, rock mechanical and geochemical parameters of the undisturbed Opalinus Clay. 
Upscaling of parameters from laboratory to in situ scale; iii) Geochemistry of porewater and natural gases; evolution of porewater over time scales; iv) Assessment of long-term hydraulic transients associated with erosion and thermal scenarios and v) Evaluation of diffusion and retention parameters for long-lived radionuclides. Experiments related to repository-induced perturbations are focused on: i) Influence of rock liner on the disposal system and the buffering potential of the host rock; ii) Self-sealing processes in the excavation damaged zone; iii) Hydro-mechanical coupled processes (e.g. stress redistributions and pore pressure evolution during excavation); iv) Thermo-hydro-mechanical-chemical coupled processes (e.g. heating of bentonite and host rock) and v) Gas-induced transport of radionuclides in porewater and along interfaces in the engineered barrier system. A third research direction is to demonstrate the feasibility of repository construction and long-term safety after repository closure. Demonstration experiments can contribute to improving the reliability of the scientific basis for the safety assessment of future geological repositories, particularly if they are performed on a large scale and with a long duration. These experiments include the construction and installation of engineered barriers on a 1:1 scale: i) Horizontal emplacement of canisters; ii) Evaluation of the corrosion of container materials; repository re-saturation; iii) Sealing of boreholes and repository access tunnels and iv) Long-term monitoring of the repository. References Bossart, P. & Thury, M. (2008): Mont Terri Rock Laboratory. Project, Programme 1996 to 2007 and Results. - Rep. Swiss Geol. Surv. 3.
Kautsky, Ulrik; Lindborg, Tobias; Valentin, Jack
2013-05-01
This is an overview of the strategy used to describe the effects of a potential release from a radioactive waste repository on human exposure and future environments. It introduces a special issue of AMBIO, in which 13 articles show ways of understanding and characterizing the future. The study relies mainly on research performed in the context of a recent safety report concerning a repository for spent nuclear fuel in Sweden (the so-called SR-Site project). The development of a good understanding of on-site processes and acquisition of site-specific data facilitated the development of new approaches for assessment of surface ecosystems. A systematic and scientifically coherent methodology utilizes the understanding of the current spatial and temporal dynamics as an analog for future conditions. We conclude that future ecosystems can be inferred from a few variables and that this multidisciplinary approach is relevant in a much wider context than radioactive waste.
SBR-Blood: systems biology repository for hematopoietic cells.
Lichtenberg, Jens; Heuston, Elisabeth F; Mishra, Tejaswini; Keller, Cheryl A; Hardison, Ross C; Bodine, David M
2016-01-04
Extensive research into hematopoiesis (the development of blood cells) over several decades has generated large sets of expression and epigenetic profiles in multiple human and mouse blood cell types. However, there is no single location to analyze how gene regulatory processes lead to different mature blood cells. We have developed a new database framework called hematopoietic Systems Biology Repository (SBR-Blood), available online at http://sbrblood.nhgri.nih.gov, which allows user-initiated analyses for cell type correlations or gene-specific behavior during differentiation using publicly available datasets for array- and sequencing-based platforms from mouse hematopoietic cells. SBR-Blood organizes information by both cell identity and by hematopoietic lineage. The validity and usability of SBR-Blood has been established through the reproduction of workflows relevant to expression data, DNA methylation, histone modifications and transcription factor occupancy profiles. Published by Oxford University Press on behalf of Nucleic Acids Research 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Energy Dissipation in Calico Hills Tuff due to Pore Collapse
NASA Astrophysics Data System (ADS)
Lockner, D. A.; Morrow, C. A.
2008-12-01
Laboratory tests indicate that the weakest portions of the Calico Hills tuff formation are at or near yield stress under in situ conditions and that the energy expended during incremental loading can be more than 90 percent irrecoverable. The Calico Hills tuff underlies the Yucca Mountain waste repository site at a depth of 400 to 500 m within the unsaturated zone. The formation is highly variable in the degree of both vitrification and zeolitization. Since 1980, a number of boreholes penetrated this formation to provide site characterization for the YM repository. In the past, standard strength measurements were conducted on core samples from the drillholes. However, a significant sampling bias occurred in that tests were preferentially conducted on highly vitrified, higher-strength samples. In fact, the most recent holes were drilled with a dry coring technique that would pulverize the weakest layers, leaving none of this material for testing. We have re-examined Calico Hills samples preserved at the YM Core Facility and selected the least vitrified examples (some cores exceeded 50 percent porosity) for mechanical testing. Three basic tests were performed: (i) hydrostatic crushing tests (to 350 MPa), (ii) standard triaxial deformation tests at constant effective confining pressure (to 70 MPa), and (iii) plane strain tests with initial conditions similar to in situ stresses. In all cases, constant pore pressure of 10 MPa was maintained using argon gas as a pore fluid and pore volume loss was monitored during deformation. The strongest samples typically failed along discrete fractures in agreement with standard Mohr-Coulomb failure. The weaker, high porosity samples, however, would fail by pure pore collapse or by a combined shear-induced compaction mechanism similar to failure mechanisms described for porous sandstones and carbonates. 
In the plane-strain experiments, energy dissipation due to pore collapse was determined for eventual input into dynamic wave calculations. These calculations will simulate ground accelerations at the YM repository due to propagation of high-amplitude compressional waves generated by scenario earthquakes. As an example, in one typical test on a sample with 43 percent starting porosity, an axial stress increase of 25 MPa resulted from 6 percent shortening and energy dissipation (due to grain crushing and pore collapse) of approximately 1.5×10^6 J/m^3. Under proper conditions, this dissipation mechanism could represent a significant absorption of radiated seismic energy and the possible shielding of the repository from extreme ground shaking.
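As a rough plausibility check on the reported figure, the dissipated energy per unit volume is the area under the axial stress-strain curve. The flat pore-collapse plateau assumed below is an idealization for illustration, not the measured curve from the study.

```python
# Energy per unit volume = integral of axial stress over axial strain.
# Assumed (illustrative) stress-strain points: an idealized flat plateau
# at the reported 25 MPa stress increase, over 6 percent shortening.
strain = [0.0, 0.01, 0.02, 0.03, 0.04, 0.05, 0.06]   # axial shortening (-)
stress = [0.0, 25e6, 25e6, 25e6, 25e6, 25e6, 25e6]   # axial stress (Pa)

def trapz(ys, xs):
    """Trapezoidal integration of ys over xs."""
    return sum((xs[i + 1] - xs[i]) * (ys[i + 1] + ys[i]) / 2
               for i in range(len(xs) - 1))

energy_density = trapz(stress, strain)   # J/m^3
print(f"{energy_density:.2e} J/m^3")
```

The result lands within the same order of magnitude as the reported ~1.5×10^6 J/m^3, which is all this back-of-the-envelope check can claim.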
Meystre, Stephane; Gouripeddi, Ramkiran; Tieder, Joel; Simmons, Jeffrey; Srivastava, Rajendu; Shah, Samir
2017-05-15
Community-acquired pneumonia is a leading cause of pediatric morbidity. Administrative data are often used to conduct comparative effectiveness research (CER) with sufficient sample sizes to enhance detection of important outcomes. However, such studies are prone to misclassification errors because of the variable accuracy of discharge diagnosis codes. The aim of this study was to develop an automated, scalable, and accurate method to determine the presence or absence of pneumonia in children using chest imaging reports. The multi-institutional PHIS+ clinical repository was developed to support pediatric CER by expanding an administrative database of children's hospitals with detailed clinical data. To develop a scalable approach to find patients with bacterial pneumonia more accurately, we developed a Natural Language Processing (NLP) application to extract relevant information from chest diagnostic imaging reports. Domain experts established a reference standard by manually annotating 282 reports to train and then test the NLP application. Findings of pleural effusion, pulmonary infiltrate, and pneumonia were automatically extracted from the reports and then used to automatically classify whether a report was consistent with bacterial pneumonia. Compared with the annotated diagnostic imaging reports reference standard, the most accurate implementation of machine learning algorithms in our NLP application extracted relevant findings with a sensitivity of .939 and a positive predictive value of .925. It classified reports with a sensitivity of .71, a positive predictive value of .86, and a specificity of .962. When compared with each of the domain experts manually annotating these reports, the NLP application achieved significantly higher sensitivity (.71 vs .527) and similar positive predictive value and specificity.
NLP-based pneumonia information extraction of pediatric diagnostic imaging reports performed better than domain experts in this pilot study. NLP is an efficient method to extract information from a large collection of imaging reports to facilitate CER. ©Stephane Meystre, Ramkiran Gouripeddi, Joel Tieder, Jeffrey Simmons, Rajendu Srivastava, Samir Shah. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.05.2017.
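The rates reported in the abstract above follow directly from a confusion matrix. The raw counts below are hypothetical, chosen only so the derived rates land near the study's published values (.71, .86, .962); the study reports rates, not counts.

```python
# Standard confusion-matrix metrics for a binary classifier.
def metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # recall: fraction of true cases found
    ppv = tp / (tp + fp)           # positive predictive value (precision)
    specificity = tn / (tn + fp)   # fraction of true negatives kept
    return sensitivity, ppv, specificity

# Hypothetical counts (illustrative only):
sens, ppv, spec = metrics(tp=71, fp=12, fn=29, tn=300)
print(round(sens, 2), round(ppv, 2), round(spec, 2))  # 0.71 0.86 0.96
```

Working from rates back to plausible counts like this is a common sanity check when comparing classifiers evaluated on different report sets.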
DOE Office of Scientific and Technical Information (OSTI.GOV)
Umari, A.M.J.; Geldon, A.; Patterson, G.
1994-12-31
Yucca Mountain, Nevada, currently is being investigated by the U.S. Geological Survey as a potential site for a high-level nuclear waste repository. Planned hydraulic-stress and tracer tests in fractured, tuffaceous rocks below the water table at Yucca Mountain will require work at depths in excess of 1,300 feet. To facilitate prototype testing of equipment and methods to be used in aquifer tests at Yucca Mountain, an analog site was selected in the foothills of the Sierra Nevada near Raymond, California. Two of nine 250- to 300-foot-deep wells drilled into fractured, granitic rocks at the Raymond site have been instrumented with packers, pressure transducers, and other equipment that will be used at Yucca Mountain. Aquifer tests conducted at the Raymond site to date have demonstrated a need to modify some of the equipment and methods conceived for use at Yucca Mountain.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tietze-Jaensch, Holger; Schneider, Stephan; Aksyutina, Yuliya
2012-07-01
The German product quality control is responsible, inter alia, for the control of two waste forms of heat-generating radioactive waste: a) homogeneous vitrified HLW and b) heterogeneous compacted hulls, end-pieces and technological metallic waste. In either case, significantly different metrology is employed at the site of the conditioning plant for the obligatory nuclide inventory declaration. To facilitate independent evaluation and checking of the accompanying documentation, numerical simulations are carried out. The physical and chemical properties of radioactive waste residues are used to assess data consistency and uncertainty margins, as well as to predict the long-term behavior of the radioactive waste. This is relevant for repository acceptance and safety considerations. Our new numerical approach follows a bottom-up simulation starting from the burn-up behavior of the fuel elements in the reactor core. The output of these burn-up calculations is then coupled with a program that simulates the material separation in the subsequent dissolution and extraction processes, normalized to the mass balance. Follow-up simulations of the separated reprocessing lines of a) the vitrification of the highly active liquid and b) the compaction of residual intermediate-active metallic hulls remaining after fuel-pellet dissolution, end-pieces and technological waste allow calculating expectation values for the various repository-relevant properties of either waste stream. The principles of the German product quality control of radioactive waste residues from spent fuel reprocessing have been introduced and explained. Namely, heat-generating homogeneous vitrified HLW and heterogeneous compacted metallic MLW have been discussed. The advantages of a complementary numerical property simulation have been made clear and examples of benefits are presented.
We have compiled a new program suite to calculate the physical and radio-chemical properties of common nuclear waste residues. The immediate benefit is the independent assessment of radioactive inventory declarations and much-facilitated product quality control of waste residues that need to be returned to Germany and subjected to German HLW-repository requirements. Wherever possible, internationally accepted standard programs are used and embedded. The innovative coupling of burn-up calculations (SCALE) with neutron and gamma transport codes (MCNPX) allows an application in the world of virtual waste properties. If-then-else scenarios of hypothetical waste material compositions and distributions provide valuable information on long-term nuclide property propagation under repository conditions over a very long time span. Benchmarking the program with real residue data demonstrates the power and remarkable accuracy of this numerical approach, boosting confidence in the aforementioned applications, namely a proven tool set for on-the-spot production quality checking, data evaluation and independent verification. Moreover, the numerical bottom-up approach helps to avoid the accumulation of spurious activities that may gradually build up in a repository from the so-called conservative or penalizing nuclide inventory declarations. The radioactive waste properties and the hydrolytic and chemical stability can be predicted. The interaction with invasive chemicals can be assessed, and propagation scenarios can be developed from reliable and sound data and HLW properties. Hence, the appropriate design of a future HLW repository can be based upon predictable and quality-assured waste characteristics. (authors)
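The long-term inventory propagation described above can be sketched, in its very simplest form, as single-step exponential decay of a declared activity inventory. This is a minimal illustration under stated assumptions (approximate half-lives, no decay chains or ingrowth), not the authors' SCALE/MCNPX coupling.

```python
import math

# Approximate half-lives in years (assumed values for illustration):
HALF_LIVES_YR = {"Cs-137": 30.1, "Sr-90": 28.8, "I-129": 1.57e7}

def decayed_inventory(inventory_bq, years):
    """Decay each nuclide's activity (Bq) independently over `years`.
    Ignores decay chains and daughter ingrowth."""
    out = {}
    for nuclide, activity in inventory_bq.items():
        lam = math.log(2) / HALF_LIVES_YR[nuclide]   # decay constant, 1/yr
        out[nuclide] = activity * math.exp(-lam * years)
    return out

# Hypothetical declared inventory of one waste package:
start = {"Cs-137": 1e12, "Sr-90": 8e11, "I-129": 5e6}
after_300yr = decayed_inventory(start, 300)
print({k: f"{v:.2e}" for k, v in after_300yr.items()})
```

Even this toy model shows the repository-relevant pattern: the short-lived fission products drop by roughly three orders of magnitude within a few centuries, while long-lived I-129 (which dominates the SITE-94 dose estimates quoted earlier in this listing) is essentially unchanged.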
NASA Technical Reports Server (NTRS)
Milroy, Audrey; Hale, Joe
2006-01-01
NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers information on the model's fidelity, credibility, and quality, including the verification, validation and accreditation information. The NASA MSRR will be implemented leveraging M&S industry best practices. This presentation will discuss the requirements that will enable NASA to capture and make available the "meta data" or "simulation biography" data associated with a model. The presentation will also describe the requirements that drive how NASA will collect and document relevant information for models or suites of models in order to facilitate use and reuse of relevant models and provide visibility across NASA organizations and the larger M&S community.
Control of stacking loads in final waste disposal according to the borehole technique
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feuser, W.; Barnert, E.; Vijgen, H.
1996-12-01
The semihydrostatic model has been developed in order to assess the mechanical loads acting on heat-generating ILW(Q) and HTGR fuel element waste packages to be emplaced in vertical boreholes according to the borehole technique in underground rock salt formations. For the experimental validation of the theory, laboratory test stands reduced in scale are set up to simulate the bottom section of a repository borehole. A comparison of the measurement results with the data computed by the model, a correlation between the test stand results, and a systematic determination of material-typical crushed salt parameters in a separate research project will serve to derive a set of characteristic equations enabling a description of real conditions in a future repository.
SINGLE HEATER TEST FINAL REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.B. Cho
The Single Heater Test is the first of the in-situ thermal tests conducted by the U.S. Department of Energy as part of its program of characterizing Yucca Mountain in Nevada as the potential site for a proposed deep geologic repository for the disposal of spent nuclear fuel and high-level nuclear waste. The Site Characterization Plan (DOE 1988) contained an extensive plan of in-situ thermal tests aimed at understanding specific aspects of the response of the local rock-mass around the potential repository to the heat from the radioactive decay of the emplaced waste. With the refocusing of the Site Characterization Plan by the ''Civilian Radioactive Waste Management Program Plan'' (DOE 1994), a consolidated thermal testing program emerged by 1995 as documented in the reports ''In-Situ Thermal Testing Program Strategy'' (DOE 1995) and ''Updated In-Situ Thermal Testing Program Strategy'' (CRWMS M&O 1997a). The concept of the Single Heater Test took shape in the summer of 1995, and detailed planning and design of the test started at the beginning of fiscal year 1996. The overall objective of the Single Heater Test was to gain an understanding of the coupled thermal, mechanical, hydrological, and chemical processes that are anticipated to occur in the local rock-mass in the potential repository as a result of heat from radioactive decay of the emplaced waste. This included making a priori predictions of the test results using existing models and subsequently refining or modifying the models on the basis of comparative and interpretive analyses of the measurements and predictions. A second, no less important, objective was to try out, in a full-scale field setting, the various instruments and equipment to be employed in the future on a much larger, more complex thermal test of longer duration, such as the Drift Scale Test.
This ''shake down'' or trial aspect of the Single Heater Test applied not just to the hardware, but also to the teamwork and cooperation between multiple organizations performing their part in the test.
Solar Sail Propulsion Technology Readiness Level Database
NASA Technical Reports Server (NTRS)
Adams, Charles L.
2004-01-01
The NASA In-Space Propulsion Technology (ISPT) Projects Office has been sponsoring two solar sail system design and development hardware demonstration activities over the past 20 months. Able Engineering Company (AEC) of Goleta, CA is leading one team and L'Garde, Inc. of Tustin, CA is leading the other team. Component, subsystem, and system fabrication and testing have been completed successfully. The goal of these activities is to advance the technology readiness level (TRL) of solar sail propulsion from 3 towards 6 by 2006. These activities will culminate in the deployment and testing of 20-meter solar sail system ground demonstration hardware in the 30-meter diameter thermal-vacuum chamber at NASA Glenn's Plum Brook Station in 2005. This paper will describe the features of a computer database system that documents the results of the solar sail development activities to date. Illustrations of the hardware components and systems, test results, analytical models, relevant space environment definitions, and the current TRL assessment, as stored and manipulated within the database, are presented. This database could serve as a central repository for all data related to the advancement of solar sail technology sponsored by the ISPT, providing an up-to-date assessment of the TRL of this technology. Current plans are to eventually make the database available to the solar sail community through the Space Transportation Information Network (STIN).
A Hybrid Swarm Intelligence Algorithm for Intrusion Detection Using Significant Features.
Amudha, P; Karthik, S; Sivakumari, S
2015-01-01
Intrusion detection has become a central part of network security owing to the huge number of attacks that affect computers, a consequence of the extensive growth of internet connectivity and accessibility to information systems worldwide. To deal with this problem, this paper proposes a hybrid algorithm that integrates a Modified Artificial Bee Colony (MABC) with Enhanced Particle Swarm Optimization (EPSO) to address the intrusion detection problem. The two algorithms are combined to obtain better optimization results, and classification accuracies are obtained by the 10-fold cross-validation method. The purpose of this paper is to select the most relevant features that can represent the pattern of the network traffic and to test their effect on the success of the proposed hybrid classification algorithm. To investigate the performance of the proposed method, the KDDCup'99 intrusion detection benchmark dataset from the UCI Machine Learning Repository is used. The performance of the proposed method is compared with other machine learning algorithms and found to be significantly different.
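The accuracies reported above come from 10-fold cross-validation; the fold-splitting step can be sketched without any ML library (illustrative only, not the authors' MABC/EPSO code):

```python
import random

def kfold_indices(n_samples: int, k: int = 10, seed: int = 0):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)          # reproducible shuffle
    folds = [idx[i::k] for i in range(k)]     # round-robin split; sizes differ by <= 1
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test
```

Each sample appears in exactly one test fold, so the k accuracy estimates cover the whole dataset once.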
A Python library for FAIRer access and deposition to the Metabolomics Workbench Data Repository.
Smelter, Andrey; Moseley, Hunter N B
2018-01-01
The Metabolomics Workbench Data Repository is a public repository of mass spectrometry and nuclear magnetic resonance data and metadata derived from a wide variety of metabolomics studies. The data and metadata for each study are deposited, stored, and accessed via files in the domain-specific 'mwTab' flat file format. In order to improve the accessibility, reusability, and interoperability of the data and metadata stored in 'mwTab' formatted files, we implemented a Python library and package. This Python package, named 'mwtab', is a parser for the domain-specific 'mwTab' flat file format, which provides facilities for reading, accessing, and writing 'mwTab' formatted files. Furthermore, the package provides facilities to validate both the format and required metadata elements of a given 'mwTab' formatted file. In order to develop the 'mwtab' package we used the official 'mwTab' format specification. We used Git version control along with the Python unit-testing framework, as well as a continuous integration service to run those tests on multiple versions of Python. Package documentation was developed using the Sphinx documentation generator. The 'mwtab' package provides both Python programmatic library interfaces and command-line interfaces for reading, writing, and validating 'mwTab' formatted files. Data and associated metadata are stored within Python dictionary- and list-based data structures, enabling straightforward, 'pythonic' access and manipulation of data and metadata. Also, the package provides facilities to convert 'mwTab' files into a JSON formatted equivalent, enabling easy reusability of the data by all modern programming languages that implement JSON parsers. The 'mwtab' package implements its metadata validation functionality based on a pre-defined JSON schema that can be easily specialized for specific types of metabolomics studies.
The library also provides a command-line interface for interconversion between 'mwTab' and JSONized formats in raw text and a variety of compressed binary file formats. The 'mwtab' package is an easy-to-use Python package that provides FAIRer utilization of the Metabolomics Workbench Data Repository. The source code is freely available on GitHub and via the Python Package Index. Documentation includes a 'User Guide', 'Tutorial', and 'API Reference'. The GitHub repository also provides 'mwtab' package unit-tests via a continuous integration service.
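The flat-file-to-JSON conversion idea can be illustrated with the standard library alone. The toy section/key/value format below is hypothetical and deliberately simplified; it is not the real mwTab specification nor the 'mwtab' package API:

```python
import json

def flat_to_json(text: str) -> str:
    """Convert a toy section-based KEY<TAB>VALUE flat file (loosely inspired by
    formats like mwTab; NOT the real specification) into a JSON string."""
    record, section = {}, None
    for line in text.splitlines():
        if not line.strip():
            continue
        if line.startswith("#"):                 # e.g. "#STUDY" opens a new section
            section = line[1:].strip()
            record[section] = {}
        elif section is not None and "\t" in line:
            key, value = line.split("\t", 1)     # split on the first tab only
            record[section][key.strip()] = value.strip()
    return json.dumps(record, indent=2)

example = "#STUDY\nSTUDY_TITLE\tDemo\n#ANALYSIS\nTYPE\tMS\n"
```

Once the record is a plain dictionary, any language with a JSON parser can consume it, which is the reusability point the abstract makes.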
Grubb, Stephen C.; Maddatu, Terry P.; Bult, Carol J.; Bogue, Molly A.
2009-01-01
The Mouse Phenome Database (MPD; http://www.jax.org/phenome) is an open source, web-based repository of phenotypic and genotypic data on commonly used and genetically diverse inbred strains of mice and their derivatives. MPD is also a facility for query, analysis and in silico hypothesis testing. Currently MPD contains about 1400 phenotypic measurements contributed by research teams worldwide, including phenotypes relevant to human health such as cancer susceptibility, aging, obesity, susceptibility to infectious diseases, atherosclerosis, blood disorders and neurosensory disorders. Electronic access to centralized strain data enables investigators to select optimal strains for many systems-based research applications, including physiological studies, drug and toxicology testing, modeling disease processes and complex trait analysis. The ability to select strains for specific research applications by accessing existing phenotype data can bypass the need to (re)characterize strains, precluding major investments of time and resources. This functionality, in turn, accelerates research and leverages existing community resources. Since our last NAR reporting in 2007, MPD has added more community-contributed data covering more phenotypic domains and implemented several new tools and features, including a new interactive Tool Demo available through the MPD homepage (quick link: http://phenome.jax.org/phenome/trytools). PMID:18987003
Imbalanced target prediction with pattern discovery on clinical data repositories.
Chan, Tak-Ming; Li, Yuxi; Chiau, Choo-Chiap; Zhu, Jane; Jiang, Jie; Huo, Yong
2017-04-20
Clinical data repositories (CDR) have great potential to improve outcome prediction and risk modeling. However, most clinical studies require careful study design, dedicated data collection efforts, and sophisticated modeling techniques before a hypothesis can be tested. We aim to bridge this gap, so that clinical domain users can perform first-hand prediction on existing repository data without complicated handling, and obtain insightful patterns of imbalanced targets for a formal study before it is conducted. We specifically target interpretability for domain users, where the model can be conveniently explained and applied in clinical practice. We propose an interpretable pattern model which is noise (missing) tolerant for practice data. To address the challenge of imbalanced targets of interest in clinical research, e.g., deaths less than a few percent, the geometric mean of sensitivity and specificity (G-mean) optimization criterion is employed, with which a simple but effective heuristic algorithm is developed. We compared pattern discovery to clinically interpretable methods on two retrospective clinical datasets. They contain 14.9% deaths within 1 year in the thoracic dataset and 9.1% deaths in the cardiac dataset, respectively. In spite of the imbalance challenge evident for other methods, pattern discovery consistently shows competitive cross-validated prediction performance. Compared to logistic regression, Naïve Bayes, and decision tree, pattern discovery achieves statistically significant (p-values < 0.01, Wilcoxon signed rank test) favorable averaged testing G-means and F1-scores (harmonic mean of precision and sensitivity). Without requiring sophisticated technical processing of data and tweaking, the prediction performance of pattern discovery is consistently comparable to the best achievable performance. Pattern discovery has been demonstrated to be robust and valuable for target prediction on existing clinical data repositories with imbalance and noise.
The prediction results and interpretable patterns can provide insights in an agile and inexpensive way for the potential formal studies.
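The G-mean and F1 criteria used in this study are simple functions of the confusion matrix; a minimal sketch (not the authors' pattern-discovery heuristic itself):

```python
import math

def gmean_and_f1(tp: int, fp: int, tn: int, fn: int):
    """G-mean = sqrt(sensitivity * specificity); F1 = harmonic mean of precision
    and sensitivity. Both are robust summaries for imbalanced targets."""
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    gmean = math.sqrt(sensitivity * specificity)
    f1 = (2 * precision * sensitivity / (precision + sensitivity)
          if precision + sensitivity else 0.0)
    return gmean, f1
```

On a dataset with few positives, plain accuracy can look high while G-mean collapses to zero, which is why the imbalanced-target literature prefers it.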
DOE Office of Scientific and Technical Information (OSTI.GOV)
George, James T.; Sobolik, Steven R.; Lee, Moo Y.
The study described in this report involves heated and unheated pressurized slot testing to determine thermo-mechanical properties of the Tptpll (Tertiary, Paintbrush, Topopah Spring Tuff Formation, crystal poor, lower lithophysal) and Tptpul (upper lithophysal) lithostratigraphic units at Yucca Mountain, Nevada. A large volume fraction of the proposed repository at Yucca Mountain may reside in the Tptpll lithostratigraphic unit. This unit is characterized by voids, or lithophysae, which range in size from centimeters to meters, making a field program an effective method of measuring bulk thermal-mechanical rock properties (thermal expansion, rock mass modulus, compressive strength, time-dependent deformation) over a range of temperature and rock conditions. The field tests outlined in this report provide data for the determination of thermo-mechanical properties of this unit. Rock-mass response data collected during this field test will reduce the uncertainty in key thermal-mechanical modeling parameters (rock-mass modulus, strength and thermal expansion) for the Tptpll lithostratigraphic unit, and provide a basis for understanding thermal-mechanical behavior of this unit. The measurements will be used to evaluate numerical models of the thermal-mechanical response of the repository. These numerical models are then used to predict pre- and post-closure repository response. ACKNOWLEDGEMENTS The authors would like to thank David Bronowski, Ronnie Taylor, Ray E. Finley, Cliff Howard, Michael Schuhen (all SNL) and Fred Homuth (LANL) for their work in the planning and implementation of the tests described in this report. This is a reprint of SAND2004-2703, which was originally printed in July 2004. At that time, it was printed for a restricted audience. It has now been approved for unlimited release.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, L.; Samper, J.; Montenegro, L.
The performance assessment of a geological repository for radioactive waste requires quantifying the geochemical evolution of the bentonite engineered barrier. This barrier will be exposed to coupled thermal (T), hydrodynamic (H), mechanical (M) and chemical (C) processes. This paper presents a coupled THC model of the FEBEX (Full-scale Engineered Barrier EXperiment) in situ test which accounts for bentonite swelling and chemical and thermal osmosis. Model results attest to the relevance of thermal osmosis and bentonite swelling for the geochemical evolution of the bentonite barrier, while chemical osmosis is found to be almost irrelevant. The model has been tested with data collected after the dismantling of heater 1 of the in situ test. The model reproduces reasonably well the measured temperature, relative humidity, water content and inferred geochemical data. However, it fails to mimic the solute concentrations at the heater-bentonite and bentonite-granite interfaces because the model does not account for the volume change of bentonite, the CO{sub 2}(g) degassing and the transport of vapor from the bentonite into the granite. The inferred HCO{sub 3}{sup -} and pH data cannot be explained solely by solute transport, calcite dissolution and protonation/deprotonation by surface complexation, suggesting that such data may be affected also by other reactions.
NASA Astrophysics Data System (ADS)
Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen
Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture, which allows any Grid Application Repository (GAR) to be connected to the system independent of its underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services and other OAI (Open Archive Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VM) and therefore they can be run in their native environments and can easily be deployed on virtualized infrastructures, allowing interoperability with new-generation technologies such as cloud computing, application-on-demand, automatic service/application deployments and automatic VM generation.
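The generic-architecture idea behind GAMRS can be sketched as an adapter pattern: each heterogeneous back-end repository sits behind a uniform facade, and the meta-repository aggregates across them. All class names here are hypothetical illustrations, not GAMRS's actual design:

```python
from abc import ABC, abstractmethod

class RepositoryAdapter(ABC):
    """Uniform facade over one back-end application repository."""
    @abstractmethod
    def list_applications(self) -> list: ...

class XmlRepoAdapter(RepositoryAdapter):
    """Hypothetical adapter for an XML-based repository."""
    def __init__(self, apps): self._apps = apps
    def list_applications(self): return sorted(self._apps)

class RestRepoAdapter(RepositoryAdapter):
    """Hypothetical adapter for a REST-based repository."""
    def __init__(self, apps): self._apps = apps
    def list_applications(self): return sorted(self._apps)

class MetaRepository:
    """Presents applications from all connected repositories uniformly."""
    def __init__(self, adapters): self.adapters = adapters
    def all_applications(self):
        seen = set()
        for adapter in self.adapters:
            seen.update(adapter.list_applications())
        return sorted(seen)
```

New repository technologies are added by writing one adapter, leaving the aggregation layer and its clients untouched.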
AN OPEN-SOURCE COMMUNITY WEB SITE TO SUPPORT GROUND-WATER MODEL TESTING
A community wiki web site has been created as a resource to support ground-water model development and testing. The Groundwater Gourmet wiki is a repository for user-supplied analytical and numerical recipes, how-to's, and examples. Members are encouraged to submit analyti...
10 CFR 60.131 - General design criteria for the geologic repository operations area.
Code of Federal Regulations, 2012 CFR
2012-01-01
... operating systems, including alarm systems, important to safety. (g) Inspection, testing, and maintenance... radioactivity areas; and (6) A radiation alarm system to warn of significant increases in radiation levels... system shall be designed with provisions for calibration and for testing its operability. (b) Protection...
10 CFR 60.131 - General design criteria for the geologic repository operations area.
Code of Federal Regulations, 2014 CFR
2014-01-01
... operating systems, including alarm systems, important to safety. (g) Inspection, testing, and maintenance... radioactivity areas; and (6) A radiation alarm system to warn of significant increases in radiation levels... system shall be designed with provisions for calibration and for testing its operability. (b) Protection...
10 CFR 60.131 - General design criteria for the geologic repository operations area.
Code of Federal Regulations, 2011 CFR
2011-01-01
... operating systems, including alarm systems, important to safety. (g) Inspection, testing, and maintenance... radioactivity areas; and (6) A radiation alarm system to warn of significant increases in radiation levels... system shall be designed with provisions for calibration and for testing its operability. (b) Protection...
10 CFR 60.131 - General design criteria for the geologic repository operations area.
Code of Federal Regulations, 2013 CFR
2013-01-01
... operating systems, including alarm systems, important to safety. (g) Inspection, testing, and maintenance... radioactivity areas; and (6) A radiation alarm system to warn of significant increases in radiation levels... system shall be designed with provisions for calibration and for testing its operability. (b) Protection...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cotte, F.P.; Doughty, C.; Birkholzer, J.
2010-11-01
The ability to reliably predict flow and transport in fractured porous rock is an essential condition for performance evaluation of geologic (underground) nuclear waste repositories. In this report, a suite of programs (TRIPOLY code) for calculating and analyzing flow and transport in two-dimensional fracture-matrix systems is used to model single-well injection-withdrawal (SWIW) tracer tests. The SWIW test, a tracer test using one well, is proposed as a useful means of collecting data for site characterization, as well as estimating parameters relevant to tracer diffusion and sorption. After some specific code adaptations, we numerically generated a complex fracture-matrix system for computation of steady-state flow and tracer advection and dispersion in the fracture network, along with solute exchange processes between the fractures and the porous matrix. We then conducted simulations for a hypothetical but workable SWIW test design and completed parameter sensitivity studies on three physical parameters of the rock matrix - namely porosity, diffusion coefficient, and retardation coefficient - in order to investigate their impact on the fracture-matrix solute exchange process. Hydraulic fracturing, or hydrofracking, is also modeled in this study, in two different ways: (1) by increasing the hydraulic aperture for flow in existing fractures and (2) by adding a new set of fractures to the field. The results of all these different tests are analyzed by studying the population of matrix blocks, the tracer spatial distribution, and the breakthrough curves (BTCs) obtained, while performing mass-balance checks and taking care to avoid numerical errors.
This study clearly demonstrates the importance of matrix effects in the solute transport process, with the sensitivity studies illustrating the increased importance of the matrix in providing a retardation mechanism for radionuclides as matrix porosity, diffusion coefficient, or retardation coefficient increase. Interestingly, model results before and after hydrofracking are insensitive to adding more fractures, while slightly more sensitive to aperture increase, making SWIW tests a possible means of discriminating between these two potential hydrofracking effects. Finally, we investigate the possibility of inferring relevant information regarding the fracture-matrix system physical parameters from the BTCs obtained during SWIW testing.
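For context on breakthrough curves, the classical one-dimensional advection-dispersion solution for continuous injection (the leading term of the Ogata-Banks solution, a textbook benchmark rather than the TRIPOLY numerical model used in the study) can be computed with the standard library:

```python
import math

def btc_concentration(x: float, t: float, v: float, d: float) -> float:
    """Relative concentration C/C0 at distance x and time t for 1-D
    advection-dispersion with continuous injection, leading term of the
    Ogata-Banks solution: C/C0 = 0.5 * erfc((x - v*t) / (2*sqrt(D*t)))."""
    if t <= 0:
        return 0.0  # tracer has not yet been injected
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(d * t)))
```

At a fixed observation distance the curve rises monotonically from 0 toward 1 as the tracer front arrives and passes, the basic shape against which matrix-diffusion tailing is diagnosed.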
48 CFR 227.7207 - Contractor data repositories.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...
48 CFR 227.7207 - Contractor data repositories.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...
48 CFR 227.7207 - Contractor data repositories.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...
48 CFR 227.7207 - Contractor data repositories.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...
40 CFR 124.33 - Information repository.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Information repository. 124.33 Section... FOR DECISIONMAKING Specific Procedures Applicable to RCRA Permits § 124.33 Information repository. (a... basis, for an information repository. When assessing the need for an information repository, the...
10 CFR 60.130 - General considerations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository Operations Area § 60.130 General... for a high-level radioactive waste repository at a geologic repository operations area, and an... geologic repository operations area, must include the principal design criteria for a proposed facility...
Godah, Mohammad W; Abdul Khalek, Rima A; Kilzar, Lama; Zeid, Hiba; Nahlawi, Acile; Lopes, Luciane Cruz; Darzi, Andrea J; Schünemann, Holger J; Akl, Elie A
2016-12-01
Low- and middle-income countries adapt World Health Organization (WHO) guidelines instead of de novo development for financial, epidemiologic, sociopolitical, cultural, organizational, and other reasons. To systematically evaluate reported processes used in the adaptation of WHO guidelines for human immunodeficiency virus (HIV) and tuberculosis (TB). We searched three online databases/repositories: the United States Agency for International Development (USAID) AIDS Support and Technical Resources - Sector One program (AIDSTAR-One) National Treatment Database; the AIDSspace Guideline Repository; and the WHO Database of national HIV and TB guidelines. We assessed the rigor and quality of reported adaptation methodology using the ADAPTE process as benchmark. Of 170 eligible guidelines, only 32 (19%) reported documentation on the adaptation process. The median and interquartile range of the number of ADAPTE steps fulfilled by the eligible guidelines were 11.5 (10, 13.5) (out of 23 steps). The number of guidelines (out of 32) fulfilling each ADAPTE step was 18 (interquartile range, 5-27). Seventeen of 32 guidelines (53%) met all steps relevant to the setup phase, whereas none met all steps relevant to the adaptation phase. The number of well-documented adaptation methodologies in national HIV and/or TB guidelines is very low. There is a need for the use of a standardized and systematic framework for guideline adaptation and improved reporting of processes used. Copyright © 2016 Elsevier Inc. All rights reserved.
48 CFR 227.7108 - Contractor data repositories.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Technical Data 227.7108 Contractor data repositories. (a) Contractor data repositories may be established... procedures for protecting technical data delivered to or stored at the repository from unauthorized release... disclosure of technical data from the repository to third parties consistent with the Government's rights in...
Coupled Heat and Moisture Transport Simulation on the Re-saturation of Engineered Clay Barrier
NASA Astrophysics Data System (ADS)
Huang, W. H.; Chuang, Y. F.
2014-12-01
The engineered clay barrier plays a major role in the isolation of radioactive wastes in an underground repository. This paper investigates the resaturation processes of the clay barrier, with emphasis on the coupling effects of heat and moisture during the intrusion of groundwater into the repository. A reference bentonite and a locally available clay were adopted in the laboratory program. Soil suction of clay specimens was measured by psychrometers embedded in the specimens and by the vapor equilibrium technique conducted at varying temperatures, so as to determine the soil water characteristic curves of the two clays at different temperatures. Water uptake tests were conducted on clay specimens compacted at various densities to simulate the intrusion of groundwater into the clay barrier. Using the soil water characteristic curve, an integration scheme was introduced to estimate the hydraulic conductivity of unsaturated clay. It was found that soil suction decreases as temperature increases, resulting in a reduction in water retention capability. The finite element method was then employed to carry out the numerical simulation of the saturation process in the near field of a repository. Results of the numerical simulation were validated using the degree-of-saturation profiles obtained from the water uptake tests on the clays. The numerical scheme was then extended to establish a model simulating the resaturation process after the closure of a repository. Finally, the model was used to evaluate the effect of clay barrier thickness on the time required for groundwater to penetrate the clay barrier and approach saturation. Because clay suction and thermal conductivity vary with the temperature of the barrier material, the calculated temperature field is reduced when these hydro-properties are incorporated in the calculations.
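The link between the soil water characteristic curve and unsaturated hydraulic conductivity that the abstract describes is commonly expressed with the van Genuchten-Mualem closed forms. The sketch below uses those standard textbook equations; it is not necessarily the authors' integration scheme:

```python
def vg_saturation(h: float, alpha: float, n: float) -> float:
    """van Genuchten effective saturation Se(h) = [1 + (alpha*h)^n]^(-m),
    with m = 1 - 1/n and h the suction head (h <= 0 means fully saturated)."""
    if h <= 0:
        return 1.0
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * h) ** n) ** (-m)

def mualem_conductivity(se: float, k_sat: float, n: float) -> float:
    """Mualem model: K(Se) = Ks * Se^0.5 * [1 - (1 - Se^(1/m))^m]^2."""
    m = 1.0 - 1.0 / n
    return k_sat * se ** 0.5 * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2
```

As suction grows, Se and hence K drop by orders of magnitude, which is why resaturation of a compacted bentonite buffer is so slow.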
Development of the performance confirmation program at YUCCA mountain, nevada
LeCain, G.D.; Barr, D.; Weaver, D.; Snell, R.; Goodin, S.W.; Hansen, F.D.
2006-01-01
The Yucca Mountain Performance Confirmation program consists of tests, monitoring activities, experiments, and analyses to evaluate the adequacy of assumptions, data, and analyses that form the basis of the conceptual and numerical models of flow and transport associated with a proposed radioactive waste repository at Yucca Mountain, Nevada. The Performance Confirmation program uses an eight-stage risk-informed, performance-based approach. Selection of the Performance Confirmation activities for inclusion in the Performance Confirmation program was done using a risk-informed, performance-based decision analysis. The result of this analysis was a Performance Confirmation base portfolio that consists of 20 activities. The 20 Performance Confirmation activities include geologic, hydrologic, and construction/engineering testing. Some of the activities began during site characterization, and others will begin during construction, or post emplacement, and continue until repository closure.
Duda, Stephanie; Fahim, Christine; Szatmari, Peter; Bennett, Kathryn
2017-07-01
Innovative strategies that facilitate the use of high quality practice guidelines (PG) are needed. Accordingly, repositories designed to simplify access to PGs have been proposed as a critical component of the network of linked interventions needed to drive increased PG implementation. The National Guideline Clearinghouse (NGC) is a free, international online repository. We investigated whether it is a trustworthy source of child and youth anxiety and depression PGs. English language PGs published between January 2009 and February 2016 relevant to anxiety or depression in children and adolescents (≤ 18 years of age) were eligible. Two trained raters assessed PG quality using Appraisal of Guidelines for Research and Evaluation (AGREE II). Scores on at least three AGREE II domains (stakeholder involvement, rigor of development, and editorial independence) were used to designate PGs as: i) minimum quality (≥ 50%); and ii) high quality (≥ 70%). Eight eligible PGs were identified (depression, n=6; anxiety and depression, n=1; social anxiety disorder, n=1). Four of eight PGs met minimum quality criteria; three of four met high quality criteria. At present, NGC users without the time and special skills required to evaluate PG quality may unknowingly choose flawed PGs to guide decisions about child and youth anxiety and depression. The recent NGC decision to explore the inclusion of PG quality profiles based on Institute of Medicine standards provides needed leadership that can strengthen PG repositories, prevent harm and wasted resources, and build PG developer capacity.
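The minimum- and high-quality designations above rest on AGREE II scaled domain scores. The standard scaling formula from the AGREE II user's manual (items rated 1-7 per appraiser; scaled score = (obtained - minimum possible) / (maximum possible - minimum possible) x 100) is straightforward to sketch:

```python
def agree2_domain_score(ratings) -> float:
    """Scaled AGREE II domain score (%) from per-appraiser item ratings (1-7).
    ratings[i][j] = rating given to item j by appraiser i."""
    n_appraisers = len(ratings)
    n_items = len(ratings[0])
    obtained = sum(sum(r) for r in ratings)
    min_possible = 1 * n_items * n_appraisers   # every item rated 1
    max_possible = 7 * n_items * n_appraisers   # every item rated 7
    return 100.0 * (obtained - min_possible) / (max_possible - min_possible)
```

A guideline then meets the study's minimum-quality bar when the relevant domains score at least 50%, and the high-quality bar at 70%.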
Database Resources of the BIG Data Center in 2018
Xu, Xingjian; Hao, Lili; Zhu, Junwei; Tang, Bixia; Zhou, Qing; Song, Fuhai; Chen, Tingting; Zhang, Sisi; Dong, Lili; Lan, Li; Wang, Yanqing; Sang, Jian; Hao, Lili; Liang, Fang; Cao, Jiabao; Liu, Fang; Liu, Lin; Wang, Fan; Ma, Yingke; Xu, Xingjian; Zhang, Lijuan; Chen, Meili; Tian, Dongmei; Li, Cuiping; Dong, Lili; Du, Zhenglin; Yuan, Na; Zeng, Jingyao; Zhang, Zhewen; Wang, Jinyue; Shi, Shuo; Zhang, Yadong; Pan, Mengyu; Tang, Bixia; Zou, Dong; Song, Shuhui; Sang, Jian; Xia, Lin; Wang, Zhennan; Li, Man; Cao, Jiabao; Niu, Guangyi; Zhang, Yang; Sheng, Xin; Lu, Mingming; Wang, Qi; Xiao, Jingfa; Zou, Dong; Wang, Fan; Hao, Lili; Liang, Fang; Li, Mengwei; Sun, Shixiang; Zou, Dong; Li, Rujiao; Yu, Chunlei; Wang, Guangyu; Sang, Jian; Liu, Lin; Li, Mengwei; Li, Man; Niu, Guangyi; Cao, Jiabao; Sun, Shixiang; Xia, Lin; Yin, Hongyan; Zou, Dong; Xu, Xingjian; Ma, Lina; Chen, Huanxin; Sun, Yubin; Yu, Lei; Zhai, Shuang; Sun, Mingyuan; Zhang, Zhang; Zhao, Wenming; Xiao, Jingfa; Bao, Yiming; Song, Shuhui; Hao, Lili; Li, Rujiao; Ma, Lina; Sang, Jian; Wang, Yanqing; Tang, Bixia; Zou, Dong; Wang, Fan
2018-01-01
Abstract The BIG Data Center at Beijing Institute of Genomics (BIG) of the Chinese Academy of Sciences provides freely open access to a suite of database resources in support of worldwide research activities in both academia and industry. With the vast amounts of omics data generated at ever-greater scales and rates, the BIG Data Center is continually expanding, updating and enriching its core database resources through big-data integration and value-added curation, including BioCode (a repository archiving bioinformatics tool codes), BioProject (a biological project library), BioSample (a biological sample library), Genome Sequence Archive (GSA, a data repository for archiving raw sequence reads), Genome Warehouse (GWH, a centralized resource housing genome-scale data), Genome Variation Map (GVM, a public repository of genome variations), Gene Expression Nebulas (GEN, a database of gene expression profiles based on RNA-Seq data), Methylation Bank (MethBank, an integrated databank of DNA methylomes), and Science Wikis (a series of biological knowledge wikis for community annotations). In addition, three featured web services are provided, viz., BIG Search (search as a service; a scalable inter-domain text search engine), BIG SSO (single sign-on as a service; a user access control system to gain access to multiple independent systems with a single ID and password) and Gsub (submission as a service; a unified submission service for all relevant resources). All of these resources are publicly accessible through the home page of the BIG Data Center at http://bigd.big.ac.cn. PMID:29036542
Wu, Huiqun; Wei, Yufang; Shang, Yujuan; Shi, Wei; Wang, Lei; Li, Jingjing; Sang, Aimin; Shi, Lili; Jiang, Kui; Dong, Jiancheng
2018-06-06
Type 2 diabetes mellitus (T2DM) is a common chronic disease, and the fragmented data collected through separate vendors makes continuous management of DM patients difficult. The lack of standardization of these fragmented patient data also hinders further phenotyping based on the diabetic data. Traditional T2DM data repositories only support data collection from T2DM patients, lack phenotyping ability, and rely on standalone database designs, limiting the secondary use of these valuable data. To solve these issues, we proposed a novel, standards-based T2DM data repository framework. The repository can integrate data from various sources and serve as a standardized record for further data transfer and integration. Phenotyping was conducted according to clinical guidelines with a KNIME workflow. To evaluate the phenotyping performance of the proposed system, data were collected from the local community by healthcare providers and then tested using the algorithms. The results indicated that the proposed system could detect DR cases with an average accuracy of about 82.8%, and showed promise for addressing fragmented data. The proposed system has integration and phenotyping abilities that could be used for diabetes research in future studies.
Review of waste package verification tests. Semiannual report, October 1982-March 1983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soo, P.
1983-08-01
The current study is part of an ongoing task to specify tests that may be used to verify that engineered waste package/repository systems comply with NRC radionuclide containment and controlled release performance objectives. Work covered in this report analyzes verification tests for borosilicate glass waste forms and bentonite- and zeolite-based packing materials (discrete backfills). 76 references.
17 CFR 49.12 - Swap data repository recordkeeping requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 17 Commodity and Securities Exchanges 1 2012-04-01 2012-04-01 false Swap data repository... COMMISSION SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of part...
17 CFR 49.12 - Swap data repository recordkeeping requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 17 Commodity and Securities Exchanges 1 2013-04-01 2013-04-01 false Swap data repository... COMMISSION SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of part...
17 CFR 49.12 - Swap data repository recordkeeping requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 17 Commodity and Securities Exchanges 2 2014-04-01 2014-04-01 false Swap data repository... COMMISSION (CONTINUED) SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of...
Code of Federal Regulations, 2010 CFR
2010-01-01
... geologic repository operations area. 63.112 Section 63.112 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Technical... repository operations area. The preclosure safety analysis of the geologic repository operations area must...
Managing and Evaluating Digital Repositories
ERIC Educational Resources Information Center
Zuccala, Alesia; Oppenheim, Charles; Dhiensa, Rajveen
2008-01-01
Introduction: We examine the role of the digital repository manager, discuss the future of repository management and evaluation and suggest that library and information science schools develop new repository management curricula. Method: Face-to-face interviews were carried out with managers of five different types of repositories and a Web-based…
jPOSTrepo: an international standard data repository for proteomes
Okuda, Shujiro; Watanabe, Yu; Moriya, Yuki; Kawano, Shin; Yamamoto, Tadashi; Matsumoto, Masaki; Takami, Tomoyo; Kobayashi, Daiki; Araki, Norie; Yoshizawa, Akiyasu C.; Tabata, Tsuyoshi; Sugiyama, Naoyuki; Goto, Susumu; Ishihama, Yasushi
2017-01-01
Major advancements have recently been made in mass spectrometry-based proteomics, yielding an increasing number of datasets from various proteomics projects worldwide. In order to facilitate the sharing and reuse of promising datasets, it is important to construct appropriate, high-quality public data repositories. jPOSTrepo (https://repository.jpostdb.org/) has successfully implemented several unique features, including high-speed file uploading, flexible file management and easy-to-use interfaces. This repository has been launched as a public repository containing various proteomic datasets and is available for researchers worldwide. In addition, our repository has joined the ProteomeXchange consortium, which includes the most popular public repositories such as PRIDE in Europe for MS/MS datasets and PASSEL for SRM datasets in the USA. Later MassIVE was introduced in the USA and accepted into the ProteomeXchange, as was our repository in July 2016, providing important datasets from Asia/Oceania. Accordingly, this repository thus contributes to a global alliance to share and store all datasets from a wide variety of proteomics experiments. Thus, the repository is expected to become a major repository, particularly for data collected in the Asia/Oceania region. PMID:27899654
Independent assessment of candidate HIV incidence assays on specimens in the CEPHIA repository
Kassanjee, Reshma; Pilcher, Christopher D.; Keating, Sheila M.; Facente, Shelley N.; McKinney, Elaine; Price, Matthew A.; Martin, Jeffrey N.; Little, Susan; Hecht, Frederick M.; Kallas, Esper G.; Welte, Alex; Busch, Michael P.; Murphy, Gary
2014-01-01
Objective: Cross-sectional HIV incidence surveillance, using assays that distinguish ‘recent’ from ‘nonrecent’ infections, has been hampered by inadequate performance and characterization of incidence assays. In this study, the Consortium for the Evaluation and Performance of HIV Incidence Assays presents results of the first independent evaluation of five incidence assays (BED, Limiting Antigen Avidity, Less-sensitive Vitros, Vitros Avidity and BioRad Avidity). Design: A large repository of diverse specimens from HIV-positive patients was established, multiple assays were run on 2500 selected specimens, and data were analyzed to estimate assay characteristics relevant for incidence surveillance. Methods: The mean duration of recent infection (MDRI, average time ‘recent’ while infected for less than some time cut-off T) was estimated from longitudinal data on seroconverters by regression. The false-recent rate (FRR, probability of testing ‘recent’ when infected for longer than T) was explored by measuring the proportions of ‘recent’ results in various subsets of patients. Results: Assays continue to fail to attain the simultaneously large MDRI and small FRR demanded by existing performance guidelines. All assays produce high FRRs amongst virally suppressed patients (>40%), including elite controllers and treated patients. Conclusions: Results from this first independent evaluation provide valuable information about the current performance of assays, and suggest the need for further optimization. Variation of ‘recent’/‘nonrecent’ thresholds and the use of multiple antibody-maturation assays, as well as other biomarkers, can now be explored, using the rich data generated by the Consortium for the Evaluation and Performance of HIV Incidence Assays. Consistently high FRRs amongst those virally suppressed suggest that viral load will be a particularly valuable supplementary marker. PMID:25144218
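The false-recent rate defined in this abstract (the proportion of 'recent' results among specimens infected longer than the cut-off T) lends itself to a one-line estimator. The sketch below is a toy illustration with invented data, not the consortium's analysis code.

```python
# Hypothetical sketch: FRR as the share of 'recent' assay calls among
# specimens whose infection duration exceeds the cut-off T.
def false_recent_rate(results, durations, T):
    """results   -- list of bools, True if the assay called the specimen 'recent'
    durations -- matching list of infection durations (same unit as T)"""
    long_infected = [r for r, d in zip(results, durations) if d > T]
    return sum(long_infected) / len(long_infected) if long_infected else 0.0

# Toy data: 3 of 4 long-infected specimens test 'recent' -> FRR = 0.75
print(false_recent_rate([True, True, True, False, True],
                        [900, 800, 700, 750, 100], T=365))
```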
Titler, Marita G; Jensen, Gwenneth A; Dochterman, Joanne McCloskey; Xie, Xian-Jin; Kanak, Mary; Reed, David; Shever, Leah L
2008-04-01
To determine the impact of patient characteristics, clinical conditions, hospital unit characteristics, and health care interventions on hospital cost of patients with heart failure. Data for this study were part of a larger study that used electronic clinical data repositories from an 843-bed, academic medical center in the Midwest. This retrospective, exploratory study used existing administrative and clinical data from 1,435 hospitalizations of 1,075 patients 60 years of age or older. A cost model was tested using generalized estimating equations (GEE) analysis. Electronic databases used in this study were the medical record abstract, the financial data repository, the pharmacy repository, and the Nursing Information System repository. Data repositories were merged at the patient level into a relational database and housed on an SQL server. The model accounted for 88 percent of the variability in hospital costs for heart failure patients 60 years of age and older. The majority of variables that were associated with hospital cost were provider interventions. Each medical procedure increased cost by $623, each unique medication increased cost by $179, and the addition of each nursing intervention increased cost by $289. One medication and several nursing interventions were associated with lower cost. Nurse staffing below the average and residing on 2-4 units increased hospital cost. The model and data analysis techniques used here provide an innovative and useful methodology to describe and quantify significant health care processes and their impact on cost per hospitalization. The findings indicate the importance of conducting research using existing clinical data in health care.
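The additive cost contributions reported above ($623 per medical procedure, $179 per unique medication, $289 per nursing intervention) can be illustrated with a back-of-envelope calculation. The real model was a GEE regression with many more covariates; this sketch only applies the three reported coefficients, and all names are invented.

```python
# Illustrative only: marginal cost contributions from the reported
# GEE coefficients, ignoring the intercept and all other covariates.
COEF = {"procedure": 623, "medication": 179, "nursing_intervention": 289}

def intervention_cost(counts: dict) -> int:
    """Sum the marginal cost contributions of provider interventions."""
    return sum(COEF[k] * n for k, n in counts.items())

print(intervention_cost({"procedure": 2, "medication": 5,
                         "nursing_intervention": 3}))  # 1246 + 895 + 867 = 3008
```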
Eagle-i: Making Invisible Resources, Visible
Haendel, M.; Wilson, M.; Torniai, C.; Segerdell, E.; Shaffer, C.; Frost, R.; Bourges, D.; Brownstein, J.; McInnerney, K.
2010-01-01
RP-134 The eagle-i Consortium – Dartmouth College, Harvard Medical School, Jackson State University, Morehouse School of Medicine, Montana State University, Oregon Health and Science University (OHSU), the University of Alaska, the University of Hawaii, and the University of Puerto Rico – aims to make invisible resources for scientific research visible by developing a searchable network of resource repositories at research institutions nationwide. The system is now in early development, and it is hoped that it will scale beyond the consortium at the end of the two-year pilot. Data Model & Ontology: The eagle-i ontology development team at the OHSU Library is generating the data model and ontologies necessary for resource indexing and querying. Our indexing system will enable cores and research labs to represent resources within a defined vocabulary, leading to more effective searches and better linkage between data types. This effort is being guided by active discussions within the ontology community (http://RRontology.tk) bringing together relevant preexisting ontologies in a logical framework. The goal of these discussions is to provide context for interoperability and domain-wide standards for resource types used throughout biomedical research. Research community feedback is welcomed. Architecture Development, led by a team at Harvard, includes four main components: tools for data collection, management and curation; an institutional resource repository; a federated network; and a central search application. Each participating institution will populate and manage its repository locally, using data collection and curation tools. To help improve search performance, data tools will support the semi-automatic annotation of resources. A central search application will use a federated protocol to broadcast queries to all repositories and display aggregated results.
The search application will leverage the eagle-i ontologies to help guide users to valid queries via auto-suggestions and taxonomy browsing and improve search result quality via concept-based search and synonym expansion. Website: http://eagle-i.org. NIH/NCRR ARRA award #U24RR029825
Quinto, Francesca; Blechschmidt, Ingo; Garcia Perez, Carmen; Geckeis, Horst; Geyer, Frank; Golser, Robin; Huber, Florian; Lagos, Markus; Lanyon, Bill; Plaschke, Markus; Steier, Peter; Schäfer, Thorsten
2017-07-05
The multiactinide analysis with accelerator mass spectrometry (AMS) was applied to samples collected from run 13-05 of the Colloid Formation and Migration (CFM) experiment at the Grimsel Test Site (GTS). In this in situ radionuclide tracer test, the environmental behavior of 233U, 237Np, 242Pu, and 243Am was investigated in a water-conductive shear zone under conditions relevant for a nuclear waste repository in crystalline rock. The concentration of the actinides in the GTS groundwater was determined with AMS over 6 orders of magnitude, from ∼15 pg/g down to ∼25 ag/g. Levels above 10 fg/g were investigated with both sector field inductively coupled plasma mass spectrometry (SF-ICPMS) and AMS. Agreement within a relative uncertainty of 50% was found for the 237Np, 242Pu, and 243Am concentrations determined with the two analytical methods. With the extreme sensitivity of AMS, the long-term release and retention of the actinides was investigated over 8 months in the tailing of the breakthrough curve of run 13-05, as well as in samples collected up to 22 months after. Furthermore, the evidence of masses 241 and 244 u in the CFM samples, most probably representing 241Am and 244Pu employed in a previous tracer test, demonstrated the analytical capability of AMS for in situ studies lasting more than a decade.
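The cross-method agreement reported above (SF-ICPMS versus AMS concentrations agreeing within a relative uncertainty of 50%) amounts to a simple relative-difference check. The function and values below are invented for illustration and do not reproduce the study's uncertainty treatment.

```python
# Hypothetical sketch: two measurements "agree" if their absolute
# difference is within rel_tol of their mean value.
def agree_within(a: float, b: float, rel_tol: float = 0.5) -> bool:
    """True if |a - b| is within rel_tol of the mean of the two values."""
    mean = (a + b) / 2
    return abs(a - b) <= rel_tol * mean

# Toy concentrations in g/g, well above the 10 fg/g comparison floor.
print(agree_within(1.2e-15, 1.6e-15))  # True
```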
Best practices for fungal germplasm repositories and perspectives on their implementation.
Wiest, Aric; Schnittker, Robert; Plamann, Mike; McCluskey, Kevin
2012-02-01
In over 50 years, the Fungal Genetics Stock Center has grown to become a world-recognized biological resource center. Along with this growth comes the development and implementation of myriad practices for the management and curation of a diverse collection of filamentous fungi, yeast, and molecular genetic tools for working with the fungi. These practices include techniques for the testing, manipulation, and preservation of individual fungal isolates as well as for processing of thousands of isolates in parallel. In addition to providing accurate record keeping, an electronic managements system allows the observation of trends in strain distribution and in sample characteristics. Because many ex situ fungal germplasm repositories around the world share similar objectives, best-practice guidelines have been developed by a number of organizations such as the Organization for Economic Cooperation and Development or the International Society for Biological and Environmental Repositories. These best-practice guidelines provide a framework for the successful operation of collections and promote the development and interactions of biological resource centers around the world.
Long-term retrievability and safeguards for immobilized weapons plutonium in geologic storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, P.F.
1996-05-01
If plutonium is not ultimately used as an energy source, the quantity of excess weapons plutonium (w-Pu) that would go into a US repository will be small compared to the quantity of plutonium contained in the commercial spent fuel in the repository, and the US repository(ies) will likely be only one (or two) locations out of many around the world where commercial spent fuel will be stored. Therefore excess weapons plutonium creates a small perturbation to the long-term (over 200,000 yr) global safeguard requirements for spent fuel. There are details in the differences between spent fuel and immobilized w-Pu waste forms (i.e. chemical separation methods, utility for weapons, nuclear testing requirements), but these are sufficiently small to be unlikely to play a significant role in any US political decision to rebuild weapons inventories, or to change the long-term risks of theft by subnational groups.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leigh, Christi D.; Hansen, Francis D.
This report summarizes the state of salt repository science, reviews many of the technical issues pertaining to disposal of heat-generating nuclear waste in salt, and proposes several avenues for future science-based activities to further the technical basis for disposal in salt. There are extensive salt formations in the forty-eight contiguous states, and many of them may be worthy of consideration for nuclear waste disposal. The United States has extensive experience in salt repository sciences, including an operating facility for disposal of transuranic wastes. The scientific background for salt disposal including laboratory and field tests at ambient and elevated temperature, principles of salt behavior, potential for fracture damage and its mitigation, seal systems, chemical conditions, advanced modeling capabilities and near-future developments, performance assessment processes, and international collaboration are all discussed. The discussion of salt disposal issues is brought current, including a summary of recent international workshops dedicated to high-level waste disposal in salt. Lessons learned from Sandia National Laboratories' experience on the Waste Isolation Pilot Plant and the Yucca Mountain Project as well as related salt experience with the Strategic Petroleum Reserve are applied in this assessment. Disposal of heat-generating nuclear waste in a suitable salt formation is attractive because the material is essentially impermeable, self-sealing, and thermally conductive. Conditions are chemically beneficial, and a significant experience base exists in understanding this environment. Within the period of institutional control, overburden pressure will seal fractures and provide a repository setting that limits radionuclide movement. A salt repository could potentially achieve total containment, with no releases to the environment in undisturbed scenarios for as long as the region is geologically stable.
Much of the experience gained from United States repository development, such as seal system design, coupled process simulation, and application of performance assessment methodology, helps define a clear strategy for a heat-generating nuclear waste repository in salt.
Preservation of Earth Science Data History with Digital Content Repository Technology
NASA Astrophysics Data System (ADS)
Wei, Y.; Pan, J.; Shrestha, B.; Cook, R. B.
2011-12-01
An increasing need for derived and on-demand data products in Earth Science research makes digital content more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, this increasing need presents additional challenges in managing data processing history information and delivering such information to end users. For example, the North American Carbon Program (NACP) Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) chose a modified SYNMAP land cover data product as one of the input drivers for participating terrestrial biospheric models. The global 1km resolution SYNMAP data was created by harmonizing 3 remote sensing-based land cover products: GLCC, GLC2000, and the MODIS land cover product. The original SYNMAP land cover data was aggregated into half and quarter degree resolution. It was then enhanced with more detailed grassland and cropland types. Currently, there is no effective mechanism to convey this data processing information to the different modeling teams so they can determine whether a data product meets their needs; the process still relies heavily on offline human interaction. The NASA-sponsored ORNL DAAC has leveraged contemporary digital object repository technology to promote the representation, management, and delivery of data processing history and provenance information. Within a digital object repository, different data products are managed as objects, with metadata as attributes and content delivery and management services as dissemination methods. Derivation relationships among data products can be semantically referenced between digital objects. Within the repository, data users can easily track a derived data product back to its origin, explore metadata and documents about each intermediate data product, and discover processing details involved in each derivation step.
Coupled with the Drupal Web Content Management System, the digital repository interface was enhanced to provide an intuitive graphic representation of the data processing history. Each data product is also associated with a formal metadata record in the FGDC standard, and the main fields of the FGDC record are indexed for search and displayed as attributes of the data product. These features enable data users to better understand and consume a data product. The representation of data processing history in a digital repository can further promote long-term data preservation. Lineage information is a key factor in keeping digital data understandable and usable long into the future. Derivation references can be set up between digital objects not only within a single digital repository, but also across multiple distributed digital repositories. Along with emerging identification mechanisms, such as the Digital Object Identifier (DOI), a flexible distributed digital repository network can be set up to better preserve digital content. In this presentation, we describe how digital content repository technology can be used to manage, preserve, and deliver digital data processing history information in the Earth Science research domain, with selected data archived at the ORNL DAAC and the Model and Synthesis Thematic Data Center (MAST-DC) as testing targets.
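The derivation-tracking idea described in this abstract (following semantic references from a derived product back to its origins) can be sketched as a small graph walk. The dict-based graph below uses the SYNMAP example from the text, but the product names and structure are illustrative stand-ins for the repository's actual derivation references.

```python
# Hypothetical provenance graph: each product maps to the products it
# was derived from; products with no recorded parents are origins.
DERIVED_FROM = {
    "synmap_enhanced_quarter_deg": ["synmap_half_quarter_deg"],
    "synmap_half_quarter_deg": ["synmap_1km"],
    "synmap_1km": ["GLCC", "GLC2000", "MODIS_land_cover"],
}

def origins(product, graph=DERIVED_FROM):
    """Walk derivation references back to the original source products."""
    parents = graph.get(product)
    if not parents:                      # no recorded parents: an origin
        return {product}
    found = set()
    for p in parents:
        found |= origins(p, graph)
    return found

print(sorted(origins("synmap_enhanced_quarter_deg")))
# ['GLC2000', 'GLCC', 'MODIS_land_cover']
```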
The Front-End to Google for Teachers' Online Searching
ERIC Educational Resources Information Center
Seyedarabi, Faezeh
2006-01-01
This paper reports on an ongoing work in designing and developing a personalised search tool for teachers' online searching using Google search engine (repository) for the implementation and testing of the first research prototype.
DoSSiER: Database of scientific simulation and experimental results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenzel, Hans; Yarba, Julia; Genser, Krzystof
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in json or xml exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
DoSSiER: Database of scientific simulation and experimental results
Wenzel, Hans; Yarba, Julia; Genser, Krzystof; ...
2016-08-01
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in json or xml exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
Virtual patient repositories--a comparative analysis.
Küfner, Julia; Kononowicz, Andrzej A; Hege, Inga
2014-01-01
Virtual Patients (VPs) are an important component of medical education. One way to reduce the costs for creating VPs is sharing through repositories. We conducted a literature review to identify existing repositories and analyzed the 17 included repositories in regards to the search functions and metadata they provide. Most repositories provided some metadata such as title or description, whereas other data, such as educational objectives, were less frequent. Future research could, in cooperation with the repository provider, investigate user expectations and usage patterns.
ERIC Educational Resources Information Center
Lehman, Rosemary
2007-01-01
This chapter looks at the development and nature of learning objects, meta-tagging standards and taxonomies, learning object repositories, learning object repository characteristics, and types of learning object repositories, with type examples. (Contains 1 table.)
Earthdata Search: Scaling, Assessing and Improving Relevancy
NASA Technical Reports Server (NTRS)
Reese, Mark
2016-01-01
NASA's Earthdata Search (https://search.earthdata.nasa.gov) application allows users to search, discover, visualize, and access NASA and international interagency data about the Earth. As a client to NASA's Common Metadata Repository (CMR), its catalog of data collections grew 700 in late 2015. This massive expansion brought improved search and discovery to the forefront of the client's usability needs. During this talk, we will give a brief overview of the application, the challenges that arose during this period of growth, the metrics-driven way we addressed them, and the latest outcomes.
International Collaboration Activities on Engineered Barrier Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jove-Colon, Carlos F.
The Used Fuel Disposition Campaign (UFDC) within the DOE Fuel Cycle Technologies (FCT) program has been engaging in international collaborations between repository R&D programs for high-level waste (HLW) disposal to leverage gathered knowledge and laboratory/field data on near- and far-field processes from experiments at underground research laboratories (URLs). Heater test experiments at URLs provide a unique opportunity to mimetically study the thermal effects of heat-generating nuclear waste in subsurface repository environments. Various configurations of these experiments have been carried out at various URLs according to the disposal design concepts of the hosting country's repository program. The FEBEX (Full-scale Engineered Barrier Experiment in Crystalline Host Rock) project is a large-scale heater test experiment originated by the Spanish radioactive waste management agency (Empresa Nacional de Residuos Radiactivos S.A. – ENRESA) at the Grimsel Test Site (GTS) URL in Switzerland. The project was subsequently managed by CIEMAT. FEBEX-DP is a concerted effort of various international partners working on the evaluation of sensor data and characterization of samples obtained during the course of this field test and its subsequent dismantling. The main purpose of these field-scale experiments is to evaluate the feasibility of creating an engineered barrier system (EBS) with a horizontal configuration according to the Spanish concept of deep geological disposal of high-level radioactive waste in crystalline rock. Another key aspect of this project is to improve the knowledge of coupled processes, such as thermal-hydro-mechanical (THM) and thermal-hydro-chemical (THC) processes, operating in the near-field environment. The focus is on model development and validation of predictions through model implementation in computational tools to simulate coupled THM and THC processes.
NASA Astrophysics Data System (ADS)
Munoz-Jaramillo, Andres
2016-05-01
The arrival of a highly interconnected digital age with practically limitless data storage capacity has brought with it a significant shift in the way scientific data are stored and distributed (i.e. from being in the hands of a small group of scientists to being openly and freely distributed for anyone to use). However, the vertiginous speed at which hardware, software, and the nature of the internet change has also sped up the rate at which data is lost due to formatting obsolescence and loss of access. This poster is meant to advertise the creation of a highly permanent data repository (within the context of Harvard's Dataverse), curated to contain datasets of high relevance for the study and prediction of the solar dynamo, solar cycle, and long-term solar variability. This repository has many advantages over traditional data storage, like the assignment of unique DOI identifiers for each database (making it easier for scientists to cite them directly) and the automatic versioning of each database so that all data are able to attain salvation.
Integration of immunological aspects in the European Human Embryonic Stem Cell Registry.
Borstlap, Joeri; Kurtz, Andreas
2008-05-01
The immunological properties of stem cells are of increasing importance in regenerative medicine. Immunomodulatory mechanisms seem to play an important role not only with respect to the understanding of underlying mechanisms of autologous versus allogenic therapeutic approaches, but also for endogenous tissue regeneration. The newly established European human embryonic stem cell registry (hESCreg) offers an international database for the registration, documentation and characterisation of human embryonic stem cells (hESC) and their use. By doing so, hESCreg aims to develop a model procedure for further standardisation efforts in the field of stem cell research and regenerative medicine, and eventually the registry may lead to a repository of therapy-related information. Currently the stem cell characterisation data acquired by the registry are divided into several categories such as cell derivation, culture conditions, genetic constitution, stem cell marker expression and degree of modification. This article describes immunological aspects of stem cell characterisation and explores the layout and relevance of a possible additional section to the hESCreg repository to include immunological characteristics of human embryonic stem cells.
Chockalingam, Sriram; Aluru, Maneesha; Aluru, Srinivas
2016-09-19
Pre-processing of microarray data is a well-studied problem. Furthermore, all popular platforms come with their own recommended best practices for differential analysis of genes. However, for genome-scale network inference using microarray data collected from large public repositories, these methods filter out a considerable number of genes. This is primarily due to the effects of aggregating a diverse array of experiments with different technical and biological scenarios. Here we introduce a pre-processing pipeline suitable for inferring genome-scale gene networks from large microarray datasets. We show that partitioning of the available microarray datasets according to biological relevance into tissue- and process-specific categories significantly extends the limits of downstream network construction. We demonstrate the effectiveness of our pre-processing pipeline by inferring genome-scale networks for the model plant Arabidopsis thaliana using two different construction methods and a collection of 11,760 Affymetrix ATH1 microarray chips. Our pre-processing pipeline and the datasets used in this paper are made available at http://alurulab.cc.gatech.edu/microarray-pp.
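The partitioning step described above can be sketched as follows. This is an illustrative outline only, not the authors' pipeline, and the sample IDs and category labels are hypothetical.

```python
# Sketch of category-based partitioning of microarray samples prior to
# network inference. Sample IDs and category labels are invented.
from collections import defaultdict

def partition_samples(sample_annotations):
    """Group sample IDs by their tissue/process category label."""
    partitions = defaultdict(list)
    for sample_id, category in sample_annotations.items():
        partitions[category].append(sample_id)
    return dict(partitions)

annotations = {
    "GSM001": "leaf", "GSM002": "leaf",
    "GSM003": "root", "GSM004": "seed",
}
parts = partition_samples(annotations)
print(sorted(parts))   # ['leaf', 'root', 'seed']
print(parts["leaf"])   # ['GSM001', 'GSM002']
```

Network inference would then run independently within each partition, which is the step the abstract reports as extending the limits of downstream network construction.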
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jung, Haeryong; Lee, Eunyong; Jeong, YiYeong
Korea Radioactive-waste Management Corporation (KRMC), established in 2009, has started a new project to collect information on the long-term stability of deep geological environments on the Korean Peninsula. The information has been compiled in the integrated natural barrier database system available on the web (www.deepgeodisposal.kr). The database system also includes socially and economically important information, such as land use, mining areas, natural conservation areas, population density, and industrial complexes, because some of this information serves as exclusionary criteria during the site selection process for a deep geological repository for the safe and secure containment and isolation of spent nuclear fuel and other long-lived radioactive waste in Korea. Although the official site selection process has not yet started in Korea, it is believed that the current integrated natural barrier and socio-economic database system will be effectively utilized to narrow down the number of sites where future investigation is most promising and to enhance public acceptance by providing readily available, relevant scientific information on deep geological environments in Korea. (authors)
Bottomley, Steven; Denny, Paul
2011-01-01
A participatory learning approach, combined with both a traditional and a competitive assessment, was used to motivate students and promote a deep approach to learning biochemistry. Students were challenged to research, author, and explain their own multiple-choice questions (MCQs). They were also required to answer, evaluate, and discuss MCQs written by their peers. The technology used to support this activity was PeerWise--a freely available, innovative web-based system that supports students in the creation of an annotated question repository. In this case study, we describe students' contributions to, and perceptions of, the PeerWise system for a cohort of 107 second-year biomedical science students from three degree streams studying a core biochemistry subject. Our study suggests that the students are eager participants and produce a large repository of relevant, good quality MCQs. In addition, they rate the PeerWise system highly and use higher order thinking skills while taking an active role in their learning. We also discuss potential issues and future work using PeerWise for biomedical students. Copyright © 2011 Wiley Periodicals, Inc.
Development of a user-centered radiology teaching file system
NASA Astrophysics Data System (ADS)
dos Santos, Marcelo; Fujino, Asa
2011-03-01
Learning radiology requires systematic and comprehensive study of a large knowledge base of medical images. This work presents the development of a digital radiology teaching file system. The proposed system was created to offer a set of services customized to users' contexts and informational needs. This is done by means of an electronic infrastructure that provides easy and integrated access to all relevant patient data at the time of image interpretation, so that radiologists and researchers can examine all available data to reach well-informed conclusions while protecting patient data privacy and security. The system is presented as an environment that implements a distributed clinical database, including medical images, authoring tools, a repository for multimedia documents, and a peer-review model that assures dataset quality. The current implementation has shown that creating clinical data repositories in networked computer environments appears to be a good solution for reviewing information management practices in electronic environments and for creating customized, context-based tools for users connected to the system through electronic interfaces.
Transportation plan repository and archive.
DOT National Transportation Integrated Search
2011-04-01
This project created a repository and archive for transportation planning documents in Texas within the : established Texas A&M Repository (http://digital.library.tamu.edu). This transportation planning archive : and repository provides ready access ...
Foroushani, Amir B.K.; Brinkman, Fiona S.L.
2013-01-01
Motivation. Predominant pathway analysis approaches treat pathways as collections of individual genes and consider all pathway members as equally informative. As a result, at times spurious and misleading pathways are inappropriately identified as statistically significant, solely due to components that they share with the more relevant pathways. Results. We introduce the concept of Pathway Gene-Pair Signatures (Pathway-GPS) as pairs of genes that, as a combination, are specific to a single pathway. We devised and implemented a novel approach to pathway analysis, Signature Over-representation Analysis (SIGORA), which focuses on the statistically significant enrichment of Pathway-GPS in a user-specified gene list of interest. In a comparative evaluation of several published datasets, SIGORA outperformed traditional methods by delivering biologically more plausible and relevant results. Availability. An efficient implementation of SIGORA, as an R package with precompiled GPS data for several human and mouse pathway repositories is available for download from http://sigora.googlecode.com/svn/. PMID:24432194
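The Pathway-GPS idea lends itself to a compact sketch. The following is not the SIGORA implementation, only a minimal illustration of the two ingredients the abstract describes: gene pairs that occur in exactly one pathway, and a hypergeometric over-representation test. The pathway names and gene symbols are invented.

```python
# Illustrative sketch of gene-pair signatures and a hypergeometric
# enrichment test; not the SIGORA R package.
from itertools import combinations
from math import comb

def pathway_gps(pathways):
    """Return gene pairs that occur in exactly one pathway,
    mapped to that pathway's name."""
    seen = {}
    for name, genes in pathways.items():
        for pair in combinations(sorted(genes), 2):
            seen.setdefault(pair, set()).add(name)
    return {pair: next(iter(ps)) for pair, ps in seen.items() if len(ps) == 1}

def hypergeom_tail(k, K, n, N):
    """P(X >= k) when drawing n items without replacement from a
    population of N items containing K 'successes'."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

pathways = {"P1": {"a", "b", "c"}, "P2": {"b", "c", "d"}}
gps = pathway_gps(pathways)   # the shared pair ("b", "c") is dropped
print(gps)
```

Here pair ("b", "c") belongs to both pathways, so it is excluded from the signatures; the remaining pairs each point to a single pathway, and their over-representation in a query gene list can then be scored with the tail probability.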
NASA Astrophysics Data System (ADS)
de La Vaissière, Rémi; Armand, Gilles; Talandier, Jean
2015-02-01
The Excavation Damaged Zone (EDZ) surrounding a drift, and in particular its evolution, is being studied for the performance assessment of a radioactive waste underground repository. A specific experiment (called CDZ) was designed and implemented in the Meuse/Haute-Marne Underground Research Laboratory (URL) in France to investigate the EDZ. This experiment is dedicated to studying the evolution of the hydrogeological properties (conductivity and specific storage) of the EDZ in the Callovo-Oxfordian (COx) claystone under mechanical compression and artificial hydration. First, a loading cycle was applied to a drift wall to simulate the compression effect of bentonite swelling in a repository drift (bentonite is a clay material to be used to seal drifts and shafts at repository closure). Gas tests (permeability tests with nitrogen and tracer tests with helium) were conducted during the first phase of the experiment. The results showed that the fracture network within the EDZ was initially interconnected and open to gas flow (particularly along the drift) and then progressively closed as the mechanical stress applied to the drift wall increased. Moreover, the evolution of the EDZ after unloading indicated a self-sealing process. Second, the remaining fracture network was resaturated to demonstrate the self-sealing ability of the COx claystone without mechanical loading, by conducting 11 to 15 repetitive hydraulic tests with monitoring of the hydraulic parameters. During this hydration process, the effective transmissivity of the EDZ dropped due to the swelling of the clay materials near the fracture network. The hydraulic conductivity evolved relatively quickly during the first few days, and low conductivities on the order of 10⁻¹⁰ m/s were observed after four months. Conversely, the specific storage showed an erratic evolution during the first phase of hydration (up to 60 days).
Some uncertainty remains on this parameter due to volumetric strain during the sealing of the fractures. The hydration was stopped after one year, and cross-hole hydraulic tests were performed to determine the specific storage, as well as the hydraulic conductivity, more accurately at the metre scale. All hydraulic conductivity values measured at the injection and observation intervals were below 10⁻¹⁰ m/s. Moreover, the preferential inter-connectivity along the drift disappeared. Specific storage values at the observation and injection intervals were similar; furthermore, they were in agreement with the value obtained at the injection interval during the second hydration phase (60 days after the start of hydration). The graphical abstract synthesizes the evolution of the hydraulic/gas conductivity for 8 intervals since the beginning of the CDZ experiment. The conductivity limit of 10⁻¹⁰ m/s corresponds to the lower-bound hydraulic definition of the EDZ, and it is demonstrated that the EDZ can be sealed. This is a significant result for the demonstration of the long-term safety of a repository.
Supporting multiple domains in a single reuse repository
NASA Technical Reports Server (NTRS)
Eichmann, David A.
1992-01-01
Domain analysis typically results in the construction of a domain-specific repository. Such a repository imposes artificial boundaries on the sharing of similar assets between related domains. A lattice-based approach to repository modeling can preserve a reuser's domain specific view of the repository, while avoiding replication of commonly used assets and supporting a more general perspective on domain interrelationships.
NCI Mouse Repository | Frederick National Laboratory for Cancer Research
The NCI Mouse Repository is an NCI-funded resource for mouse cancer models and associated strains. The repository makes strains available to all members of the scientific community (academic, non-profit, and commercial). NCI Mouse Repository strains
The NASA Ames Life Sciences Data Archive: Biobanking for the Final Frontier
NASA Technical Reports Server (NTRS)
Rask, Jon; Chakravarty, Kaushik; French, Alison J.; Choi, Sungshin; Stewart, Helen J.
2017-01-01
The NASA Ames Institutional Scientific Collection comprises the Ames Life Sciences Data Archive (ALSDA) and a biospecimen repository, which are responsible for archiving information and non-human biospecimens collected from spaceflight and matching ground-control experiments. The ALSDA also manages a biospecimen sharing program, performs curation and long-term storage operations, and facilitates distribution of biospecimens for research purposes via a public website (https://lsda.jsc.nasa.gov). As part of our best practices, a tissue viability testing plan has been developed for the repository, which will assess the quality of samples subjected to long-term storage. We expect that the test results will confirm usability of the samples, enable broader science community interest, and verify operational efficiency of the archives. This work will also support NASA open science initiatives and guide development of NASA directives and policy for the curation of biological collections.
A web-based data-querying tool based on ontology-driven methodology and flowchart-based model.
Ping, Xiao-Ou; Chung, Yufang; Tseng, Yi-Ju; Liang, Ja-Der; Yang, Pei-Ming; Huang, Guan-Tarn; Lai, Feipei
2013-10-08
Because of the increased adoption rate of electronic medical record (EMR) systems, more health care records have been accumulating in clinical data repositories. Querying the data stored in these repositories is therefore crucial for retrieving knowledge from such large volumes of clinical data. The aim of this study is to develop a Web-based approach for enriching the capabilities of the data-querying system with respect to the following three considerations: (1) the interface design used for query formulation, (2) the representation of query results, and (3) the models used for formulating query criteria. The Guideline Interchange Format version 3.5 (GLIF3.5), an ontology-driven clinical guideline representation language, was used for formulating the query tasks based on the GLIF3.5 flowchart in the Protégé environment. The flowchart-based data-querying model (FBDQM) query execution engine was developed and implemented for executing queries and presenting the results through a visual and graphical interface. To examine a broad variety of patient data, a clinical data generator was implemented to automatically generate clinical data in the repository, and the generated data were then employed to evaluate the system. The accuracy and time performance of the system for three medical query tasks relevant to liver cancer were evaluated using the clinical data generator in experiments with varying numbers of patients. In this study, a prototype system was developed to test the feasibility of applying a methodology for building a query execution engine using FBDQMs by formulating query tasks using the existing GLIF. The FBDQM-based query execution engine was used to successfully retrieve the clinical data based on the query tasks formatted using the GLIF3.5 in the experiments with varying numbers of patients.
The accuracy of the three queries (i.e., "degree of liver damage," "degree of liver damage when applying a mutually exclusive setting," and "treatments for liver cancer") was 100% for all four experiments (10 patients, 100 patients, 1000 patients, and 10,000 patients). Among the three measured query phases, (1) structured query language operations, (2) criteria verification, and (3) other, the first two had the longest execution times. The ontology-driven FBDQM-based approach enriched the capabilities of the data-querying system. The adoption of the GLIF3.5 increased the potential for interoperability, shareability, and reusability of the query tasks.
Generation of openEHR Test Datasets for Benchmarking.
El Helou, Samar; Karvonen, Tuukka; Yamamoto, Goshiro; Kume, Naoto; Kobayashi, Shinji; Kondo, Eiji; Hiragi, Shusuke; Okamoto, Kazuya; Tamura, Hiroshi; Kuroda, Tomohiro
2017-01-01
openEHR is a widely used EHR specification. Given its technology-independent nature, different approaches for implementing openEHR data repositories exist. Public openEHR datasets are needed to conduct benchmark analyses over different implementations. To address their current unavailability, we propose a method for generating openEHR test datasets that can be publicly shared and used.
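A minimal sketch of the dataset-generation idea, assuming nothing about the authors' actual method: a seeded pseudo-random generator produces the same synthetic records on every run, so a dataset can be regenerated and shared publicly without containing any real patient data. The field names below are illustrative, not actual openEHR archetype paths.

```python
# Hypothetical generator of reproducible synthetic observation records.
# Field names and value ranges are invented for illustration.
import json
import random

def generate_records(n, seed=0):
    """Generate n synthetic records; a fixed seed makes the dataset
    reproducible and therefore shareable for benchmarking."""
    rng = random.Random(seed)
    return [
        {
            "record_id": f"rec-{i:05d}",
            "systolic_mmHg": rng.randint(90, 180),
            "diastolic_mmHg": rng.randint(60, 110),
        }
        for i in range(n)
    ]

data = generate_records(3, seed=42)
print(json.dumps(data[0]))
```

Because the generator is deterministic for a given seed, two implementations under test can be loaded with byte-identical datasets and benchmarked fairly.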
ERIC Educational Resources Information Center
Hsiung, Chin-Min; Zheng, Xiang-Xiang
2015-01-01
The Measurements for Team Functioning (MTF) database contains a series of student academic performance measurements obtained at a national university in Taiwan. The measurements are acquired from unit tests and homework tests performed during a core mechanical engineering course, and provide an objective means of assessing the functioning of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forsberg, C.; Miller, W.F.
2013-07-01
The historical repository siting strategy in the United States has been a top-down approach driven by federal government decision making, but it has failed. This policy has led to fuel cycle facilities being sited in different states. The U.S. government is now considering an alternative repository siting strategy based on voluntary agreements with state governments. If that occurs, state governments become key decision makers with different priorities, and those priorities may change the characteristics of the repository and the fuel cycle. State government priorities when considering hosting a repository are safety, financial incentives, and jobs. It follows that states will demand that a repository be the center of the back end of the fuel cycle as a condition of hosting it. For example, states will push for collocation of transportation services, safeguards training, and navy/private SNF (spent nuclear fuel) inspection at the repository site. Such activities would more than double local employment relative to what was planned for a Yucca Mountain-type repository. States may demand (1) the right to take future title to the SNF, so that if recycling became economic the reprocessing plant would be built at the repository site, and (2) the right to a certain fraction of the repository capacity for foreign SNF. That would open the future option of leasing fuel to foreign utilities with disposal of the SNF in the repository, but with the state-government condition that the front-end fuel-cycle enrichment and fuel fabrication facilities be located in that state.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slovic, P.; Layman, M.; Flynn, J.H.
1990-11-01
In July 1989, the authors produced a report titled Perceived Risk, Stigma, and Potential Economic Impacts of a High-Level Nuclear-Waste Repository in Nevada (Slovic et al., 1989). That report described a program of research designed to assess the potential impacts of a high-level nuclear waste repository at Yucca Mountain, Nevada on tourism, retirement and job-related migration, and business development in Las Vegas and the state. It concluded that adverse economic impacts could potentially result from two related social processes. Specifically, the study by Slovic et al. employed analyses of imagery in order to overcome concerns about the validity of direct questions regarding the influence of a nuclear-waste repository at Yucca Mountain on a person's future behaviors. During the latter months of 1989, data were collected in three major telephone surveys designed to achieve the following objectives: (1) to replicate the results from the Phoenix, Arizona surveys using samples from other populations that contribute to tourism, migration, and development in Nevada; (2) to retest the original Phoenix respondents to determine the stability of their images across an 18-month period and to determine whether their vacation choices subsequent to the first survey were predictable from the images they produced in that original survey; (3) to elicit additional word-association images for the stimulus "underground nuclear waste repository" in order to determine whether the extreme negative images generated by the Phoenix respondents would occur in other samples of respondents; and (4) to develop and test a new method for imagery elicitation, based on a rating technique rather than on word associations. 2 refs., 8 figs., 13 tabs.
Hydrology of Yucca Mountain, Nevada
Flint, A.L.; Flint, L.E.; Kwicklis, E.M.; Bodvarsson, G.S.; Fabryka-Martin, J. M.
2001-01-01
Yucca Mountain, located in southern Nevada in the Mojave Desert, is being considered as a geologic repository for high-level radioactive waste. Although the site is arid, previous studies indicate net infiltration rates of 5-10 mm yr⁻¹ under current climate conditions. Unsaturated flow of water through the mountain is generally vertical: rapid through the fractures of the welded tuffs and slow through the matrix of the nonwelded tuffs. The vitric-zeolitic boundary of the nonwelded tuffs below the potential repository, where it exists, causes perching and substantial lateral flow that eventually passes through faults near the eastern edge of the potential repository and recharges the underlying groundwater system. Fast pathways are located where water flows relatively quickly through the unsaturated zone to the water table. For the bulk of the water, a large part of the travel time from land surface to the potential repository horizon (~300 m below land surface) is spent in the interlayered, low-fracture-density, nonwelded tuff, where flow is predominantly through the matrix. The unsaturated zone at Yucca Mountain is being modeled using a three-dimensional, dual-continuum numerical model to predict the results of measurements and observations in new boreholes and excavations. The interaction between experimentalists and modelers is providing confidence in the conceptual and numerical models and is giving researchers the ability to plan further testing and to evaluate the usefulness or necessity of further data collection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-04-01
During the second half of fiscal year 1996, activities at the Yucca Mountain Site Characterization Project (Project) supported the objectives of the revised Program Plan released this period by the Office of Civilian Radioactive Waste Management of the US Department of Energy (Department). Outlined in the revised plan is a focused, integrated program of site characterization, design, engineering, environmental, and performance assessment activities that will achieve key Program and statutory objectives. The plan will result in the development of a license application for repository construction at Yucca Mountain, if the site is found suitable. Activities this period focused on two of the three near-term objectives of the revised plan: updating in 1997 the regulatory framework for determining the suitability of the site for the proposed repository concept, and providing information for a 1998 viability assessment of continuing toward the licensing of a repository. The Project has also developed a new design approach that uses the advanced conceptual design published during the last reporting period as a base for developing a design that will support the viability assessment. The initial construction phase of the Thermal Testing Facility was completed and the first phase of the in situ heater tests began on schedule. In addition, phase-one construction was completed for the first of two alcoves that will provide access to the Ghost Dance fault.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, P.
2014-09-23
GRAPE is a tool for managing software project workflows with the Git version control system. It provides a suite of tools to simplify and configure branch-based development, integration with a project's testing suite, and integration with the Atlassian Stash repository hosting tool.
Code of Federal Regulations, 2014 CFR
2014-01-01
.... Site characterization includes borings, surface excavations, excavation of exploratory shafts, limited subsurface lateral excavations and borings, and in situ testing at depth needed to determine the suitability of the site for a geologic repository, but does not include preliminary borings and geophysical...
Code of Federal Regulations, 2013 CFR
2013-01-01
.... Site characterization includes borings, surface excavations, excavation of exploratory shafts, limited subsurface lateral excavations and borings, and in situ testing at depth needed to determine the suitability of the site for a geologic repository, but does not include preliminary borings and geophysical...
Code of Federal Regulations, 2012 CFR
2012-01-01
.... Site characterization includes borings, surface excavations, excavation of exploratory shafts, limited subsurface lateral excavations and borings, and in situ testing at depth needed to determine the suitability of the site for a geologic repository, but does not include preliminary borings and geophysical...
Batlle, J Vives I; Sweeck, L; Wannijn, J; Vandenhove, H
2016-10-01
The potential radiological impact of releases from a low-level radioactive waste (Category A waste) repository in Dessel, Belgium on the local fauna and flora was assessed under a reference scenario for gradual leaching. The potential impact situations for terrestrial and aquatic fauna and flora considered in this study were soil contamination due to irrigation with contaminated groundwater from a well 70 m from the repository, contamination of the local wetlands receiving the highest radionuclide flux after migration through the aquifer, and contamination of the local river receiving the highest radionuclide flux after migration through the aquifer. In addition, an exploratory study was carried out for biota residing in the groundwater. All impact assessments were performed using the Environmental Risk from Ionising Contaminants: Assessment and Management (ERICA) tool. For all scenarios considered, absorbed dose rates to biota were found to be well below the ERICA 10 μGy h⁻¹ screening value. The highest dose rates were observed for the scenario in which soil was irrigated with groundwater from the vicinity of the repository. For biota residing in the groundwater well, a few dose rates were slightly above the screening level but significantly below the dose rates at which the smallest effects are observed for the relevant species or groups of species. Given the conservative nature of the assessment, it can be concluded that man-made radionuclides released to the environment by the near-surface disposal of Category A waste at Dessel do not have a significant radiological impact on wildlife. Copyright © 2016 Elsevier Ltd. All rights reserved.
Proposed BioRepository platform solution for the ALS research community.
Sherman, Alex; Bowser, Robert; Grasso, Daniela; Power, Breen; Milligan, Carol; Jaffa, Matthew; Cudkowicz, Merit
2011-01-01
ALS is a rare disorder whose cause and pathogenesis are largely unknown (1). There is a recognized need to develop biomarkers for ALS to better understand the disease, expedite diagnosis, and facilitate therapy development. Collaboration is essential to obtain a sufficient number of samples to allow statistically meaningful studies. The availability of high-quality biological specimens for research purposes requires the development of standardized methods for collection, long-term storage, retrieval, and distribution of specimens. The value of biological samples to scientists and clinicians correlates with the completeness and relevance of the phenotypical and clinical information associated with the samples (2, 3). While developing a secure Web-based system to manage an inventory of multi-site BioRepositories, algorithms were implemented to facilitate ad hoc parametric searches across heterogeneous data sources that contain data from clinical trials and research studies. A flexible schema for a barcode label was introduced to allow association of samples with these data. The ALSBank™ BioRepository platform solution for managing biological samples and associated data is currently deployed by the Northeast ALS Consortium (NEALS). The NEALS Consortium and the Massachusetts General Hospital (MGH) Neurology Clinical Trials Unit (NCTU) support a network of multiple BioBanks, thus allowing researchers to take advantage of a larger specimen collection than they might have at an individual institution. Standard operating procedures are utilized at all collection sites to promote common practices for biological sample integrity, quality control, and associated clinical data. Utilizing this platform, we have created one of the largest virtual collections of ALS-related specimens available to investigators studying ALS.
Coupled THMC models for bentonite in clay repository for nuclear waste
NASA Astrophysics Data System (ADS)
Zheng, L.; Rutqvist, J.; Birkholzer, J. T.; Li, Y.; Anguiano, H. H.
2015-12-01
Illitization, the transformation of smectite to illite, could compromise some beneficial features of an engineered barrier system (EBS) composed primarily of bentonite and clay host rock. It is a major factor in establishing the maximum design temperature of repositories, because illitization is believed to be greatly enhanced at temperatures above 100 °C, significantly lowering the sorption and swelling capacity of bentonite and clay rock. However, existing experimental and modeling studies on the occurrence of illitization and related performance impacts are not conclusive, in part because the relevant couplings between the thermal, hydrological, chemical, and mechanical (THMC) processes have not been fully represented in the models. Here we present fully coupled THMC simulations of a generic nuclear waste repository in a clay formation with a bentonite-backfilled EBS. Two scenarios were simulated for comparison: a case in which the temperature in the bentonite near the waste canister reaches about 200 °C, and a case in which it peaks at about 100 °C. The model simulations demonstrate that illitization is in general more significant at higher temperatures. We also compared the chemical changes and the resulting swelling stress change for two types of bentonite: Kunigel-VI and FEBEX bentonite. Higher temperatures also lead to much higher stress in the near field, caused by thermal pressurization and vapor pressure buildup in the EBS bentonite and clay host rock. Chemical changes lead to a reduction in swelling stress, which is more pronounced for Kunigel-VI bentonite than for FEBEX bentonite.
3D WebGIS and Visualization Issues for Architectures and Large Sites
NASA Astrophysics Data System (ADS)
De Amicis, R.; Conti, G.; Girardi, G.; Andreolli, M.
2011-09-01
Traditionally, within the field of archaeology and, more generally, within the cultural heritage domain, Geographical Information Systems (GIS) have mostly been used to support cataloguing activities, essentially operating as gateways to large geo-referenced archives of specialised cultural heritage information. Additionally, GIS have proved essential in helping cultural heritage institutions improve the management of their historical information, providing the means to detect otherwise hard-to-discover spatial patterns and supplying the computational tools necessary to perform spatial clustering, proximity, and orientation analysis. This paper presents a platform developed to address both of the aforementioned issues by allowing geo-referenced cataloguing of multimedia resources of cultural relevance, as well as access, in a user-friendly manner, through an interactive 3D geobrowser which operates as a single point of access to the available digital repositories. The solution was showcased in the context of the "Festival dell'economia" (the Festival of Economics), a major event recently held in Trento, Italy, where it allowed visitors to interactively access an extremely large repository of information, together with its metadata, covering the area of the Autonomous Province of Trento, Italy. Within the event, this repository was made accessible via the network, through web services, from a 3D interactive geobrowser developed by the authors. The 3D scene was enriched with a number of Points of Interest (POIs) linking to information available within various databases. The software package was deployed with a complex hardware set-up composed of a large composite panoramic screen covering a horizontal field of view of 240 degrees.
Establishment and evolution of the Australian Inherited Retinal Disease Register and DNA Bank.
De Roach, John N; McLaren, Terri L; Paterson, Rachel L; O'Brien, Emily C; Hoffmann, Ling; Mackey, David A; Hewitt, Alex W; Lamey, Tina M
2013-07-01
Inherited retinal disease represents a significant cause of blindness and visual morbidity worldwide. With the development of emerging molecular technologies, accessible and well-governed repositories of data characterising inherited retinal disease patients are becoming increasingly important. This manuscript introduces such a repository. Participants were recruited from the Retina Australia membership, through the Royal Australian and New Zealand College of Ophthalmologists, and by recruitment of suitable patients attending the Sir Charles Gairdner Hospital visual electrophysiology clinic. Four thousand one hundred ninety-three participants were recruited. All participants were members of families in which the proband was diagnosed with an inherited retinal disease (excluding age-related macular degeneration). Clinical and family information was collected by interview with the participant and by examination of medical records. In 2001, we began collecting DNA from Western Australian participants; in 2009 this activity was extended Australia-wide. Genetic analysis results were stored in the register as they were obtained. The main outcome measurement was the number of DNA samples (with associated phenotypic information) collected from Australian inherited retinal disease-affected families. DNA was obtained from 2873 participants. Retinitis pigmentosa, Stargardt disease and Usher syndrome participants comprised 61.0%, 9.9% and 6.4% of the register, respectively. This resource is a valuable tool for investigating the aetiology of inherited retinal diseases. As new molecular technologies are translated into clinical applications, this well-governed repository of clinical and genetic information will become increasingly relevant for tasks such as identifying candidates for gene-specific clinical trials. © 2012 The Authors. Clinical and Experimental Ophthalmology © 2012 Royal Australian and New Zealand College of Ophthalmologists.
Public involvement on closure of Asse II radioactive waste repository in Germany
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kallenbach-Herbert, Beate
2013-07-01
From 1967 to 1978, about 125,800 barrels of low- and intermediate-level waste were disposed of - nominally for research purposes - in the former 'Asse' salt mine, which had previously been used for many years for the production of potash. Since 1988, an inflow of brine has been observed, which, should it increase, would create dangers of flooding and of collapse due to salt weakening and dissolution. For several years, closure of the Asse repository has been planned with the objective of preventing the flooding and collapse of the mine and the release of radioactive substances to the biosphere. The first concept presented by the former operator, however, seemed completely unacceptable to regional representatives from politics and NGOs. Their activities against these plans made the project a top issue on the political agenda from the federal to the local level. The paper traces the main reasons that led to the severe safety problems of the past, as well as relevant changes in the governance system today. A focus is placed on the process for public involvement, in which the Citizens' Advisory Group 'A2B' forms the core measure. Its structure and framework, experience and results, and expectations from inside and outside perspectives are presented. Furthermore, the paper addresses how far this process can serve as an example of a participatory approach in a siting process for a geological repository for high-level active waste, which can be expected to be highly contested in the affected regions. (authors)
DR Argillite Disposal R&D at LBNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Liange; Kim, Kunhwi; Xu, Hao
2016-08-12
Within the Natural Barrier System (NBS) group of the Used Fuel Disposition (UFD) Campaign at the Department of Energy's (DOE) Office of Nuclear Energy, LBNL's research activities have focused on understanding and modeling EDZ evolution and the associated coupled processes, as well as the impacts of high temperature on parameters and processes relevant to the performance of a clay repository, in order to establish the technical basis for the maximum allowable temperature. This report documents results from some of these activities. These activities address key Features, Events, and Processes (FEPs), which have been ranked in importance from medium to high, as listed in Table 7 of the Used Fuel Disposition Campaign Disposal Research and Development Roadmap (FCR&D-USED-2011-000065 REV0) (Nutt, 2011). Specifically, they address FEP 2.2.01, Excavation Disturbed Zone, for clay/shale, by investigating how coupled processes affect EDZ evolution; and FEP 2.2.05, Flow and Transport Pathways; FEP 2.2.07, Mechanical Processes; FEP 2.2.08, Hydrologic Processes; and FEP 2.2.09, Chemical Process—Transport, by studying near-field coupled THMC processes in clay/shale repositories. The activities documented in this report also address a number of research topics identified in the Research & Development (R&D) Plan for Used Fuel Disposition Campaign (UFDC) Natural System Evaluation and Tool Development (Wang, 2011), including Topics S3, Disposal system modeling – Natural System; P1, Development of discrete fracture network (DFN) model; P14, Technical basis for thermal loading limits; and P15, Modeling of disturbed rock zone (DRZ) evolution (clay repository).
Database Resources of the BIG Data Center in 2018.
2018-01-04
The BIG Data Center at the Beijing Institute of Genomics (BIG) of the Chinese Academy of Sciences provides free, open access to a suite of database resources in support of worldwide research activities in both academia and industry. With the vast amounts of omics data generated at ever-greater scales and rates, the BIG Data Center is continually expanding, updating and enriching its core database resources through big-data integration and value-added curation, including BioCode (a repository archiving bioinformatics tool codes), BioProject (a biological project library), BioSample (a biological sample library), Genome Sequence Archive (GSA, a data repository for archiving raw sequence reads), Genome Warehouse (GWH, a centralized resource housing genome-scale data), Genome Variation Map (GVM, a public repository of genome variations), Gene Expression Nebulas (GEN, a database of gene expression profiles based on RNA-Seq data), Methylation Bank (MethBank, an integrated databank of DNA methylomes), and Science Wikis (a series of biological knowledge wikis for community annotations). In addition, three featured web services are provided, viz., BIG Search (search as a service; a scalable inter-domain text search engine), BIG SSO (single sign-on as a service; a user access control system to gain access to multiple independent systems with a single ID and password) and Gsub (submission as a service; a unified submission service for all relevant resources). All of these resources are publicly accessible through the home page of the BIG Data Center at http://bigd.big.ac.cn. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Warehousing re-annotated cancer genes for biomarker meta-analysis.
Orsini, M; Travaglione, A; Capobianco, E
2013-07-01
Translational research in cancer genomics assigns a fundamental role to bioinformatics in support of candidate gene prioritization with regard to both biomarker discovery and target identification for drug development. Efforts in both directions rely on the existence and constant update of large repositories of gene expression data and omics records obtained from a variety of experiments. Users who interactively interrogate such repositories may have problems retrieving sample fields that carry limited associated information, due, for instance, to incomplete entries or sometimes unusable files. Cancer-specific data sources present similar problems. Given that source integration usually improves data quality, one objective is to keep the computational complexity sufficiently low to allow optimal assimilation and mining of all the information. In particular, the scope of integrating intraomics data can be to improve the exploration of gene co-expression landscapes, while the scope of integrating interomics sources can be to establish genotype-phenotype associations. Both integrations are relevant to cancer biomarker meta-analysis, as the proposed study demonstrates. Our approach is based on re-annotating cancer-specific data available at the EBI's ArrayExpress repository and building a data warehouse aimed at biomarker discovery and validation studies. Cancer genes are organized by tissue, with biomedical and clinical evidence combined to increase the reproducibility and consistency of results. For better comparative evaluation, multiple queries have been designed to efficiently address all types of experiments and platforms, and to allow for retrieval of sample-related information, such as cell line, disease state and clinical aspects. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Data repositories for medical education research: issues and recommendations.
Schwartz, Alan; Pappas, Cleo; Sandlow, Leslie J
2010-05-01
The authors explore issues surrounding digital repositories with the twofold intention of clarifying their creation, structure, content, and use, and of considering the implementation of a global digital repository for medical education research data sets: an online site where medical education researchers would be encouraged to deposit their data in order to facilitate its reuse and reanalysis by other researchers. By motivating data sharing and reuse, investigators, medical schools, and other stakeholders might see substantial benefits to their own endeavors and to the progress of the field of medical education. The authors review digital repositories in medicine, social sciences, and education; describe the contents and scope of repositories; and present extant examples. The authors describe the potential benefits of a medical education data repository and report the results of a survey of the Society of Directors of Research in Medical Education, in which participants responded to questions about data sharing and a potential data repository. Respondents strongly endorsed data sharing, with the caveat that principal investigators should choose whether or not to share the data they collect. A large majority believed that a repository would benefit their unit and the field of medical education. Few reported using existing repositories. Finally, the authors consider challenges to the establishment of such a repository, including taxonomic organization, intellectual property concerns, human subjects protection, technological infrastructure, and evaluation standards. The authors conclude with recommendations for how a medical education data repository could be successfully developed.
48 CFR 227.7207 - Contractor data repositories.
Code of Federal Regulations, 2010 CFR
2010-10-01
... repositories. 227.7207 Section 227.7207 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to...
75 FR 70310 - Sunshine Act Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-17
... Consumer Protection Act governing the security-based swap data repository registration process, the duties of such repositories, and the core principles applicable to such repositories. 4. The Commission will... security-based swap data repositories or the Commission and the public dissemination of security-based swap...
Relevance similarity: an alternative means to monitor information retrieval systems
Dong, Peng; Loh, Marie; Mondry, Adrian
2005-01-01
Background Relevance assessment is a major problem in the evaluation of information retrieval systems. The work presented here introduces a new parameter, "Relevance Similarity", for the measurement of the variation of relevance assessment. In a situation where individual assessment can be compared with a gold standard, this parameter is used to study the effect of such variation on the performance of a medical information retrieval system. In such a setting, Relevance Similarity is the ratio of assessors who rank a given document the same as the gold standard over the total number of assessors in the group. Methods The study was carried out on a collection of Critically Appraised Topics (CATs). Twelve volunteers were divided into two groups according to their domain knowledge. They assessed the relevance of retrieved topics obtained by querying a meta-search engine with ten keywords related to medical science. Their assessments were compared to the gold standard assessment, and Relevance Similarities were calculated as the ratio of positive concordance with the gold standard for each topic. Results The similarity comparison among groups showed that a higher degree of agreement exists among evaluators with more subject knowledge. The performance of the retrieval system was not significantly different as a result of the variations in relevance assessment in this particular query set. Conclusion In assessment situations where evaluators can be compared to a gold standard, Relevance Similarity provides an alternative evaluation technique to the commonly used kappa scores, which may give paradoxically low scores in highly biased situations such as document repositories containing large quantities of relevant data. PMID:16029513
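The Relevance Similarity measure defined in this abstract reduces to simple per-document arithmetic: the fraction of assessors whose judgment matches the gold standard. The sketch below illustrates that calculation; the function and variable names are illustrative, not taken from the paper.

```python
def relevance_similarity(assessments, gold):
    """Compute Relevance Similarity per document.

    assessments: list of dicts, one per assessor, mapping doc_id -> judgment.
    gold: dict mapping doc_id -> gold-standard judgment.
    Returns a dict mapping doc_id -> ratio of assessors agreeing with gold.
    """
    scores = {}
    for doc_id, gold_judgment in gold.items():
        # Count assessors whose judgment for this document matches the gold standard.
        agree = sum(1 for a in assessments if a.get(doc_id) == gold_judgment)
        scores[doc_id] = agree / len(assessments)
    return scores


# Example: four assessors judge two documents as relevant (1) or not (0).
assessors = [
    {"d1": 1, "d2": 0},
    {"d1": 1, "d2": 1},
    {"d1": 1, "d2": 0},
    {"d1": 0, "d2": 0},
]
gold = {"d1": 1, "d2": 0}
print(relevance_similarity(assessors, gold))  # {'d1': 0.75, 'd2': 0.75}
```

A value of 1.0 means unanimous agreement with the gold standard; lower values quantify the assessment variation the paper studies.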
Core Certification of Data Repositories: Trustworthiness and Long-Term Stewardship
NASA Astrophysics Data System (ADS)
de Sherbinin, A. M.; Mokrane, M.; Hugo, W.; Sorvari, S.; Harrison, S.
2017-12-01
Scientific integrity and norms dictate that data created and used by scientists should be managed, curated, and archived in trustworthy data repositories, thus ensuring that science is verifiable and reproducible while preserving the initial investment in collecting data. Research stakeholders, including researchers, science funders, librarians, and publishers, must also be able to establish the trustworthiness of the data repositories they use, to confirm that the data they submit and use remain useful and meaningful in the long term. Data repositories are increasingly recognized as a key element of the global research infrastructure, and establishing their trustworthiness is recognized as a prerequisite for efficient scientific research and data sharing. The Core Trustworthy Data Repository Requirements are a set of universal requirements for certification of data repositories at the core level (see: https://goo.gl/PYsygW). They were developed by the ICSU World Data System (WDS: www.icsu-wds.org) and the Data Seal of Approval (DSA: www.datasealofapproval.org), the two authoritative organizations responsible for the development and implementation of this standard, which will be further developed under the CoreTrustSeal brand. CoreTrustSeal certification of data repositories involves a minimally intensive process whereby repositories supply evidence that they are sustainable and trustworthy. Repositories conduct a self-assessment, which is then reviewed by community peers. Based on this review, CoreTrustSeal certification is granted by the CoreTrustSeal Standards and Certification Board. Certification helps data communities—producers, repositories, and consumers—to improve the quality and transparency of their processes, and to increase awareness of and compliance with established standards.
This presentation will introduce the CoreTrustSeal certification requirements for repositories and offer an opportunity to discuss ways to improve the contribution of certified data repositories to sustain open data for open scientific research.
Prieto, Gorka; Fullaondo, Asier; Rodríguez, Jose A.
2016-01-01
Large-scale sequencing projects are uncovering a growing number of missense mutations in human tumors. Understanding the phenotypic consequences of these alterations represents a formidable challenge. In silico prediction of functionally relevant amino acid motifs disrupted by cancer mutations could provide insight into the potential impact of a mutation, and guide functional tests. We have previously described Wregex, a tool for the identification of potential functional motifs, such as nuclear export signals (NESs), in proteins. Here, we present an improved version that allows motif prediction to be combined with data from large repositories, such as the Catalogue of Somatic Mutations in Cancer (COSMIC), and to be applied to a whole proteome scale. As an example, we have searched the human proteome for candidate NES motifs that could be altered by cancer-related mutations included in the COSMIC database. A subset of the candidate NESs identified was experimentally tested using an in vivo nuclear export assay. A significant proportion of the selected motifs exhibited nuclear export activity, which was abrogated by the COSMIC mutations. In addition, our search identified a cancer mutation that inactivates the NES of the human deubiquitinase USP21, and leads to the aberrant accumulation of this protein in the nucleus. PMID:27174732
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hicks, R.J.; Van Voris, P.
1988-02-01
The primary objective was to review and evaluate the relevance and quality of existing xenobiotic data bases and test methods for evaluating: direct and indirect effects (both adverse and beneficial) of xenobiotics on the soil microbial community; direct and indirect effects of the soil microbial community on xenobiotics; and the adequacy of test methods used to evaluate these effects and interactions. Xenobiotic chemicals are defined here as those compounds, both organic and inorganic, produced by man and introduced into the environment at concentrations that cause undesirable effects. Because soil serves as the main repository for many of these chemicals, it therefore has a major role in determining their ultimate fate. Once released, the distribution of xenobiotics between environmental compartments depends on the chemodynamic properties of the compounds, the physicochemical properties of the soils, and the transfer between soil-water and soil-air interfaces and across biological membranes. Abiotic and biotic processes can transform the chemical compound, thus altering its chemical state and, subsequently, its toxicity and reactivity. Ideally, the conversion is to carbon dioxide, water, and mineral elements, or at least to some harmless substance. However, intermediate transformation products, which can become toxic pollutants in their own right, can sometimes be formed. 139 refs., 6 figs., 11 tabs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valvoda, Z.; Holub, J.; Kucerka, M.
1996-12-31
In 1993, the Program of Development of the Spent Fuel and High-Level Waste Repository in the Conditions of the Czech Republic began. During the first phase, the basic concept and structure of the Program were developed, and the basic design criteria and requirements were prepared. Under the conditions of the Czech Republic, only an underground repository in a deep geological formation is acceptable. The expected depth is between 500 and 1000 meters, and the host rock will be granite. A preliminary variant design study, completed in 1994, analyzed the radioactive waste and spent fuel flow from NPPs to the repository and various transportation possibilities corresponding to the various concepts of spent fuel conditioning and transportation to the underground structures. Conditioning and encapsulation of spent fuel and/or radioactive waste are proposed at the repository site. Underground disposal structures are proposed on a single underground level. The repository will have reserve capacity for radioactive waste from NPP decommissioning and for waste not acceptable at other repositories. Vertical disposal of unshielded canisters in boreholes and/or horizontal disposal of shielded canisters is being studied. The year 2035 has been established as the baseline date for the start of repository operation, and from this date a preliminary time schedule of the Project has been developed. A method of calculating levelized and discounted costs over the repository lifetime was used for the economic calculations for each of the five selected variants. Preliminary expected parametric costs of the repository are about 0.1 Kc ($0.004) per MWh produced in the Czech NPPs. In 1995, the design and feasibility study addressed in more detail the technical concept of repository construction and the proposed technologies, as well as the operational phase of the repository.
The paper describes the results of the 1995 design work and presents the program of repository development for the next period.
17 CFR 49.26 - Disclosure requirements of swap data repositories.
Code of Federal Regulations, 2014 CFR
2014-04-01
... data repository's policies and procedures reasonably designed to protect the privacy of any and all... swap data repositories. 49.26 Section 49.26 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION (CONTINUED) SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data...
Making research data repositories visible: the re3data.org Registry.
Pampel, Heinz; Vierkant, Paul; Scholze, Frank; Bertelmann, Roland; Kindling, Maxi; Klump, Jens; Goebelbecker, Hans-Jürgen; Gundlach, Jens; Schirmbacher, Peter; Dierolf, Uwe
2013-01-01
Researchers require infrastructures that ensure a maximum of accessibility, stability and reliability to facilitate working with and sharing of research data. Such infrastructures are increasingly summarized under the term Research Data Repositories (RDR). The project re3data.org (Registry of Research Data Repositories) began indexing research data repositories in 2012 and offers researchers, funding organizations, libraries and publishers an overview of the heterogeneous research data repository landscape. As of July 2013, re3data.org lists 400 research data repositories and counting; 288 of these are described in detail using the re3data.org vocabulary. Information icons help researchers to easily identify an adequate repository for the storage and reuse of their data. This article describes the heterogeneous RDR landscape and presents a typology of institutional, disciplinary, multidisciplinary and project-specific RDRs. Further, the article outlines the features of re3data.org and shows how this registry helps to identify appropriate repositories for the storage and search of research data.
Ferguson, Adam R.; Popovich, Phillip G.; Xu, Xiao-Ming; Snow, Diane M.; Igarashi, Michihiro; Beattie, Christine E.; Bixby, John L.
2014-01-01
The lack of reproducibility in many areas of experimental science has a number of causes, including a lack of transparency and precision in the description of experimental approaches. This has far-reaching consequences, including wasted resources and slowed progress. Additionally, the large number of laboratories around the world publishing articles on a given topic makes it difficult, if not impossible, for individual researchers to read all of the relevant literature. Consequently, centralized databases are needed to facilitate the generation of new hypotheses for testing. One strategy to improve transparency in experimental description, and to allow the development of frameworks for computer-readable knowledge repositories, is the adoption of uniform reporting standards, such as common data elements (data elements used in multiple clinical studies) and minimum information standards. This article describes a minimum information standard for spinal cord injury (SCI) experiments, its major elements, and the approaches used to develop it. Transparent reporting standards for experiments using animal models of human SCI aim to reduce inherent bias and increase experimental value. PMID:24870067
10 CFR 60.15 - Site characterization.
Code of Federal Regulations, 2014 CFR
2014-01-01
... in situ testing before and during construction shall be planned and coordinated with geologic... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses... be described in such application. (b) Unless the Commission determines with respect to the site...
10 CFR 60.15 - Site characterization.
Code of Federal Regulations, 2013 CFR
2013-01-01
... in situ testing before and during construction shall be planned and coordinated with geologic... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses... be described in such application. (b) Unless the Commission determines with respect to the site...
10 CFR 60.15 - Site characterization.
Code of Federal Regulations, 2012 CFR
2012-01-01
... in situ testing before and during construction shall be planned and coordinated with geologic... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses... be described in such application. (b) Unless the Commission determines with respect to the site...
10 CFR 60.15 - Site characterization.
Code of Federal Regulations, 2011 CFR
2011-01-01
... in situ testing before and during construction shall be planned and coordinated with geologic... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses... be described in such application. (b) Unless the Commission determines with respect to the site...
17 CFR 49.19 - Core principles applicable to registered swap data repositories.
Code of Federal Regulations, 2014 CFR
2014-04-01
... registered swap data repositories. 49.19 Section 49.19 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION (CONTINUED) SWAP DATA REPOSITORIES § 49.19 Core principles applicable to registered swap data repositories. (a) Compliance with core principles. To be registered, and maintain...
17 CFR 49.26 - Disclosure requirements of swap data repositories.
Code of Federal Regulations, 2013 CFR
2013-04-01
... TRADING COMMISSION SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data repositories... swap data repository shall furnish to the reporting entity a disclosure document that contains the... 17 Commodity and Securities Exchanges 1 2013-04-01 2013-04-01 false Disclosure requirements of...
17 CFR 49.26 - Disclosure requirements of swap data repositories.
Code of Federal Regulations, 2012 CFR
2012-04-01
... TRADING COMMISSION SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data repositories... swap data repository shall furnish to the reporting entity a disclosure document that contains the... 17 Commodity and Securities Exchanges 1 2012-04-01 2012-04-01 false Disclosure requirements of...
21 CFR 522.480 - Repository corticotropin injection.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 6 2010-04-01 2010-04-01 false Repository corticotropin injection. 522.480 Section 522.480 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES... § 522.480 Repository corticotropin injection. (a)(1) Specifications. The drug conforms to repository...
10 CFR 960.3-1-3 - Regionality.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Implementation Guidelines § 960.3-1-3 Regionality. In making site recommendations for repository development after the site for the first repository has been recommended, the Secretary shall give due... repositories. Such consideration shall take into account the proximity of sites to locations at which waste is...
Hydrologic testing of tight zones in southeastern New Mexico.
Dennehy, K.F.; Davis, P.A.
1981-01-01
Increased attention is being directed toward the investigation of tight zones in relation to the storage and disposal of hazardous wastes. Shut-in tests, slug tests, and pressure-slug tests are being used at the proposed Waste Isolation Pilot Plant site, New Mexico, to evaluate the fluid-transmitting properties of several zones above the proposed repository zone. All three testing methods were used in various combinations to obtain values for the hydraulic properties of the test zones. Multiple testing on the same zone produced similar results. -from Authors
Immobilization of Technetium in a Metallic Waste Form
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.M. Frank; D. D. Keiser, Jr.; K. C. Marsden
Fission-product technetium accumulated during the treatment of spent nuclear fuel will ultimately be disposed of in a geological repository. The exact form of Tc for disposal has yet to be determined; however, a reasonable solution is to incorporate elemental Tc into a metallic waste form similar to that produced during the pyrochemical treatment of spent, sodium-bonded fuel. This metal waste form, produced at the Idaho National Laboratory, has undergone extensive qualification examination and testing for acceptance at the Yucca Mountain geological repository. It is through this extensive qualification effort that the behavior of Tc and other fission products in the waste form has been elucidated, showing that the metal waste form is extremely robust in its retention of fission products, such as Tc, under repository-like conditions. This manuscript describes the metal waste form and the behavior of Tc within it, along with current research aimed at determining the maximum possible loading of Tc in the metal waste and the subsequent performance of high-Tc-loaded metal waste forms.
Nuclear Waste Facing the Test of Time: The Case of the French Deep Geological Repository Project.
Poirot-Delpech, Sophie; Raineau, Laurence
2016-12-01
The purpose of this article is to consider the socio-anthropological issues raised by the deep geological repository project for high-level, long-lived nuclear waste. It is based on fieldwork at a candidate site for a deep storage project in eastern France, where an underground laboratory has been studying the feasibility of the project since 1999. A project of this nature, based on the possibility of very long containment (hundreds of thousands of years, if not longer), involves a singular form of time. By linking project performance to geology's very long timescale, the project attempts to "jump" in time, focusing on a far-distant future without understanding it in terms of generations. But these future generations remain measures of time on the surface, where the issue of remembering or forgetting the repository comes to the fore. The nuclear waste geological storage project raises questions that neither politicians nor scientists, nor civil society, have ever confronted before. This project attempts to address a problem that exists on a very long timescale, one that involves our responsibility toward generations in the far future.
State Assessment Program Item Banks: Model Language for Request for Proposals (RFP) and Contracts
ERIC Educational Resources Information Center
Swanson, Leonard C.
2010-01-01
This document provides recommendations for request for proposal (RFP) and contract language that state education agencies can use to specify their requirements for access to test item banks. An item bank is a repository for test items and data about those items. Item banks are used by state agency staff to view items and associated data; to…
17 CFR 49.19 - Core principles applicable to registered swap data repositories.
Code of Federal Regulations, 2013 CFR
2013-04-01
... registered swap data repositories. 49.19 Section 49.19 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP DATA REPOSITORIES § 49.19 Core principles applicable to registered swap data repositories. (a) Compliance with core principles. To be registered, and maintain registration, a swap data...
17 CFR 49.19 - Core principles applicable to registered swap data repositories.
Code of Federal Regulations, 2012 CFR
2012-04-01
... registered swap data repositories. 49.19 Section 49.19 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP DATA REPOSITORIES § 49.19 Core principles applicable to registered swap data repositories. (a) Compliance with Core Principles. To be registered, and maintain registration, a swap data...
17 CFR 49.22 - Chief compliance officer.
Code of Federal Regulations, 2014 CFR
2014-04-01
... that the registered swap data repository provide fair and open access as set forth in § 49.27 of this...) SWAP DATA REPOSITORIES § 49.22 Chief compliance officer. (a) Definition of Board of Directors. For... data repository, or for those swap data repositories whose organizational structure does not include a...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-25
... Construction of a Waste Repository on the Settlors' Property Pursuant to the Comprehensive Environmental... a Settlement Agreement pertaining to Construction of a Waste Repository on Settlor's Property... waste repository on the property by resolving, liability the settling party might otherwise incur under...
10 CFR 51.67 - Environmental information concerning geologic repositories.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Environmental information concerning geologic repositories... information concerning geologic repositories. (a) In lieu of an environmental report, the Department of Energy... connection with any geologic repository developed under Subtitle A of Title I, or under Title IV, of the...
15 CFR 1180.10 - NTIS permanent repository.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false NTIS permanent repository. 1180.10... ENGINEERING INFORMATION TO THE NATIONAL TECHNICAL INFORMATION SERVICE § 1180.10 NTIS permanent repository. A... repository as a service to agencies unless the Director advises the Liaison Officer that it has not been so...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
...] Center for Devices and Radiological Health 510(k) Implementation: Online Repository of Medical Device... public meeting entitled ``510(k) Implementation: Discussion of an Online Repository of Medical Device... establish an online public repository of medical device labeling and strategies for displaying device...
Identifying Tensions in the Use of Open Licenses in OER Repositories
ERIC Educational Resources Information Center
Amiel, Tel; Soares, Tiago Chagas
2016-01-01
We present an analysis of 50 repositories for educational content conducted through an "audit system" that helped us classify these repositories, their software systems, promoters, and how they communicated their licensing practices. We randomly accessed five resources from each repository to investigate the alignment of licensing…
10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Performance objectives for the geologic repository after...-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Technical Criteria Postclosure Performance Objectives § 63.113 Performance objectives for the geologic repository after permanent...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Performance objectives for the geologic repository... (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA... repository operations area through permanent closure. (a) Protection against radiation exposures and releases...
10 CFR 960.3-1 - Siting provisions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Implementation Guidelines § 960.3-1 Siting provisions. The siting provisions establish the... repositories. As required by the Act, § 960.3-1-3 specifies consideration of a regional distribution of repositories after recommendation of a site for development of the first repository. Section 960.3-1-4...
Code of Federal Regulations, 2010 CFR
2010-01-01
... repository after permanent closure. 60.112 Section 60.112 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Performance Objectives § 60.112 Overall system performance objective for the geologic repository after permanent closure...
Code of Federal Regulations, 2010 CFR
2010-01-01
... geologic repository operations area. 60.132 Section 60.132 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository Operations Area § 60.132 Additional design criteria for surface facilities in...
10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Performance of the geologic repository operations area... OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Performance Objectives § 60.111 Performance of the geologic repository operations area through permanent closure. (a...
Institutional Repositories as Infrastructures for Long-Term Preservation
ERIC Educational Resources Information Center
Francke, Helena; Gamalielsson, Jonas; Lundell, Björn
2017-01-01
Introduction: The study describes the conditions for long-term preservation of the content of the institutional repositories of Swedish higher education institutions based on an investigation of how deposited files are managed with regards to file format and how representatives of the repositories describe the functions of the repositories.…
Personal Name Identification in the Practice of Digital Repositories
ERIC Educational Resources Information Center
Xia, Jingfeng
2006-01-01
Purpose: To propose improvements to the identification of authors' names in digital repositories. Design/methodology/approach: Analysis of current name authorities in digital resources, particularly in digital repositories, and analysis of some features of existing repository applications. Findings: This paper finds that the variations of authors'…
Evaluation of Used Fuel Disposition in Clay-Bearing Rock
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jové Colón, Carlos F.; Weck, Philippe F.; Sassani, David H.
2014-08-01
Radioactive waste disposal in shale/argillite rock formations has been widely considered given their desirable isolation properties (low permeability), geochemically reduced conditions, anomalous groundwater pressures, and widespread geologic occurrence. Clay/shale rock formations are characterized by their high content of clay minerals such as smectites and illites, where diffusive transport and chemisorption phenomena predominate. These, in addition to low permeability, are key attributes of shale that impede radionuclide mobility. Shale host media have been comprehensively studied in international nuclear waste repository programs as part of underground research laboratory (URL) programs in Switzerland, France, Belgium, and Japan. These investigations, in some cases a decade or more long, have produced a large but fundamental body of information spanning from site characterization data (geological, hydrogeological, geochemical, geomechanical) to controlled experiments on the engineered barrier system (EBS) (barrier clay and seals materials). Evaluation of nuclear waste disposal in shale formations in the USA was conducted in the late 1970s and mid-1980s. Most of these studies evaluated the potential for shale to host a nuclear waste repository, but not at the programmatic level of URLs in international repository programs. This report covers various R&D work and capabilities relevant to disposal of heat-generating nuclear waste in shale/argillite media. Integration and cross-fertilization of these capabilities will be utilized in the development and implementation of the shale/argillite reference case planned for FY15. Disposal R&D activities under the UFDC in the past few years have produced state-of-the-art modeling capabilities for coupled Thermal-Hydrological-Mechanical-Chemical (THMC) processes, used fuel degradation (source term), and thermodynamic modeling and database development to evaluate generic disposal concepts.
The THMC models have been developed for a shale repository, leveraging in large part the information garnered in URLs and laboratory data to test and demonstrate model prediction capability and to accurately represent the behavior of the EBS and the natural (barrier) system (NS). In addition, experimental work to improve our understanding of clay barrier interactions and TM couplings at high temperatures is key to evaluating thermal effects resulting from relatively high heat loads from waste and the extent of sacrificial zones in the EBS. To assess the latter, experiments and modeling approaches have provided important information on the stability and fate of barrier materials under high heat loads. This information is central to the assessment of thermal limits and the implementation of the reference case when constraining EBS properties and the repository layout (e.g., waste package and drift spacing). This report comprises various parts, each describing R&D activities applicable to shale/argillite media, including progress made on modeling and experimental approaches to analyze physical and chemical interactions affecting clay in the EBS, the NS, and used nuclear fuel (source term) in support of R&D objectives. It also describes the development of a reference case for shale/argillite media. The accomplishments of these activities are summarized as follows: Development of a reference case for shale/argillite; Investigation of Reactive Transport and Coupled THM Processes in EBS: FY14; Update on Experimental Activities on Buffer/Backfill Interactions at Elevated Pressure and Temperature; Thermodynamic Database Development: Evaluation Strategy, Modeling Tools, First-Principles Modeling of Clay, and Sorption Database Assessment; and ANL Mixed Potential Model for Used Fuel Degradation: Application to Argillite and Crystalline Rock Environments.
Building locally relevant ethics curricula for nursing education in Botswana.
Barchi, F; Kasimatis Singleton, M; Magama, M; Shaibu, S
2014-12-01
The goal of this multi-institutional collaboration was to develop an innovative, locally relevant ethics curriculum for nurses in Botswana. Nurses in Botswana face ethical challenges that are compounded by lack of resources, pressures to handle tasks beyond training or professional levels, workplace stress and professional isolation. Capacity to teach nursing ethics in the classroom and in professional practice settings has been limited. A pilot curriculum, including cases set in local contexts, was tested with nursing faculty in Botswana in 2012. Thirty-three per cent of the faculty members indicated they would be more comfortable teaching ethics. A substantial number of faculty members were more likely to introduce the International Council of Nurses Code of Ethics in teaching, practice and mentoring as a result of the training. Based on evaluation data, curricular materials were developed using the Code and the regulatory requirements for nursing practice in Botswana. A web-based repository of sample lectures, discussion cases and evaluation rubrics was created to support the use of the materials. A new master degree course, Nursing Ethics in Practice, has been proposed for fall 2015 at the University of Botswana. The modular nature of the materials and the availability of cases set within the context of clinical nurse practice in Botswana make them readily adaptable to various student academic levels and continuing professional development programmes. The ICN Code of Ethics for Nursing is a valuable teaching tool in developing countries when taught using locally relevant case materials and problem-based teaching methods. The approach used in the development of a locally relevant nursing ethics curriculum in Botswana can serve as a model for nursing education and continuing professional development programmes in other sub-Saharan African countries to enhance use of the ICN Code of Ethics in nursing practice. © 2014 International Council of Nurses.
System and method for responding to ground and flight system malfunctions
NASA Technical Reports Server (NTRS)
Anderson, Julie J. (Inventor); Fussell, Ronald M. (Inventor)
2010-01-01
A system for on-board anomaly resolution for a vehicle has a data repository. The data repository stores data related to different systems, subsystems, and components of the vehicle. The data stored is encoded in a tree-based structure. A query engine is coupled to the data repository. The query engine provides a user and automated interface and provides contextual query to the data repository. An inference engine is coupled to the query engine. The inference engine compares current anomaly data to contextual data stored in the data repository using inference rules. The inference engine generates a potential solution to the current anomaly by referencing the data stored in the data repository.
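The architecture described in this abstract (a tree-encoded data repository, a contextual query engine, and a rule-based inference engine) can be sketched in a few lines. This is a minimal, hypothetical illustration, not the patented implementation: all class names, the path syntax, and the example rule are assumptions introduced here for clarity.

```python
# Hypothetical sketch of the abstract's architecture: a tree-encoded data
# repository, a query engine, and a rule-based inference engine.
# Names, rules, and values are illustrative, not from the patent.

class Node:
    """A vehicle system/subsystem/component in the tree-based repository."""
    def __init__(self, name, data=None):
        self.name = name
        self.data = data or {}
        self.children = {}

    def add(self, child):
        self.children[child.name] = child
        return child

class QueryEngine:
    """Resolves slash-separated contextual paths against the repository tree."""
    def __init__(self, root):
        self.root = root

    def query(self, path):
        node = self.root
        for part in path.split("/"):
            node = node.children[part]
        return node.data

class InferenceEngine:
    """Compares current anomaly data to stored context using inference rules."""
    def __init__(self, query_engine, rules):
        self.q = query_engine
        self.rules = rules  # list of (path, key, predicate, solution)

    def resolve(self, anomaly):
        for path, key, predicate, solution in self.rules:
            context = self.q.query(path)
            if predicate(anomaly.get(key), context.get(key)):
                return solution
        return "no matching rule"

# Build a tiny repository and resolve one anomaly.
root = Node("vehicle")
root.add(Node("power", {"bus_voltage": 28.0}))  # nominal value in repository
engine = InferenceEngine(
    QueryEngine(root),
    [("power", "bus_voltage",
      lambda cur, nominal: cur is not None and cur < nominal * 0.9,
      "switch to backup power bus")],
)
print(engine.resolve({"bus_voltage": 24.0}))  # current reading below 90% of nominal
```

The key design point the abstract suggests is the separation of concerns: the repository only stores tree-structured data, the query engine only navigates it, and the inference engine only matches anomalies against contextual values, so rules can reference any system or subsystem by path.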
M4SF-17LL010302072: The Roles of Diffusion and Corrosion in Radionuclide Retardation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zavarin, Mavrik; Balboni, E.; Atkins-Duffin, Cindy
This progress report (Level 4 Milestone Number M4SF-17LL010302072) summarizes research conducted at Lawrence Livermore National Laboratory (LLNL) within the Crystalline Disposal R&D Activity Number M4SF-17LL01030207 and Crystalline International Collaborations Activity Number M4SF-17LL01030208. The focus of this research is the interaction of radionuclides with Engineered Barrier System (EBS) and host rock materials at various physicochemical conditions relevant to subsurface repository environments. They include both chemical and physical processes such as solubility, sorption, and diffusion.
NASA Astrophysics Data System (ADS)
Wyborn, Lesley; Evans, Ben
2016-04-01
Collecting data for the Earth Sciences has a particularly long history, going back centuries. Initially, scientific data came only from simple human observations recorded by pen on paper. Scientific instruments soon supplemented data capture, and as these instruments became more capable (e.g., automation, more information captured, generation of digitally-born outputs), Earth Scientists entered the 'Big Data' era, where data progressively became too big to store and process locally in the old-style vaults. To date, most funding initiatives for collection and storage of large-volume data sets in the Earth Sciences have been specialised within a single discipline (e.g., climate, geophysics, and Earth Observation) or specific to an individual institution. To undertake interdisciplinary research, it is hard for users to integrate data from these individual repositories, mainly due to limitations on physical access to/movement of the data, and/or data being organised without enough information to make sense of it without specialised disciplinary knowledge. Smaller repositories have also gradually been seen as inefficient in terms of the cost to manage and access (including scarce skills) and of effective implementation of new technology and techniques. Within the last decade, the trend is towards fewer and larger data repositories that are increasingly collocated with HPC/cloud resources. There has also been a growing recognition that digital data can be a valuable resource that can be reused and repurposed: publicly funded data from either the academic or government sector is seen as a shared resource, and efficiencies can be gained by co-location. These new, highly capable, 'transdisciplinary' data repositories are emerging as a fundamental 'infrastructure' both for research and for other innovation. 
The sharing of academic and government data resources on the same infrastructures is enabling new research programmes that will allow integration beyond the traditional physical scientific domain silos, including into the humanities and social sciences. Furthermore, there is increasing desire for these 'Big Data' infrastructures to prove their value not only as platforms for scientific discovery, but also to support the development of evidence-based government policies, economic growth, and private-sector opportunities. The capacity of these transdisciplinary data repositories leads to many exciting opportunities for the next generation of large-scale data integration, but there is an emerging suite of data challenges that now need to be tackled. Many large-volume data sets have historically been developed within traditional domain silos, and issues such as differences of standards (informal and formal), data conventions, the lack of controlled or even uniform vocabularies, non-existent or not machine-accessible semantic information, and bespoke or unclear copyrights and licensing are becoming apparent. The different perspectives and approaches of the various communities have also started to come to the fore; particularly the dominant file-based approach of the big-data-generating science communities versus the database approach of the point-observational communities, and the multidimensional approach of the climate and oceans community versus the traditional 2D approach of the GIS/spatial community. Addressing such challenges is essential to fully unlock online access to all relevant data and to enable the maturing of research to the transdisciplinary paradigm.
SEDIMENT ASSESSMENT WITH THE BIVALVE MULINIA LATERALIS: MAXIMIZING TEST ORGANISM PROTECTION
Estuarine and marine sediments are a major repository for many of the more persistent chemicals introduced into surface waters. Approaches used by USEPA to identify a national inventory of contaminated sediment sites include, among other tools, whole-sediment toxicity (presently ...
10 CFR 60.140 - General requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.140 - General requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 60.140 - General requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 63.131 - General requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA... conditions encountered and changes in those conditions during construction and waste emplacement operations...
10 CFR 60.140 - General requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 63.131 - General requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA... conditions encountered and changes in those conditions during construction and waste emplacement operations...
10 CFR 63.131 - General requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA... conditions encountered and changes in those conditions during construction and waste emplacement operations...
10 CFR 63.131 - General requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA... conditions encountered and changes in those conditions during construction and waste emplacement operations...
10 CFR 60.140 - General requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...
10 CFR 63.131 - General requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA... conditions encountered and changes in those conditions during construction and waste emplacement operations...
17 CFR 49.9 - Duties of registered swap data repositories.
Code of Federal Regulations, 2014 CFR
2014-04-01
... privacy of any and all swap data and any other related information that the swap data repository receives... 17 Commodity and Securities Exchanges 2 2014-04-01 2014-04-01 false Duties of registered swap data... (CONTINUED) SWAP DATA REPOSITORIES § 49.9 Duties of registered swap data repositories. (a) Duties. To be...
Availability and Accessibility in an Open Access Institutional Repository: A Case Study
ERIC Educational Resources Information Center
Lee, Jongwook; Burnett, Gary; Vandegrift, Micah; Baeg, Jung Hoon; Morris, Richard
2015-01-01
Introduction: This study explores the extent to which an institutional repository makes papers available and accessible on the open Web by using 170 journal articles housed in DigiNole Commons, the institutional repository at Florida State University. Method: To analyse the repository's impact on availability and accessibility, we conducted…
The Use of Digital Repositories for Enhancing Teacher Pedagogical Performance
ERIC Educational Resources Information Center
Cohen, Anat; Kalimi, Sharon; Nachmias, Rafi
2013-01-01
This research examines the usage of local learning material repositories at school, as well as related teachers' attitudes and training. The study investigates the use of these repositories for enhancing teacher performance and assesses whether the assimilation of the local repositories increases their usage of and contribution to by teachers. One…
Institutional Repositories in Indian Universities and Research Institutes: A Study
ERIC Educational Resources Information Center
Krishnamurthy, M.; Kemparaju, T. D.
2011-01-01
Purpose: The purpose of this paper is to report on a study of the institutional repositories (IRs) in use in Indian universities and research institutes. Design/methodology/approach: Repositories in various institutions in India were accessed and described in a standardised way. Findings: The 20 repositories studied covered collections of diverse…
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Emergency plan for the geologic repository operations area... OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Emergency Planning Criteria § 63.161 Emergency plan for the geologic repository operations area through permanent...
77 FR 26709 - Swap Data Repositories: Interpretative Statement Regarding the Confidentiality and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-07
... COMMODITY FUTURES TRADING COMMISSION 17 CFR Part 49 RIN 3038-AD83 Swap Data Repositories... data repositories (``SDRs''). SDRs are new registered entities created by section 728 of the Dodd-Frank... Act amends section 1a of the CEA to add a definition of the term ``swap data repository.'' Pursuant to...
Online Paper Repositories and the Role of Scholarly Societies: An AERA Conference Report
ERIC Educational Resources Information Center
Educational Researcher, 2010
2010-01-01
This article examines issues faced by scholarly societies that are developing and sustaining online paper repositories. It is based on the AERA Conference on Online Paper Repositories, which focused on fundamental issues of policy and procedure important to the operations of online working paper repositories. The report and recommendations address…
10 CFR 960.3-2-2 - Nomination of sites as suitable for characterization.
Code of Federal Regulations, 2010 CFR
2010-01-01
... OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-2 Nomination of... of each repository site. For the second repository, at least three of the sites shall not have been nominated previously. Any site nominated as suitable for characterization for the first repository, but not...
Repositories for Research: Southampton's Evolving Role in the Knowledge Cycle
ERIC Educational Resources Information Center
Simpson, Pauline; Hey, Jessie
2006-01-01
Purpose: To provide an overview of how open access (OA) repositories have grown to take a premier place in the e-research knowledge cycle and offer Southampton's route from project to sustainable institutional repository. Design/methodology/approach: The evolution of institutional repositories and OA is outlined raising questions of multiplicity…
Criteria for the evaluation and certification of long-term digital archives in the earth sciences
NASA Astrophysics Data System (ADS)
Klump, Jens
2010-05-01
Digital information has become an indispensable part of our cultural and scientific heritage. Scientific findings, historical documents and cultural achievements are to a rapidly increasing extent being presented in electronic form - in many cases exclusively so. However, besides the invaluable advantages offered by this form, it also carries a serious disadvantage: users need to invest a great deal of technical effort in accessing the information. Also, the underlying technology is still undergoing further development at an exceptionally fast pace. The rapid obsolescence of the technology required to read the information combined with the frequently imperceptible physical decay of the media themselves represents a serious threat to preservation of the information content. Many data sets in earth science research are from observations that cannot be repeated. This makes these digital assets particularly valuable. Therefore, these data should be kept and made available for re-use long after the end of the project from which they originated. Since research projects only run for a relatively short period of time, it is advisable to shift the burden of responsibility for long-term data curation from the individual researcher to a trusted data repository or archive. But what makes a trusted data repository? Each trusted digital repository has its own targets and specifications. The trustworthiness of digital repositories can be tested and assessed on the basis of a criteria catalogue. This is the main focus of the work of the nestor working group "Trusted repositories - Certification". It identifies criteria which permit the trustworthiness of a digital repository to be evaluated, both at the organisational and technical levels. The criteria are defined in close collaboration with a wide range of different memory organisations, producers of information, experts and other interested parties. 
This open approach ensures a high degree of universal validity, suitability for daily practical use, and broad-based acceptance of the results. The criteria catalogue is also intended to present the option of documenting trustworthiness by means of certification in a standardised national or international process. The criteria catalogue is based on the Reference Model for an Open Archival Information System (OAIS, ISO 14721:2003). With its broad approach, the nestor criteria catalogue for trusted digital repositories has to remain at a high level of abstraction. For application in the earth sciences, the evaluation criteria need to be transferred into the context of earth science data and their designated user community. This presentation offers a brief introduction to the problems surrounding the long-term preservation of digital objects, followed by a proposed application of the criteria catalogue for trusted digital repositories to the context of earth science data and their long-term preservation.
Interoperability Across the Stewardship Spectrum in the DataONE Repository Federation
NASA Astrophysics Data System (ADS)
Jones, M. B.; Vieglais, D.; Wilson, B. E.
2016-12-01
Thousands of earth and environmental science repositories serve many researchers and communities, each with their own community and legal mandates, sustainability models, and historical infrastructure. These repositories span the stewardship spectrum from highly curated collections that employ large numbers of staff members to review and improve data, to small, minimal budget repositories that accept data caveat emptor and where all responsibility for quality lies with the submitter. Each repository fills a niche, providing services that meet the stewardship tradeoffs of one or more communities. We have reviewed these stewardship tradeoffs for several DataONE member repositories ranging from minimally (KNB) to highly curated (Arctic Data Center), as well as general purpose (Dryad) to highly discipline or project specific (NEON). The rationale behind different levels of stewardship reflect resolution of these tradeoffs. Some repositories aim to encourage extensive uptake by keeping processes simple and minimizing the amount of information collected, but this limits the long-term utility of the data and the search, discovery, and integration systems that are possible. Other repositories require extensive metadata input, review, and assessment, allowing for excellent preservation, discovery, and integration but at the cost of significant time for submitters and expense for curatorial staff. DataONE recognizes these different levels of curation, and attempts to embrace them to create a federation that is useful across the stewardship spectrum. DataONE provides a tiered model for repositories with growing utility of DataONE services at higher tiers of curation. The lowest tier supports read-only access to data and requires little more than title and contact metadata. Repositories can gradually phase in support for higher levels of metadata and services as needed. 
These tiered capabilities are possible through flexible support for multiple metadata standards and services, where repositories can incrementally increase their requirements as they want to satisfy more use cases. Within DataONE, metadata search services support minimal metadata models, but significantly expanded precision and recall become possible when repositories provide more extensively curated metadata.
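The tiered model described in this record can be pictured as a progression of required metadata fields. The sketch below is purely illustrative and not DataONE's actual API or tier definitions: the tier names, required fields, and validation logic are assumptions that mirror the idea that the lowest tier needs little more than title and contact metadata, while higher tiers require richer curation.

```python
# Hypothetical tier definitions (assumed, not DataONE's): each tier
# requires all fields of the tiers below it plus additional metadata.
TIER_REQUIREMENTS = {
    1: ["title", "contact"],                      # minimal, read-only tier
    2: ["title", "contact", "abstract", "keywords"],
    3: ["title", "contact", "abstract", "keywords",
        "spatial_coverage", "temporal_coverage"],
}

def highest_tier(record):
    """Return the highest tier whose required fields are all present and non-empty."""
    best = 0
    for tier, fields in sorted(TIER_REQUIREMENTS.items()):
        if all(record.get(f) for f in fields):
            best = tier
    return best

record = {"title": "Soil moisture, 2015", "contact": "pi@example.org"}
print(highest_tier(record))  # a title+contact record satisfies only tier 1
```

A repository could phase in higher tiers simply by populating more fields over time, which is the incremental path the abstract describes.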
Optimizing Resources for Trustworthiness and Scientific Impact of Domain Repositories
NASA Astrophysics Data System (ADS)
Lehnert, K.
2017-12-01
Domain repositories, i.e. data archives tied to specific scientific communities, are widely recognized and trusted by their user communities for ensuring a high level of data quality, enhancing data value, access, and reuse through a unique combination of disciplinary and digital curation expertise. Their data services are guided by the practices and values of the specific community they serve and designed to support the advancement of their science. Domain repositories need to meet user expectations for scientific utility in order to be successful, but they also need to fulfill the requirements for trustworthy repository services to be acknowledged by scientists, funders, and publishers as a reliable facility that curates and preserves data following international standards. Domain repositories therefore need to carefully plan and balance investments to optimize the scientific impact of their data services and user satisfaction on the one hand, while maintaining a reliable and robust operation of the repository infrastructure on the other hand. Staying abreast of evolving repository standards to certify as a trustworthy repository and conducting a regular self-assessment and certification alone requires resources that compete with the demands for improving data holdings or usability of systems. The Interdisciplinary Earth Data Alliance (IEDA), a data facility funded by the US National Science Foundation, operates repositories for geochemical, marine geoscience, and Antarctic research data, while also maintaining data products (global syntheses) and data visualization and analysis tools that are of high value for the science community and have demonstrated considerable scientific impact. Balancing the investments in the growth and utility of the syntheses with resources required for certification of IEDA's repository services has been challenging, and a major self-assessment effort has been difficult to accommodate.
IEDA is exploring a partnership model to share generic repository functions (e.g. metadata registration, long-term archiving) with other repositories. This could substantially reduce the effort of certification and allow effort to focus on the domain-specific data curation and value-added services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadgu, Teklu; Karra, Satish; Kalinina, Elena
One of the major challenges of simulating flow and transport in the far field of a geologic repository in crystalline host rock is related to reproducing the properties of the fracture network over the large volume of rock with sparse fracture characterization data. Various approaches have been developed to simulate flow and transport through the fractured rock. The approaches can be broadly divided into Discrete Fracture Network (DFN) and Equivalent Continuum Model (ECM). The DFN explicitly represents individual fractures, while the ECM uses fracture properties to determine equivalent continuum parameters. In this paper, we compare DFN and ECM in terms of upscaled observed transport properties through generic fracture networks. The major effort was directed on making the DFN and ECM approaches similar in their conceptual representations. This allows for separating differences related to the interpretation of the test conditions and parameters from the differences between the DFN and ECM approaches. The two models are compared using a benchmark test problem that is constructed to represent the far field (1 × 1 × 1 km³) of a hypothetical repository in fractured crystalline rock. The test problem setting uses generic fracture properties that can be expected in crystalline rocks. The models are compared in terms of: 1) the effective permeability of the domain, and 2) nonreactive solute breakthrough curves through the domain. The principal differences between the models are mesh size, network connectivity, matrix diffusion and anisotropy. We demonstrate how these differences affect the flow and transport. Finally, we identify the factors that should be taken into consideration when selecting an approach most suitable for the site-specific conditions.
NASA Astrophysics Data System (ADS)
Hadgu, Teklu; Karra, Satish; Kalinina, Elena; Makedonska, Nataliia; Hyman, Jeffrey D.; Klise, Katherine; Viswanathan, Hari S.; Wang, Yifeng
2017-10-01
One of the major challenges of simulating flow and transport in the far field of a geologic repository in crystalline host rock is related to reproducing the properties of the fracture network over the large volume of rock with sparse fracture characterization data. Various approaches have been developed to simulate flow and transport through the fractured rock. The approaches can be broadly divided into Discrete Fracture Network (DFN) and Equivalent Continuum Model (ECM). The DFN explicitly represents individual fractures, while the ECM uses fracture properties to determine equivalent continuum parameters. We compare DFN and ECM in terms of upscaled observed transport properties through generic fracture networks. The major effort was directed on making the DFN and ECM approaches similar in their conceptual representations. This allows for separating differences related to the interpretation of the test conditions and parameters from the differences between the DFN and ECM approaches. The two models are compared using a benchmark test problem that is constructed to represent the far field (1 × 1 × 1 km³) of a hypothetical repository in fractured crystalline rock. The test problem setting uses generic fracture properties that can be expected in crystalline rocks. The models are compared in terms of: 1) the effective permeability of the domain, and 2) nonreactive solute breakthrough curves through the domain. The principal differences between the models are mesh size, network connectivity, matrix diffusion and anisotropy. We demonstrate how these differences affect the flow and transport. We identify the factors that should be taken into consideration when selecting an approach most suitable for the site-specific conditions.
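The ECM approach replaces explicit fractures with equivalent continuum parameters such as an effective permeability. A minimal, textbook-style sketch of one such upscaling step (not the paper's actual workflow) is the cubic-law equivalent permeability of a set of parallel plate-like fractures, k = b³/(12s), with aperture b and mean spacing s; the values below are illustrative assumptions, not data from the study.

```python
# Cubic-law equivalent continuum permeability for a single set of
# parallel fractures of aperture b (m) with mean spacing s (m):
# k = b^3 / (12 s), in m^2, for flow parallel to the fractures.
def equivalent_permeability(aperture_m, spacing_m):
    """Equivalent continuum permeability (m^2) for a parallel fracture set."""
    return aperture_m ** 3 / (12.0 * spacing_m)

b = 1e-4   # 100-micron aperture (assumed)
s = 10.0   # 10 m mean fracture spacing (assumed)
k = equivalent_permeability(b, s)
print(f"k = {k:.2e} m^2")  # on the order of 1e-14 m^2 for these inputs
```

Because permeability scales with the cube of the aperture, small aperture differences dominate the upscaled result, which is one reason DFN and ECM representations must be conceptually aligned before their transport predictions can be compared.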
Semantic Analysis of Email Using Domain Ontologies and WordNet
NASA Technical Reports Server (NTRS)
Berrios, Daniel C.; Keller, Richard M.
2005-01-01
The problem of capturing and accessing knowledge in paper form has been supplanted by a problem of providing structure to vast amounts of electronic information. Systems that can construct semantic links for natural language documents like email messages automatically will be a crucial element of semantic email tools. We have designed an information extraction process that can leverage the knowledge already contained in an existing semantic web, recognizing references in email to existing nodes in a network of ontology instances by using linguistic knowledge and knowledge of the structure of the semantic web. We developed a heuristic score that uses several forms of evidence to detect references in email to existing nodes in the SemanticOrganizer repository's network. While these scores cannot directly support automated probabilistic inference, they can be used to rank nodes by relevance and link those deemed most relevant to email messages.
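The kind of evidence-combining heuristic this abstract describes can be sketched as a weighted score over simple evidence sources, used only to rank candidate nodes. Everything below is a hypothetical illustration: the node names, weights, and scoring form are assumptions, not the authors' actual model or the SemanticOrganizer API.

```python
# Toy heuristic: score an ontology node against an email by combining
# two evidence sources -- overlap with the node's name tokens (weighted
# higher) and overlap with its property values -- then rank by score.
def score(email_tokens, node):
    """Weighted combination of name-match and property-match evidence."""
    name_hits = sum(t in email_tokens for t in node["name"].lower().split())
    prop_hits = len(email_tokens & set(node["properties"]))
    return 2.0 * name_hits + 1.0 * prop_hits  # weights are assumptions

email = set("the wind tunnel test for project alpha is delayed".split())
nodes = [
    {"name": "Wind Tunnel Test", "properties": {"alpha", "schedule"}},
    {"name": "Budget Review",    "properties": {"finance"}},
]
ranked = sorted(nodes, key=lambda n: score(email, n), reverse=True)
print(ranked[0]["name"])  # prints "Wind Tunnel Test"
```

As the abstract notes, such scores are not probabilities; they support ranking and thresholding, not probabilistic inference.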
ERIC Educational Resources Information Center
Chudnov, Daniel
2008-01-01
The author does not know the first thing about building digital repositories. Maybe that is a strange thing to say, given that he works in a repository development group now, worked on the original DSpace project years ago, and worked on a few repository research projects in between. Given how long he has been around people and projects aiming to…
USDA-ARS?s Scientific Manuscript database
The National Clonal Germplasm Repository (NCGR) in Davis is one among the nine repositories in the National Plant Germplasm System, USDA-ARS that is responsible for conservation of clonally propagated woody perennial subtropical and temperate fruit and nut crop germplasm. Currently the repository ho...
Code of Federal Regulations, 2010 CFR
2010-07-01
... repository possesses the capability to provide adequate long-term curatorial services. 79.9 Section 79.9... FEDERALLY-OWNED AND ADMINISTERED ARCHAEOLOGICAL COLLECTIONS § 79.9 Standards to determine when a repository... shall determine that a repository has the capability to provide adequate long-term curatorial services...
10 CFR Appendix II to Part 960 - NRC and EPA Requirements for Preclosure Repository Performance
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false NRC and EPA Requirements for Preclosure Repository... SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Pt. 960, App. II Appendix II to Part 960—NRC and EPA Requirements for Preclosure Repository Performance Under proposed 40 CFR part 191, subpart A...
Code of Federal Regulations, 2010 CFR
2010-01-01
... license with respect to a geologic repository. 51.109 Section 51.109 Energy NUCLEAR REGULATORY COMMISSION... Public hearings in proceedings for issuance of materials license with respect to a geologic repository... waste repository at a geologic repository operations area under parts 60 and 63 of this chapter, and in...
Semantic Linking of Learning Object Repositories to DBpedia
ERIC Educational Resources Information Center
Lama, Manuel; Vidal, Juan C.; Otero-Garcia, Estefania; Bugarin, Alberto; Barro, Senen
2012-01-01
Large-sized repositories of learning objects (LOs) are difficult to create and also to maintain. In this paper we propose a way to reduce this drawback by improving the classification mechanisms of the LO repositories. Specifically, we present a solution to automate the LO classification of the Universia repository, a collection of more than 15…
10 CFR Appendix I to Part 960 - NRC and EPA Requirements for Postclosure Repository Performance
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false NRC and EPA Requirements for Postclosure Repository... SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Pt. 960, App. I Appendix I to Part 960—NRC and EPA Requirements for Postclosure Repository Performance Under proposed 40 CFR part 191, subpart B...
Huser, Vojtech; Cimino, James J.
2013-01-01
Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network’s Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management. PMID:24551366
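A storage model common to the repositories compared in this record is the generic fact table, in which any observation is a row of entity, attribute (concept code), and value; i2b2's observation_fact is a well-known example of this style. The sketch below is a simplified illustration of that idea, not the actual schema of i2b2, VDW, or OMOP; the field names are assumptions.

```python
# Entity-attribute-value (EAV) style fact storage: one generic row shape
# holds any fact type, and queries filter on standardized concept codes.
facts = []

def add_fact(patient_id, concept_code, value, units=None):
    """Store one clinical observation as a generic EAV row."""
    facts.append({"patient": patient_id, "concept": concept_code,
                  "value": value, "units": units})

add_fact(42, "LOINC:8480-6", 128, "mmHg")  # a systolic blood pressure
add_fact(42, "ICD9:250.00", True)          # a diabetes diagnosis code

# One schema serves both fact types; retrieval filters on the concept.
bp = [f for f in facts if f["concept"] == "LOINC:8480-6"]
print(len(bp))  # prints 1
```

The tradeoff the paper's desiderata touch on follows directly from this shape: the schema never changes as new fact types arrive, but all meaning moves into the terminology and value-set layer that assigns the concept codes.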
Social tagging in the life sciences: characterizing a new metadata resource for bioinformatics.
Good, Benjamin M; Tennis, Joseph T; Wilkinson, Mark D
2009-09-25
Academic social tagging systems, such as Connotea and CiteULike, provide researchers with a means to organize personal collections of online references with keywords (tags) and to share these collections with others. One of the side-effects of the operation of these systems is the generation of large, publicly accessible metadata repositories describing the resources in the collections. In light of the well-known expansion of information in the life sciences and the need for metadata to enhance its value, these repositories present a potentially valuable new resource for application developers. Here we characterize the current contents of two scientifically relevant metadata repositories created through social tagging. This investigation helps to establish how such socially constructed metadata might be used as it stands currently and to suggest ways that new social tagging systems might be designed that would yield better aggregate products. We assessed the metadata that users of CiteULike and Connotea associated with citations in PubMed with the following metrics: coverage of the document space, density of metadata (tags) per document, rates of inter-annotator agreement, and rates of agreement with MeSH indexing. CiteULike and Connotea were very similar on all of the measurements. In comparison to PubMed, document coverage and per-document metadata density were much lower for the social tagging systems. Inter-annotator agreement within the social tagging systems and the agreement between the aggregated social tagging metadata and MeSH indexing were low, though the latter could be increased through voting. The most promising uses of metadata from current academic social tagging repositories will be those that find ways to utilize the novel relationships between users, tags, and documents exposed through these systems.
For more traditional kinds of indexing-based applications (such as keyword-based search) to benefit substantially from socially generated metadata in the life sciences, more documents need to be tagged and more tags are needed for each document. These issues may be addressed both by finding ways to attract more users to current systems and by creating new user interfaces that encourage more collectively useful individual tagging behaviour.
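Two of the metrics this study reports can be computed directly from tag assignments. The sketch below shows per-document tag density and pairwise inter-annotator agreement measured as Jaccard overlap between two users' tag sets for the same document; the choice of Jaccard and the toy data are my assumptions for illustration, not the paper's exact methodology.

```python
# Inter-annotator agreement as Jaccard overlap of two users' tag sets,
# plus tag density (distinct tags) for one document, on toy data.
def jaccard(a, b):
    """Jaccard similarity of two sets; 0.0 when both are empty."""
    return len(a & b) / len(a | b) if a | b else 0.0

# tags[user][doc] -> set of tags that user applied to that document (toy data)
tags = {
    "u1": {"pmid:1": {"genomics", "cancer"}},
    "u2": {"pmid:1": {"genomics", "microarray"}},
}

doc_tags = set().union(*(u["pmid:1"] for u in tags.values()))
density = len(doc_tags)  # distinct tags applied to this document
agreement = jaccard(tags["u1"]["pmid:1"], tags["u2"]["pmid:1"])
print(density, round(agreement, 2))  # prints "3 0.33"
```

Low values of such agreement scores across many documents are what the abstract summarizes as low inter-annotator agreement, and aggregating tags with voting raises agreement with MeSH by filtering out idiosyncratic tags.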
NASA Astrophysics Data System (ADS)
Keall, Bethan; Koers, Hylke; Marques, David
2013-04-01
Research in the Earth & Planetary Sciences is characterized by a wealth of observational data - ranging from observations by satellites orbiting the Earth, to borehole measurements at the bottom of the ocean, and also includes data from projects like the Curiosity rover landing. Thanks to technological advancements, it has become much easier over the last few decades for researchers to gather large volumes of data, analyze them, and share them with other researchers inside and outside the lab. With data serving such an important role in the way research is carried out, it becomes a crucial task to archive, maintain, organize, and disseminate research data in a dependable and structured manner. Subject-specific data repositories, often driven by the scientific community, are taking an increasingly prominent role in this domain, getting traction amongst researchers as the go-to place to deposit raw research data. At the same time, the scientific article remains an essential resource of scientific information. At Elsevier, we strive to continuously adapt the article format to meet the needs of modern-day researchers. This includes better support for digital content (see, e.g., http://www.elsevier.com/googlemaps), but also bidirectional linking between online articles and data repositories. In this spirit, Elsevier is collaborating with several leading data repositories, such as PANGAEA, IEDA, and NERC, to interlink articles and data for improved visibility and discoverability of both primary research data and research articles.
In addition, Elsevier has formed a new group, Research Data Services, with three primary goals: • help increase the sharing and archiving of research data in discipline-specific repositories • help increase the value of shared data, particularly with annotation and provenance metadata and linking discipline-specific datasets together • help create a credit and impact assessment infrastructure to make research data independently important in its own right. We are working on several initiatives at Elsevier that enhance the online article format, and to make it easier for researchers to share, find, access, link together and analyze relevant research data. This helps to increase the value of both articles and data, and enables researchers to gain full credit for their research data output.
NASA Astrophysics Data System (ADS)
Zheng, L.; Rutqvist, J.; Birkholzer, J. T.; Liu, H. H.
2014-12-01
Geological repositories for disposal of high-level nuclear waste generally rely on a multi-barrier system to isolate radioactive waste from the biosphere. An engineered barrier system (EBS), which comprises in many design concepts a bentonite backfill, is widely used. Clay formations have been considered as a host rock throughout the world. Illitization, the transformation of smectite to illite, could compromise some beneficial features of the EBS bentonite and clay host rock, such as sorption and swelling capacity. It is the major factor determining the maximum design temperature of the repositories, because it is believed that illitization could be greatly enhanced at temperatures higher than 100 °C. However, existing experimental and modeling studies on the occurrence of illitization and related performance impacts are not conclusive, in part because the relevant couplings between the thermal, hydrological, chemical, and mechanical (THMC) processes have not been fully represented in the models. Here we present a fully coupled THMC simulation study of a generic nuclear waste repository in a clay formation with a bentonite-backfilled EBS. Two scenarios were simulated for comparison: a case in which the temperature in the bentonite near the waste canister can reach about 200 °C and a case in which the temperature in the bentonite near the waste canister peaks at about 100 °C. The model simulations demonstrate that illitization is in general more significant under higher temperature. However, the quantity of illitization is affected by many chemical factors and therefore varies a great deal. The most important chemical factors are the concentration of K in the pore water as well as the abundance and dissolution rate of K-feldspar. For the particular case and bentonite properties studied, the reduction in swelling stress as a result of chemical changes varies from 2% up to 70%, depending on chemical and temperature conditions and key mechanical parameters.
The modeling work illustrates the relative importance of the different processes occurring in the EBS bentonite and clay host rock under conditions above 100 °C, and could be of greater use when site-specific data are available.
Ragoussi, Maria-Eleni; Costa, Davide
2017-03-14
For the last 30 years, the NEA Thermochemical Database (TDB) Project (www.oecd-nea.org/dbtdb/) has been developing a chemical thermodynamic database for elements relevant to the safety of radioactive waste repositories, providing data that are vital to support the geochemical modeling of such systems. The recommended data are selected on the basis of strict review procedures and are characterized by their consistency. The results of these efforts are freely available, and have become an international point of reference in the field. As a result, a number of important national initiatives with regard to waste management programs have used the NEA TDB as their basis, both in terms of recommended data and guidelines. In this article we describe the fundamentals and achievements of the project together with the characteristics of some databases developed in national nuclear waste disposal programs that have been influenced by the NEA TDB. We also give some insights on how this work could be seen as an approach to be used in broader areas of environmental interest. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Puranen, Anders; Jonsson, Mats; Dähn, Rainer; Cui, Daqing
2009-08-01
In proposed high-level radioactive waste repositories, the spent nuclear fuel (SNF) canisters are commonly composed largely of iron. Selenium is present in spent nuclear fuel as a long-lived fission product. This study investigates the influence of iron on the uptake of dissolved selenium in the form of selenate and the effect of the presence of dissolved uranyl on the above interaction of selenate. The iron oxide and selenium speciation on the surfaces were investigated by Raman spectroscopy. X-ray Absorption Spectroscopy was used to determine the oxidation state of the selenium and uranium on the surfaces. Under the simulated groundwater conditions (10 mM NaCl, 2 mM NaHCO3, <0.1 ppm O2), the immobilized selenate was found to be reduced to oxidation states close to zero or lower and uranyl was found to be largely reduced to U(IV). The near simultaneous reduction of uranyl was found to greatly enhance the rate of selenate reduction. These findings suggest that the presence of uranyl being reduced by an iron surface could substantially enhance the rate of reduction of selenate under anoxic conditions relevant for a repository.
ProtaBank: A repository for protein design and engineering data.
Wang, Connie Y; Chang, Paul M; Ary, Marie L; Allen, Benjamin D; Chica, Roberto A; Mayo, Stephen L; Olafson, Barry D
2018-03-25
We present ProtaBank, a repository for storing, querying, analyzing, and sharing protein design and engineering data in an actively maintained and updated database. ProtaBank provides a format to describe and compare all types of protein mutational data, spanning a wide range of properties and techniques. It features a user-friendly web interface and programming layer that streamlines data deposition and allows for batch input and queries. The database schema design incorporates a standard format for reporting protein sequences and experimental data that facilitates comparison of results across different data sets. A suite of analysis and visualization tools are provided to facilitate discovery, to guide future designs, and to benchmark and train new predictive tools and algorithms. ProtaBank will provide a valuable resource to the protein engineering community by storing and safeguarding newly generated data, allowing for fast searching and identification of relevant data from the existing literature, and exploring correlations between disparate data sets. ProtaBank invites researchers to contribute data to the database to make it accessible for search and analysis. ProtaBank is available at https://protabank.org. © 2018 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.
myExperiment: a repository and social network for the sharing of bioinformatics workflows
Goble, Carole A.; Bhagat, Jiten; Aleksejevs, Sergejs; Cruickshank, Don; Michaelides, Danius; Newman, David; Borkum, Mark; Bechhofer, Sean; Roos, Marco; Li, Peter; De Roure, David
2010-01-01
myExperiment (http://www.myexperiment.org) is an online research environment that supports the social sharing of bioinformatics workflows. These workflows are procedures consisting of a series of computational tasks using web services, which may be performed on data from its retrieval, integration and analysis, to the visualization of the results. As a public repository of workflows, myExperiment allows anybody to discover those that are relevant to their research, which can then be reused and repurposed to their specific requirements. Conversely, developers can submit their workflows to myExperiment and enable them to be shared in a secure manner. Since its release in 2007, myExperiment currently has over 3500 registered users and contains more than 1000 workflows. The social aspect to the sharing of these workflows is facilitated by registered users forming virtual communities bound together by a common interest or research project. Contributors of workflows can build their reputation within these communities by receiving feedback and credit from individuals who reuse their work. Further documentation about myExperiment including its REST web service is available from http://wiki.myexperiment.org. Feedback and requests for support can be sent to bugs@myexperiment.org. PMID:20501605
Software for Sharing and Management of Information
NASA Technical Reports Server (NTRS)
Chen, James R.; Wolfe, Shawn R.; Wragg, Stephen D.
2003-01-01
DIAMS is a set of computer programs that implements a system of collaborative agents that serve multiple, geographically distributed users communicating via the Internet. DIAMS provides a user interface as a Java applet that runs on each user's computer and that works within the context of the user's Internet-browser software. DIAMS helps all its users to manage, gain access to, share, and exchange information in databases that they maintain on their computers. One of the DIAMS agents is a personal agent that helps its owner find information most relevant to current needs. It provides software tools and utilities for users to manage their information repositories with dynamic organization and virtual views. Capabilities for generating flexible hierarchical displays are integrated with capabilities for indexed-query searching to support effective access to information. Automatic indexing methods are employed to support users' queries and communication between agents. The catalog of a repository is kept in object-oriented storage to facilitate sharing of information. Collaboration between users is aided by matchmaker agents and by automated exchange of information. The matchmaker agents are designed to establish connections between users who have similar interests and expertise.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nutt, M.; Nuclear Engineering Division
2010-05-25
The activity of Phase I of the Waste Management Working Group under the United States - Japan Joint Nuclear Energy Action Plan started in 2007. The US-Japan JNEAP is a bilateral collaborative framework to support the global implementation of safe, secure, and sustainable nuclear fuel cycles (referred to in this document as fuel cycles). The Waste Management Working Group was established by strong interest of both parties, which arises from the recognition that development and optimization of waste management and disposal system(s) are central issues of the present and future nuclear fuel cycles. This report summarizes the activity of the Waste Management Working Group that focused on consolidation of the existing technical basis between the U.S. and Japan and the joint development of a plan for future collaborative activities. First, the political/regulatory frameworks related to nuclear fuel cycles in both countries were reviewed. The various advanced fuel cycle scenarios that have been considered in both countries were then surveyed and summarized. The working group established the working reference scenario for the future cooperative activity that corresponds to a fuel cycle scenario being considered both in Japan and the U.S. This working scenario involves transitioning from a once-through fuel cycle utilizing light water reactors to a one-pass uranium-plutonium fuel recycle in light water reactors to a combination of light water reactors and fast reactors with plutonium, uranium, and minor actinide recycle, ultimately concluding with multiple recycle passes primarily using fast reactors. Considering the scenario, current and future expected waste streams, treatment and inventory were discussed, and the relevant information was summarized. Second, the waste management/disposal system optimization was discussed.
Repository system concepts were reviewed, repository design concepts for the various classifications of nuclear waste were summarized, and the factors to consider in repository design and optimization were then discussed. Japan is considering various alternatives and options for the geologic disposal facility, and the framework for future analysis of repository concepts was discussed. Regarding advanced waste and storage form development, waste form technologies developed in both countries were surveyed and compared. Potential collaboration areas and activities were next identified. Disposal system optimization processes and techniques were reviewed, and factors to consider in future repository design optimization activities were also discussed. Then the potential collaboration areas and activities related to the optimization problem were identified.
10 CFR 60.73 - Reports of deficiencies.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Reports of deficiencies. 60.73 Section 60.73 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Records, Reports, Tests, and Inspections § 60.73 Reports of deficiencies. DOE shall promptly...
Experiments and Modeling to Support Field Test Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Peter Jacob; Bourret, Suzanne Michelle; Zyvoloski, George Anthony
Disposition of heat-generating nuclear waste (HGNW) remains a continuing technical and sociopolitical challenge. We define HGNW as the combination of both heat-generating defense high-level waste (DHLW) and civilian spent nuclear fuel (SNF). Numerous concepts for HGNW management have been proposed and examined internationally, including an extensive focus on geologic disposal (cf. Brunnengräber et al., 2013). One type of proposed geologic material is salt, so chosen because of its viscoplastic deformation that causes self-repair of damage or deformation induced in the salt by waste emplacement activities (Hansen and Leigh, 2011). Salt as a repository material has been tested at several sites around the world, notably the Morsleben facility in Germany (cf. Fahland and Heusermann, 2013; Wollrath et al., 2014; Fahland et al., 2015) and at the Waste Isolation Pilot Plant (WIPP) near Carlsbad, NM. Evaluating the technical feasibility of a HGNW repository in salt is an ongoing process involving experiments and numerical modeling of many processes at many facilities.
In vivo antiplasmodial activities of ethanolic extract and fractions of Eleucine indica.
Ettebong, E O; Nwafor, P A; Okokon, J E
2012-09-01
To evaluate the in vivo antiplasmodial activities of the extract and fractions (n-hexane, chloroform, ethyl acetate, butanol, aqueous) of the whole plant in Plasmodium berghei berghei-infected mice. Oral administrations of the extract (200, 400, and 600 mg/kg) of Eleucine indica and fractions (400 mg/kg) were screened in the 4-day, repository, and curative tests. Chloroquine (5 mg/kg), pyrimethamine (1.2 mg/kg), and artesunate (5 mg/kg) were used as controls. The extract showed significant (P < 0.05-0.001), dose-dependent antiplasmodial activity in the 4-day, repository, and curative tests and increased the survival times of the infected mice. All the fractions exhibited significant antiplasmodial activity, with the ethyl acetate fraction showing the highest. Eleucine indica extract and fractions possess antimalarial activity, which supports the ethnobotanical use of this plant as a malaria remedy and opens a new avenue for investigating its potential in the ongoing fight against malaria.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-12
... 79765] Public Land Order No. 7742; Withdrawal of Public Land for the Manning Canyon Tailings Repository... period of 5 years to protect the integrity of the Manning Canyon Tailings Repository and surrounding... Repository. The Bureau of Land Management intends to evaluate the need for a lengthier withdrawal through the...
ERIC Educational Resources Information Center
Association of Research Libraries, 2009
2009-01-01
Libraries are making diverse contributions to the development of many types of digital repositories, particularly those housing locally created digital content, including new digital objects or digitized versions of locally held works. In some instances, libraries are managing a repository and its related services entirely on their own, but often…
ERIC Educational Resources Information Center
King, Melanie; Loddington, Steve; Manuel, Sue; Oppenheim, Charles
2008-01-01
The last couple of years have brought a rise in the number of institutional repositories throughout the world and within UK Higher Education institutions, with the majority of these repositories being devoted to research output. Repositories containing teaching and learning material are less common and the workflows and business processes…
Integrating XQuery-Enabled SCORM XML Metadata Repositories into an RDF-Based E-Learning P2P Network
ERIC Educational Resources Information Center
Qu, Changtao; Nejdl, Wolfgang
2004-01-01
Edutella is an RDF-based E-Learning P2P network that is aimed to accommodate heterogeneous learning resource metadata repositories in a P2P manner and further facilitate the exchange of metadata between these repositories based on RDF. Whereas Edutella provides RDF metadata repositories with a quite natural integration approach, XML metadata…
Scaling an expert system data mart: more facilities in real-time.
McNamee, L A; Launsby, B D; Frisse, M E; Lehmann, R; Ebker, K
1998-01-01
Clinical data repositories are being rapidly adopted by large healthcare organizations as a method of centralizing and unifying clinical data currently stored in diverse and isolated information systems. Once the data are stored in a clinical data repository, healthcare organizations seek to use them to analyze, interpret, and influence clinical care, quality, and outcomes. A recent trend in the repository field has been the adoption of data marts--specialized subsets of enterprise-wide data taken from a larger repository and designed specifically to answer highly focused questions. A data mart exploits the data stored in the repository but can use unique structures or summary statistics generated specifically for an area of study. Thus, data marts benefit from the existence of a repository and are less general than a repository, but provide more effective and efficient support for an enterprise-wide data analysis task. In previous work, we described the use of batch processing for populating data marts directly from legacy systems. In this paper, we describe an architecture that uses both primary data sources and an evolving enterprise-wide clinical data repository to create real-time data sources for a clinical data mart to support highly specialized clinical expert systems.
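As a rough illustration of the data-mart idea described above (a specialized summary derived from a larger repository), the sketch below aggregates raw repository rows into precomputed per-patient statistics. The field names and values are invented for illustration and are not taken from the paper's system.

```python
# Toy sketch: derive a data-mart summary table from clinical repository rows.
# The mart stores precomputed statistics (count, mean) per (patient, test)
# so focused questions can be answered without scanning the full repository.
from collections import defaultdict

def build_mart(repository_rows):
    """repository_rows: iterable of (patient_id, test_name, value) tuples."""
    acc = defaultdict(list)
    for pid, test, value in repository_rows:
        acc[(pid, test)].append(value)
    # Keep only summary statistics, not the raw observations.
    return {key: {"n": len(v), "mean": sum(v) / len(v)} for key, v in acc.items()}

# Invented example rows, not real clinical data.
rows = [("p1", "glucose", 90.0), ("p1", "glucose", 110.0), ("p2", "glucose", 100.0)]
mart = build_mart(rows)
```

The design choice mirrors the abstract's point: the mart trades generality for query speed by materializing summaries for one area of study.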
Schematic designs for penetration seals for a reference repository in bedded salt
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelsall, P.C.; Case, J.B.; Meyer, D.
1982-11-01
The isolation of radioactive wastes in geologic repositories requires that man-made penetrations such as shafts, tunnels, or boreholes are adequately sealed. This report describes schematic seal designs for a repository in bedded salt referenced to the stratigraphy of southeastern New Mexico. The designs are presented for extensive peer review and will be updated as site-specific conceptual designs when a site for a repository in salt has been selected. The principal material used in the seal system is crushed salt obtained from excavating the repository. It is anticipated that crushed salt will consolidate as the repository rooms creep closed, to the degree that its mechanical and hydrologic properties will eventually match those of undisturbed, intact salt. For southeastern New Mexico salt, analyses indicate that this process will require approximately 1000 years for a seal located at the base of one of the repository shafts (where there is little increase in temperature due to waste emplacement) and approximately 400 years for a seal located in an access tunnel within the repository. Bulkheads composed of concrete or salt bricks are also included in the seal system as components that will have low permeability during the period required for salt consolidation.
NASA Astrophysics Data System (ADS)
Mon, Alba; Samper, Javier; Montenegro, Luis; Naves, Acacia; Fernández, Jesús
2017-02-01
Radioactive waste disposal in deep geological repositories envisages engineered barriers such as carbon-steel canisters, compacted bentonite, and concrete liners. The stability and performance of the bentonite barrier could be affected by the corrosion products at the canister-bentonite interface and by the hyper-alkaline conditions caused by the degradation of concrete at the bentonite-concrete interface. Additionally, the host clay formation could be affected by the hyper-alkaline plume at the concrete-clay interface. Here we present a non-isothermal multicomponent reactive transport model of the long-term (1 Ma) interactions of the compacted bentonite with the corrosion products of a carbon-steel canister and the concrete liner of the engineered barrier of a high-level radioactive waste repository in clay. Model results show that magnetite is the main corrosion product. Its precipitation significantly reduces the porosity of the bentonite near the canister. The degradation of the concrete liner leads to the precipitation of secondary minerals and the reduction of the porosity of the bentonite and the clay formation at their interfaces with the concrete liner. The reduction of the porosity becomes especially relevant at t = 10⁴ years. The zones affected by pore clogging at the canister-bentonite and concrete-clay interfaces at 1 Ma are approximately 1 and 3.3 cm thick, respectively. The hyper-alkaline front (pH > 8.5) spreads 2.5 cm into the clay formation after 1 Ma. Our simulation results share the key features of models reported by others for engineered barrier systems under similar chemical conditions, including: 1) pore clogging at the canister-bentonite and concrete-clay interfaces; 2) narrow alteration zones; and 3) limited smectite dissolution after 1 Ma.
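The pore-clogging result above rests on a standard reactive-transport bookkeeping step: the volume of precipitated mineral is subtracted from the porosity. A minimal sketch of that update, with a hypothetical precipitation amount (not a value from the paper; the magnetite molar volume is an approximate literature figure):

```python
# Illustrative porosity-update step common in reactive transport models:
# mineral precipitation removes pore volume in proportion to the moles
# precipitated times the mineral's molar volume.

MOLAR_VOLUME_MAGNETITE = 44.52e-6  # m^3/mol, approximate literature value

def update_porosity(phi, precipitated_mol_per_m3):
    """Porosity after subtracting the volume of newly precipitated magnetite."""
    dphi = MOLAR_VOLUME_MAGNETITE * precipitated_mol_per_m3
    return max(phi - dphi, 0.0)  # pore space cannot go negative

# Hypothetical case: bentonite with initial porosity 0.41 and
# 1000 mol of magnetite precipitated per m^3 of porous medium.
phi = update_porosity(0.41, 1000.0)
```

When repeated precipitation drives the porosity toward zero, transport through that zone effectively stops, which is the clogging behavior the model reports at the interfaces.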
Characterize Eruptive Processes at Yucca Mountain, Nevada
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Valentine
2001-12-20
This Analysis/Model Report (AMR), ''Characterize Eruptive Processes at Yucca Mountain, Nevada'', presents information about natural volcanic systems and the parameters that can be used to model their behavior. This information is used to develop parameter-value distributions appropriate for analysis of the consequences of volcanic eruptions through a potential repository at Yucca Mountain. Many aspects of this work are aimed at resolution of the Igneous Activity Key Technical Issue (KTI) as identified by the Nuclear Regulatory Commission (NRC 1998, p. 3), Subissues 1 and 2, which address the probability and consequence of igneous activity at the proposed repository site, respectively. Within the framework of the Disruptive Events Process Model Report (PMR), this AMR provides information for the calculations in two other AMRs; parameters described herein are directly used in calculations in these reports and will be used in Total System Performance Assessment (TSPA). Compilation of this AMR was conducted as defined in the Development Plan, except as noted. The report begins with considerations of the geometry of volcanic feeder systems, which are of primary importance in predicting how much of a potential repository would be affected by an eruption. This is followed by a discussion of the physical and chemical properties of the magmas, which influence both eruptive styles and mechanisms for interaction with radioactive waste packages. Eruptive processes, including the ascent velocity of magma at depth, the onset of bubble nucleation and growth in the rising magmas, magma fragmentation, and the velocity of the resulting gas-particle mixture, are then discussed. The duration of eruptions, their power output, and mass discharge rates are also described. The next section summarizes geologic constraints regarding the interaction between magma and waste packages. Finally, the report discusses the bulk grain size produced by relevant explosive eruptions and grain shapes.
Zarinabad, Niloufar; Meeus, Emma M; Manias, Karen; Foster, Katharine
2018-01-01
Background: Advances in magnetic resonance imaging and the introduction of clinical decision support systems have underlined the need for an analysis tool to extract and analyze relevant information from magnetic resonance imaging data to aid decision making, prevent errors, and enhance health care. Objective: The aim of this study was to design and develop a modular medical image region of interest analysis tool and repository (MIROR) for automatic processing, classification, evaluation, and representation of advanced magnetic resonance imaging data. Methods: The clinical decision support system was developed and evaluated for diffusion-weighted imaging of body tumors in children (cohort of 48 children, with 37 malignant and 11 benign tumors). MeVisLab software and Python were used for the development of MIROR. Regions of interest were drawn around benign and malignant body tumors on different diffusion parametric maps, and extracted information was used to discriminate the malignant tumors from the benign tumors. Results: Using MIROR, the various histogram parameters derived for each tumor case, when compared with the information in the repository, provided additional information for tumor characterization and facilitated the discrimination between benign and malignant tumors. Clinical decision support system cross-validation showed high sensitivity and specificity in discriminating between these tumor groups using histogram parameters. Conclusions: MIROR, as a diagnostic tool and repository, allowed the interpretation and analysis of magnetic resonance imaging images to be more accessible and comprehensive for clinicians. It aims to increase clinicians' skillset by introducing newer techniques and up-to-date findings to their repertoire and to make information from previous cases available to aid decision making. The modular format of the tool allows integration of analyses that are not readily available clinically and streamlines future developments.
PMID:29720361
Becker, J K; Lindborg, T; Thorne, M C
2014-12-01
In safety assessments of repositories for radioactive wastes, large spatial and temporal scales have to be considered when developing an approach to risk calculations. A wide range of different types of information may be required. Local to the site of interest, temperature and precipitation data may be used to determine the erosional regime (which may also be conditioned by the vegetation characteristics adopted, based both on climatic and other considerations). However, geomorphological changes may be governed by regional rather than local considerations, e.g. alteration of river base levels, river capture and drainage network reorganisation, or the progression of an ice sheet or valley glacier across the site. The regional climate is in turn governed by the global climate. In this work, a commentary is presented on the types of climate models that can be used to develop projections of climate change for use in post-closure radiological impact assessments of geological repositories for radioactive wastes. These models include both Atmosphere-Ocean General Circulation Models and Earth Models of Intermediate Complexity. The relevant outputs available from these models are identified and consideration is given to how these outputs may be used to inform projections of landscape development. Issues of spatial and temporal downscaling of climate model outputs to meet the requirements of local-scale landscape development modelling are also addressed. An example is given of how climate change and landscape development influence the radiological impact of radionuclides potentially released from the deep geological disposal facility for spent nuclear fuel that SKB (the Swedish Nuclear Fuel and Waste Management Company) proposes to construct at Forsmark, Sweden.
Rysavy, Steven J; Beck, David A C; Daggett, Valerie
2014-11-01
Protein function is intimately linked to protein structure and dynamics, yet experimentally determined structures frequently omit regions within a protein due to indeterminate data, which is often due to protein dynamics. We propose that atomistic molecular dynamics simulations provide a diverse sampling of biologically relevant structures for these missing segments (and beyond) to improve structural modeling and structure prediction. Here we make use of the Dynameomics data warehouse, which contains simulations of representatives of essentially all known protein folds. We developed novel computational methods to efficiently identify, rank, and retrieve small peptide structures, or fragments, from this database. We also created a novel data model to analyze and compare large repositories of structural data, such as those contained within the Protein Data Bank and the Dynameomics data warehouse. Our evaluation compares these structural repositories for improving loop predictions and analyzes the utility of our methods and models. Using a standard set of loop structures containing 510 loops, 30 for each loop length from 4 to 20 residues, we find that the inclusion of Dynameomics structures in fragment-based methods improves the quality of the loop predictions without being dependent on sequence homology. Depending on loop length, ∼25-75% of the best predictions came from the Dynameomics set, resulting in lower main-chain root-mean-square deviations for all fragment lengths using the combined fragment library. We also provide specific cases where Dynameomics fragments provide better predictions for NMR loop structures than fragments from crystal structures. Online access to these fragment libraries is available at http://www.dynameomics.org/fragments.
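The loop-prediction comparison above is scored by main-chain root-mean-square deviation (RMSD). A minimal sketch of the bare RMSD metric follows; real scoring typically superposes the structures first (e.g. with the Kabsch algorithm), which is omitted here, and the coordinates are made up for illustration:

```python
# Sketch of main-chain RMSD between predicted and reference loop coordinates.
# No structural superposition is performed; this is the raw metric only.
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equal-length 3-D coordinate lists."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

pred = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]  # invented predicted atom positions
ref = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0)]   # invented reference atom positions
val = rmsd(pred, ref)
```

A lower RMSD over the main-chain atoms is what "better loop prediction" means in the abstract's evaluation.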
Cimino, James J; Lancaster, William J; Wyatt, Mathew C
2017-01-01
One of the challenges to using electronic health record (EHR) repositories for research is the difficulty of mapping study subject eligibility criteria to the query capabilities of the repository. We sought to characterize criteria as "easy" (searchable in a typical repository), "hard" (requiring manual review of the record data), or "impossible" (not typically available in EHR repositories). We obtained 292 criteria from 20 studies available from ClinicalTrials.gov and rated them according to our three types, plus a fourth "mixed" type. We had good agreement among three independent reviewers and chose 274 criteria that were characterized by single types for further analysis. The resulting analysis showed typical features of criteria that do and do not map to repositories. We propose that these features be used to guide researchers in specifying eligibility criteria to improve the development of enrollment workflows, including the definition of EHR repository queries for self-service or analyst-mediated retrievals.
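The "good agreement among three independent reviewers" mentioned above is commonly quantified with a chance-corrected statistic. The sketch below computes Cohen's kappa for one pair of reviewers using invented ratings in the study's easy/hard/impossible vocabulary; the abstract does not state which agreement statistic was used, so this is illustrative only:

```python
# Cohen's kappa: agreement between two raters, corrected for chance.
# Ratings below are invented examples, not the study's actual data.
from collections import Counter

def cohens_kappa(r1, r2):
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)  # chance agreement
    return (po - pe) / (1 - pe)

r1 = ["easy", "easy", "hard", "impossible", "easy", "hard"]
r2 = ["easy", "hard", "hard", "impossible", "easy", "easy"]
k = cohens_kappa(r1, r2)
```

Kappa of 1 is perfect agreement, 0 is chance-level; multi-rater studies often extend this with Fleiss' kappa.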
Making Research Data Repositories Visible: The re3data.org Registry
Pampel, Heinz; Vierkant, Paul; Scholze, Frank; Bertelmann, Roland; Kindling, Maxi; Klump, Jens; Goebelbecker, Hans-Jürgen; Gundlach, Jens; Schirmbacher, Peter; Dierolf, Uwe
2013-01-01
Researchers require infrastructures that ensure a maximum of accessibility, stability, and reliability to facilitate working with and sharing of research data. Such infrastructures are increasingly summarized under the term Research Data Repositories (RDR). The project re3data.org (Registry of Research Data Repositories) began indexing research data repositories in 2012 and offers researchers, funding organizations, libraries, and publishers an overview of the heterogeneous research data repository landscape. In July 2013 re3data.org listed 400 research data repositories and counting; 288 of these are described in detail using the re3data.org vocabulary. Information icons help researchers easily identify an adequate repository for the storage and reuse of their data. This article describes the heterogeneous RDR landscape and presents a typology of institutional, disciplinary, multidisciplinary, and project-specific RDR. Further, the article outlines the features of re3data.org and shows how this registry helps to identify appropriate repositories for the storage and search of research data. PMID:24223762
Global maps of non-traumatic spinal cord injury epidemiology: towards a living data repository.
New, P W; Cripps, R A; Bonne Lee, B
2014-02-01
Literature review. Globally map non-traumatic spinal cord injury (NTSCI) incidence, prevalence, survival, level of injury and aetiology. Propose a research framework for NTSCI prevention and launch a repository of NTSCI data. Initiative of the International Spinal Cord Society Prevention Committee. Literature search of Medline and Embase (1959-June 2011). Relevant articles in any language regarding adults with NTSCI were included. Stratification of information about incidence and prevalence into green/yellow/orange/red data quality 'zones' and comparisons between World Health Organisation (WHO) regions and countries. Three hundred and seventy-seven abstracts reviewed--45 of these from 24 countries in 12 of the 21 WHO global regions had relevant information. Only one publication had survival data. Prevalence data for NTSCI existed for only two countries, India (prevalence of 2,310/million population, Kashmir region) and Canada (prevalence of 1,120/million population). The incidence rates for WHO regions were: Asia Pacific, high income 20/million population/year; Australasia (26/million population/year); Western Europe median of 6/million population/year; North America, high income median 76/million population/year (based on poor-quality studies); and Oceania 9/million population/year. Developed countries tended to have a higher proportion of cases with degenerative conditions and tumours. Developing countries, in comparison, tended to have a higher proportion of infections, particularly tuberculosis and HIV, although a number also reported tumours as a major cause. Insufficient survival, prevalence and incidence data are a predominant finding of this review. The piecemeal approach to epidemiological reporting of NTSCI, particularly failing to include sound regional population denominators, has exhausted its utility. Minimum data collection standards are required.
NASA Technical Reports Server (NTRS)
Hanley, Lionel
1989-01-01
The Ada Software Repository is a public-domain collection of Ada software and information. It is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and it has been available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories. These directories are organized by topic, and their names and brief overviews of their topics are provided. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.
Exploring Characterizations of Learning Object Repositories Using Data Mining Techniques
NASA Astrophysics Data System (ADS)
Segura, Alejandra; Vidal, Christian; Menendez, Victor; Zapata, Alfredo; Prieto, Manuel
Learning object repositories provide a platform for the sharing of Web-based educational resources. As these repositories evolve independently, it is difficult for users to have a clear picture of the kind of contents they give access to. Metadata can be used to automatically extract a characterization of these resources by using machine learning techniques. This paper presents an exploratory study carried out on the contents of four public repositories that uses clustering and association rule mining algorithms to extract characterizations of repository contents. The results of the analysis include potential relationships between different attributes of learning objects that may be useful for gaining an understanding of the kind of resources available and, eventually, for developing search mechanisms that consider repository descriptions as criteria in federated search.
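Association rule mining of the kind used in the study rests on two simple metrics, support and confidence. A self-contained sketch over toy metadata records follows; the attribute names are invented and are not taken from the four repositories analyzed:

```python
# Support and confidence, the core metrics behind association rule mining
# (e.g. Apriori). Each record is a set of attribute:value items.

def support(records, items):
    """Fraction of records containing every item in `items`."""
    hits = sum(1 for r in records if items <= r)
    return hits / len(records)

def confidence(records, antecedent, consequent):
    """Estimated P(consequent | antecedent) over the records."""
    return support(records, antecedent | consequent) / support(records, antecedent)

# Invented learning-object metadata records.
records = [
    {"format:pdf", "level:higher-ed"},
    {"format:pdf", "level:higher-ed", "lang:es"},
    {"format:html", "level:school"},
    {"format:pdf", "level:school"},
]
s = support(records, {"format:pdf"})
c = confidence(records, {"format:pdf"}, {"level:higher-ed"})
```

A miner would keep only rules whose support and confidence exceed chosen thresholds; those surviving rules are the "potential relationships between attributes" the abstract refers to.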
Overview of actinide chemistry in the WIPP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borkowski, Marian; Lucchini, Jean - Francois; Richmann, Michael K
2009-01-01
The year 2009 celebrates 10 years of safe operations at the Waste Isolation Pilot Plant (WIPP), the only nuclear waste repository designated to dispose of defense-related transuranic (TRU) waste in the United States. Many elements contributed to the success of this one-of-a-kind facility. One of the most important of these is the chemistry of the actinides under WIPP repository conditions. A reliable understanding of the potential release of actinides from the site to the accessible environment is important to the WIPP performance assessment (PA). The environmental chemistry of the major actinides disposed at the WIPP continues to be investigated as part of the ongoing recertification efforts of the WIPP project. This presentation provides an overview of the actinide chemistry for WIPP repository conditions. The WIPP is a salt-based repository; therefore, the inflow of brine into the repository is minimized, due to the natural tendency of excavated salt to re-seal. Reducing, anoxic conditions are expected in the WIPP because of microbial activity and metal corrosion processes that consume the oxygen initially present. Should brine be introduced through an intrusion scenario, these same processes will re-establish reducing conditions. In the case of an intrusion scenario involving brine, the solubilization of actinides in brine is considered a potential source of release to the accessible environment. The following key factors establish the concentrations of dissolved actinides under subsurface conditions: (1) Redox chemistry - The solubility of reduced actinides (III and IV oxidation states) is known to be significantly lower than that of the oxidized forms (V and/or VI oxidation states). In this context, the reducing conditions in the WIPP and the strong coupling of the chemistry of reduced metals and microbiological processes with actinides are important.
(2) Complexation - For the anoxic, reducing, and mildly basic brine systems in the WIPP, the most important inorganic complexants are expected to be carbonate/bicarbonate and hydroxide. There are also organic complexants in TRU waste with the potential to strongly influence actinide solubility. (3) Intrinsic and pseudo-actinide colloid formation - Many actinide species in their expected oxidation states tend to form colloids or strongly associate with the non-actinide colloids present (e.g., microbial, humic, and organic). In this context, the relative importance of actinides, based on the TRU waste inventory, with respect to the potential release of actinides from the WIPP, is greater for plutonium and americium, and to a lesser extent for uranium and thorium. The most important oxidation states for WIPP-relevant conditions are III and IV. We will present an update of the literature on WIPP-specific data and a summary of the ongoing research related to actinide chemistry in the WIPP performed by the Los Alamos National Laboratory (LANL) Actinide Chemistry and Repository Science (ACRSP) team located in Carlsbad, NM [Reed 2007, Lucchini 2007, and Reed 2006].
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogue, F.; Binnall, E.P.
1982-10-01
Reliable instrumentation will be needed to monitor the performance of future high-level waste repository sites. A study has been made to assess instrument reliability in Department of Energy (DOE) waste repository-related experiments. Though the study covers a wide variety of instrumentation, this paper concentrates on experiences with geotechnical instrumentation in hostile repository-type environments. Manufacturers have made some changes to improve the reliability of instruments for repositories. This paper reviews the failure modes, rates, and mechanisms, along with manufacturer modifications and recommendations for additional improvements to enhance instrument performance.
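Failure rates like those surveyed above are often summarized with the constant-failure-rate (exponential) reliability model. A minimal sketch follows; the rate used is invented for illustration and is not a value from the DOE study:

```python
# Constant-failure-rate (exponential) reliability model:
# R(t) = exp(-lambda * t), MTBF = 1 / lambda.
import math

def reliability(failure_rate_per_hour, hours):
    """Probability an instrument survives `hours` without failure."""
    return math.exp(-failure_rate_per_hour * hours)

def mtbf(failure_rate_per_hour):
    """Mean time between failures for a constant failure rate."""
    return 1.0 / failure_rate_per_hour

# Invented rate: 1e-4 failures per operating hour.
r = reliability(1e-4, 1000.0)
m = mtbf(1e-4)
```

For long-lived repository monitoring, even a low hourly failure rate compounds over years, which is why the paper's focus on failure modes in hostile environments matters.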
A Data Repository and Visualization Toolbox for Metabolic Pathways and PBPK parameter prediction
NHANES is an extensive, well-structured collection of data about hundreds of chemical products of human metabolism and their concentrations in human biomarkers, which includes parent-to-product mapping where known. Together, these data can be used to test the efficacy of application...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Inspections. 60.75 Section 60.75 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Records, Reports, Tests, and Inspections § 60.75 Inspections. (a) DOE shall allow the Commission to inspect the premises...
Putting the School Interoperability Framework to the Test
ERIC Educational Resources Information Center
Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans
2004-01-01
The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…
NASA Astrophysics Data System (ADS)
Dunagan, S. C.; Herrick, C. G.; Lee, M. Y.
2008-12-01
The Waste Isolation Pilot Plant (WIPP) is located at a depth of 655 m in bedded salt in southeastern New Mexico and is operated by the U.S. Department of Energy as a deep underground disposal facility for transuranic (TRU) waste. The WIPP must comply with the EPA's environmental regulations, which require a probabilistic risk analysis of releases of radionuclides due to inadvertent human intrusion into the repository at some time during the 10,000-year regulatory period. Sandia National Laboratories conducts performance assessments (PAs) of the WIPP using a system of computer codes representing the evolution of the underground repository and emplaced TRU waste in order to demonstrate compliance. One of the important features modeled in a PA is the disturbed rock zone (DRZ) surrounding the emplacement rooms in the repository. The extent and permeability of the DRZ play a significant role in the potential radionuclide release scenarios. We evaluated the phenomena occurring in the repository that affect the DRZ and their potential effects on the extent and permeability of the DRZ. Furthermore, we examined the DRZ's role in determining the performance of the repository. Pressure in the completely sealed repository will be increased by creep closure of the salt and degradation of TRU waste contents by microbial activity in the repository. An increased pressure in the repository will reduce the extent and permeability of the DRZ. The reduced DRZ extent and permeability will decrease the amount of brine that is available to interact with the waste. Furthermore, the potential for radionuclide release from the repository is dependent on the amount of brine that enters the repository. As a result of these coupled biological-geomechanical-geochemical phenomena, the extent and permeability of the DRZ have a significant impact on the potential radionuclide releases from the repository and, in turn, on repository performance.
Sandia is a multi-program laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy.
Burchill, C; Roos, L L; Fergusson, P; Jebamani, L; Turner, K; Dueck, S
2000-01-01
Comprehensive data available in the Canadian province of Manitoba since 1970 have aided study of the interaction between population health, health care utilization, and structural features of the health care system. Given a complex linked database and many ongoing projects, better organization of available epidemiological, institutional, and technical information was needed. The Manitoba Centre for Health Policy and Evaluation wished to develop a knowledge repository to handle data, document research methods, and facilitate both internal communication and collaboration with other sites. This evolving knowledge repository consists of both public and internal (restricted access) pages on the World Wide Web (WWW). Information can be accessed using an indexed logical format or queried to allow entry at user-defined points. The main topics are: Concept Dictionary, Research Definitions, Meta-Index, and Glossary. The Concept Dictionary operationalizes concepts used in health research using administrative data, outlining the creation of complex variables. Research Definitions specify the codes for common surgical procedures, tests, and diagnoses. The Meta-Index organizes concepts and definitions according to the Medical Sub-Heading (MeSH) system developed by the National Library of Medicine. The Glossary facilitates navigation through the research terms and abbreviations in the knowledge repository. An Education Resources heading presents a web-based graduate course using substantial amounts of material in the Concept Dictionary, a lecture in the Epidemiology Supercourse, and material for Manitoba's Regional Health Authorities. Confidential information (including Data Dictionaries) is available on the Centre's internal website. Use of the public pages has increased dramatically since January 1998, with almost 6,000 page hits from 250 different hosts in May 1999.
More recently, the number of page hits has averaged around 4,000 per month, while the number of unique hosts has climbed to around 400. This knowledge repository promotes standardization and increases efficiency by placing concepts and associated programming in the Centre's collective memory. Collaboration and project management are facilitated.
Burchill, Charles; Fergusson, Patricia; Jebamani, Laurel; Turner, Ken; Dueck, Stephen
2000-01-01
Background Comprehensive data available in the Canadian province of Manitoba since 1970 have aided study of the interaction between population health, health care utilization, and structural features of the health care system. Given a complex linked database and many ongoing projects, better organization of available epidemiological, institutional, and technical information was needed. Objective The Manitoba Centre for Health Policy and Evaluation wished to develop a knowledge repository to handle data, document research methods, and facilitate both internal communication and collaboration with other sites. Methods This evolving knowledge repository consists of both public and internal (restricted access) pages on the World Wide Web (WWW). Information can be accessed using an indexed logical format or queried to allow entry at user-defined points. The main topics are: Concept Dictionary, Research Definitions, Meta-Index, and Glossary. The Concept Dictionary operationalizes concepts used in health research using administrative data, outlining the creation of complex variables. Research Definitions specify the codes for common surgical procedures, tests, and diagnoses. The Meta-Index organizes concepts and definitions according to the Medical Sub-Heading (MeSH) system developed by the National Library of Medicine. The Glossary facilitates navigation through the research terms and abbreviations in the knowledge repository. An Education Resources heading presents a web-based graduate course using substantial amounts of material in the Concept Dictionary, a lecture in the Epidemiology Supercourse, and material for Manitoba's Regional Health Authorities. Confidential information (including Data Dictionaries) is available on the Centre's internal website. Results Use of the public pages has increased dramatically since January 1998, with almost 6,000 page hits from 250 different hosts in May 1999. 
More recently, the number of page hits has averaged around 4,000 per month, while the number of unique hosts has climbed to around 400. Conclusions This knowledge repository promotes standardization and increases efficiency by placing concepts and associated programming in the Centre's collective memory. Collaboration and project management are facilitated. PMID:11720929
Repository Profiles for Atmospheric and Climate Sciences: Capabilities and Trends in Data Services
NASA Astrophysics Data System (ADS)
Hou, C. Y.; Thompson, C. A.; Palmer, C. L.
2014-12-01
As digital research data proliferate and expectations for open access escalate, the landscape of data repositories is becoming more complex. For example, DataBib currently identifies 980 data repositories across the disciplines, with 117 categorized under Geosciences. In atmospheric and climate sciences, there are great expectations for the integration and reuse of data for advancing science. To realize this potential, resources are needed that explicate the range of repository options available for locating and depositing open data, their conditions of access and use, and the services and tools they provide. This study profiled 38 open digital repositories in the atmospheric and climate sciences, analyzing each on 55 criteria through content analysis of their websites. The results provide a systematic way to assess and compare capabilities, services, and institutional characteristics and to identify trends across repositories. Selected results from the more detailed outcomes to be presented include the following: Most repositories offer guidance on data format(s) for submission and dissemination. 42% offer authorization-free access. More than half use some type of data identification system, such as DOIs. Nearly half offer some data processing, with a similar number providing software or tools. 78.9% request that users cite or acknowledge datasets used and the data center. Only 21.1% recommend specific metadata standards, such as ISO 19115 or Dublin Core, with more than half utilizing a customized metadata scheme. Information on repository certification and accreditation was rarely provided, and information on transfer of rights and data security was uneven. Few provided policy information on preservation, migration, reappraisal, disposal, or long-term sustainability. As repository use increases, it will be important for institutions to make their procedures and policies explicit, to build trust with user communities and improve efficiencies in data sharing.
Resources such as repository profiles will be essential for scientists to weigh options and understand trends in data services across the evolving network of repositories.
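The reported shares follow from simple counts over the 38 profiled repositories. The sketch below tallies such percentages; the criterion names and counts are illustrative assumptions (only the published percentages are known from the abstract), not the study's underlying data.

```python
# Sketch: tallying capability percentages across repository profiles.
# The criterion counts below are assumed for illustration; they are chosen
# so the published shares (42%, 78.9%, 21.1% of 38 repositories) fall out.
n_repositories = 38

counts = {
    "authorization-free access": 16,
    "requests citation/acknowledgment": 30,
    "recommends specific metadata standards": 8,
}

def percentage(count, total):
    """Share of repositories meeting a criterion, to one decimal place."""
    return round(100.0 * count / total, 1)

shares = {name: percentage(c, n_repositories) for name, c in counts.items()}
```

With these assumed counts, 30/38 and 8/38 reproduce the 78.9% and 21.1% figures quoted in the abstract, and 16/38 rounds to the quoted 42%.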
Corticotropin, Repository Injection
Corticotropin repository injection is used to treat the following conditions:infantile spasms (seizures that usually begin during the first ... of the arms, hands, feet, and legs). Corticotropin repository injection is in a class of medications called ...
10 CFR 60.134 - Design of seals for shafts and boreholes.
Code of Federal Regulations, 2010 CFR
2010-01-01
... GEOLOGIC REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository Operations Area § 60... the geologic repository's ability to meet the performance objectives or the period following permanent...
10 CFR 960.5-2 - Technical guidelines.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Preclosure Guidelines § 960.5-2 Technical guidelines. The technical guidelines in this subpart set... repository and to the transportation of waste to a repository site. The third group includes conditions on...
The Function Biomedical Informatics Research Network Data Repository
Keator, David B.; van Erp, Theo G.M.; Turner, Jessica A.; Glover, Gary H.; Mueller, Bryon A.; Liu, Thomas T.; Voyvodic, James T.; Rasmussen, Jerod; Calhoun, Vince D.; Lee, Hyo Jong; Toga, Arthur W.; McEwen, Sarah; Ford, Judith M.; Mathalon, Daniel H.; Diaz, Michele; O’Leary, Daniel S.; Bockholt, H. Jeremy; Gadde, Syam; Preda, Adrian; Wible, Cynthia G.; Stern, Hal S.; Belger, Aysenil; McCarthy, Gregory; Ozyurt, Burak; Potkin, Steven G.
2015-01-01
The Function Biomedical Informatics Research Network (FBIRN) developed methods and tools for conducting multi-scanner functional magnetic resonance imaging (fMRI) studies. Method and tool development were based on two major goals: 1) to assess the major sources of variation in fMRI studies conducted across scanners, including instrumentation, acquisition protocols, challenge tasks, and analysis methods, and 2) to provide a distributed network infrastructure and an associated federated database to host and query large, multi-site, fMRI and clinical datasets. In the process of achieving these goals the FBIRN test bed generated several multi-scanner brain imaging data sets to be shared with the wider scientific community via the BIRN Data Repository (BDR). The FBIRN Phase 1 dataset consists of a traveling subject study of 5 healthy subjects, each scanned on 10 different 1.5 to 4 Tesla scanners. The FBIRN Phase 2 and Phase 3 datasets consist of subjects with schizophrenia or schizoaffective disorder along with healthy comparison subjects scanned at multiple sites. In this paper, we provide concise descriptions of FBIRN’s multi-scanner brain imaging data sets and details about the BIRN Data Repository instance of the Human Imaging Database (HID) used to publicly share the data. PMID:26364863
Assessing repository technology. Where do we go from here?
NASA Technical Reports Server (NTRS)
Eichmann, David
1992-01-01
Three sample information retrieval systems, archie, autoLib, and Wide Area Information Service (WAIS), are compared with regard to their expressiveness and usefulness, first in the general context of information retrieval, and then as prospective software reuse repositories. While the representational capabilities of these systems are limited, they provide a useful foundation for future repository efforts, particularly from the perspective of repository distribution and coherent user interface design.
Assessing repository technology: Where do we go from here?
NASA Technical Reports Server (NTRS)
Eichmann, David A.
1992-01-01
Three sample information retrieval systems, archie, autoLib, and Wide Area Information Service (WAIS), are compared with regard to their expressiveness and usefulness, first in the general context of information retrieval, and then as prospective software reuse repositories. While the representational capabilities of these systems are limited, they provide a useful foundation for future repository efforts, particularly from the perspective of repository distribution and coherent user interface design.
USAF Hearing Conservation Program, DOEHRS-HC Data Repository Annual Report: CY15
2017-05-31
AFRL-SA-WP-SR-2017-0014 USAF Hearing Conservation Program, DOEHRS-HC Data Repository Annual Report: CY15. Daniel A. Williams ... Health Readiness System-Hearing Conservation Data Repository (DOEHRS-HC DR). Major command- and installation-level reports are available quarterly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Broxton, D.E.; Chipera, S.J.; Byers, F.M. Jr.
1993-10-01
Outcrops of nonwelded tuff at six locations in the vicinity of Yucca Mountain, Nevada, were examined to determine their suitability for hosting a surface-based test facility for the Yucca Mountain Project. Investigators will use this facility to test equipment and procedures for the Exploratory Studies Facility and to conduct site characterization field experiments. The outcrops investigated contain rocks that include or are similar to the tuffaceous beds of Calico Hills, an important geologic and hydrologic barrier between the potential repository and the water table. The tuffaceous beds of Calico Hills at the site of the potential repository consist of both vitric and zeolitic tuffs; thus three of the outcrops examined are vitric tuffs and three are zeolitic tuffs. New data were collected to determine the lithology, chemistry, mineralogy, and modal petrography of the outcrops. Some preliminary data on hydrologic properties are also presented. Evaluation of suitability of the six sites is based on a comparison of their geologic characteristics to those found in the tuffaceous beds of Calico Hills within the exploration block.
Methods for pore water extraction from unsaturated zone tuff, Yucca Mountain, Nevada
Scofield, K.M.
2006-01-01
Assessing the performance of the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, requires an understanding of the chemistry of the water that moves through the host rock. The uniaxial compression method used to extract pore water from samples of tuffaceous borehole core was successful only for nonwelded tuff. An ultracentrifugation method was adopted to extract pore water from samples of the densely welded tuff of the proposed repository horizon. Tests were performed using both methods to determine the efficiency of pore water extraction and the potential effects on pore water chemistry. Test results indicate that uniaxial compression is most efficient for extracting pore water from nonwelded tuff, while ultracentrifugation is more successful in extracting pore water from densely welded tuff. Pore water splits collected from a single nonwelded tuff core during uniaxial compression tests have shown changes in pore water chemistry with increasing pressure for calcium, chloride, sulfate, and nitrate. Collecting pore water samples from the intermediate pressure ranges should avoid both the influence of re-dissolved evaporative salts and the addition of ion-deficient water from clays and zeolites. Chemistry of pore water splits from welded and nonwelded tuffs using ultracentrifugation indicates that there is no substantial fractionation of solutes.
10 CFR 960.3-4 - Environmental impacts.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Implementation Guidelines § 960.3-4 Environmental impacts. Environmental impacts shall be considered by the DOE throughout the site characterization, site selection, and repository development..., during site characterization and repository construction, operation, closure, and decommissioning. ...
NeuroMorpho.Org implementation of digital neuroscience: dense coverage and integration with the NIF
Halavi, Maryam; Polavaram, Sridevi; Donohue, Duncan E.; Hamilton, Gail; Hoyt, Jeffrey; Smith, Kenneth P.; Ascoli, Giorgio A.
2009-01-01
Neuronal morphology affects network connectivity, plasticity, and information processing. Uncovering the design principles and functional consequences of dendritic and axonal shape necessitates quantitative analysis and computational modeling of detailed experimental data. Digital reconstructions provide the required neuromorphological descriptions in a parsimonious, comprehensive, and reliable numerical format. NeuroMorpho.Org is the largest web-accessible repository service for digitally reconstructed neurons and one of the integrated resources in the Neuroscience Information Framework (NIF). Here we describe the NeuroMorpho.Org approach as an exemplary experience in designing, creating, populating, and curating a neuroscience digital resource. The simple three-tier architecture of NeuroMorpho.Org (web client, web server, and relational database) encompasses all necessary elements to support a large-scale, integrate-able repository. The data content, while heterogeneous in scientific scope and experimental origin, is unified in format and presentation by an in-house standardization protocol. The server application (MRALD) is secure, customizable, and developer-friendly. Centralized processing and expert annotation yield a comprehensive set of metadata that enriches and complements the raw data. The thoroughly tested interface design allows for optimal and effective data search and retrieval. Availability of data in both original and standardized formats ensures compatibility with existing resources and fosters further tool development. Other key functions enable extensive exploration and discovery, including 3D and interactive visualization of branching, frequently measured morphometrics, and reciprocal links to the original PubMed publications.
The integration of NeuroMorpho.Org with version-1 of the NIF (NIFv1) provides the opportunity to access morphological data in the context of other relevant resources and diverse subdomains of neuroscience, opening exciting new possibilities in data mining and knowledge discovery. The outcome of such coordination is the rapid and powerful advancement of neuroscience research at both the conceptual and technological level. PMID:18949582
NeuroMorpho.Org implementation of digital neuroscience: dense coverage and integration with the NIF.
Halavi, Maryam; Polavaram, Sridevi; Donohue, Duncan E; Hamilton, Gail; Hoyt, Jeffrey; Smith, Kenneth P; Ascoli, Giorgio A
2008-09-01
Neuronal morphology affects network connectivity, plasticity, and information processing. Uncovering the design principles and functional consequences of dendritic and axonal shape necessitates quantitative analysis and computational modeling of detailed experimental data. Digital reconstructions provide the required neuromorphological descriptions in a parsimonious, comprehensive, and reliable numerical format. NeuroMorpho.Org is the largest web-accessible repository service for digitally reconstructed neurons and one of the integrated resources in the Neuroscience Information Framework (NIF). Here we describe the NeuroMorpho.Org approach as an exemplary experience in designing, creating, populating, and curating a neuroscience digital resource. The simple three-tier architecture of NeuroMorpho.Org (web client, web server, and relational database) encompasses all necessary elements to support a large-scale, integrate-able repository. The data content, while heterogeneous in scientific scope and experimental origin, is unified in format and presentation by an in-house standardization protocol. The server application (MRALD) is secure, customizable, and developer-friendly. Centralized processing and expert annotation yield a comprehensive set of metadata that enriches and complements the raw data. The thoroughly tested interface design allows for optimal and effective data search and retrieval. Availability of data in both original and standardized formats ensures compatibility with existing resources and fosters further tool development. Other key functions enable extensive exploration and discovery, including 3D and interactive visualization of branching, frequently measured morphometrics, and reciprocal links to the original PubMed publications.
The integration of NeuroMorpho.Org with version-1 of the NIF (NIFv1) provides the opportunity to access morphological data in the context of other relevant resources and diverse subdomains of neuroscience, opening exciting new possibilities in data mining and knowledge discovery. The outcome of such coordination is the rapid and powerful advancement of neuroscience research at both the conceptual and technological level.
NASA Astrophysics Data System (ADS)
Myrbo, A.; Loeffler, S.; Ai, S.; McEwan, R.
2015-12-01
The ultimate EarthCube product has been described as a mobile app that provides all of the known geoscience data for a geographic point or polygon, from the top of the atmosphere to the core of the Earth, throughout geologic time. The database queries are hidden from the user, and the data are visually rendered for easy recognition of patterns and associations. This fanciful vision is not so remote: NSF EarthCube and Geoinformatics support has already fostered major advances in database interoperability and harmonization of APIs; numerous "domain repositories," databases curated by subject matter experts, now provide a vast wealth of open, easily-accessible georeferenced data on rock and sediment chemistry and mineralogy, paleobiology, stratigraphy, rock magnetics, and more. New datasets accrue daily, including many harvested from the literature by automated means. None of these constitute big data - all are part of the long tail of geoscience, heterogeneous data consisting of relatively small numbers of measurements made by a large number of people, typically on physical samples. This vision of mobile data discovery requires a software package to cleverly expose these domain repositories' holdings; currently, queries mainly come from single investigators to single databases. The NSF-funded mobile app Flyover Country (FC; fc.umn.edu), developed for geoscience outreach and education, has been welcomed by data curators and cyberinfrastructure developers as a testing ground for their API services, data provision, and scalability. FC pulls maps and data within a bounding envelope and caches them for offline use; location-based services alert users to nearby points of interest (POI). The incorporation of data from multiple databases across domains requires parsimonious data requests and novel visualization techniques, especially for mapping of data with a time or stratigraphic depth component. 
The preservation of data provenance and authority is critical for researcher buy-in to all community databases, and further allows exploration and suggestions of collaborators, based upon geography and topical relevance.
Karvounis, E C; Exarchos, T P; Fotiou, E; Sakellarios, A I; Iliopoulou, D; Koutsouris, D; Fotiadis, D I
2013-01-01
With an ever-increasing number of biological models available on the internet, a standardized modelling framework is required to allow information to be accessed and visualized. In this paper we propose a novel Extensible Markup Language (XML) based format called ART-ML that aims at supporting the interoperability and the reuse of models of geometry, blood flow, plaque progression and stent modelling, exported by any cardiovascular disease modelling software. ART-ML has been developed and tested using ARTool. ARTool is a platform for the automatic processing of various image modalities of coronary and carotid arteries. The images and their content are fused to develop morphological models of the arteries in 3D representations. All of the above procedures integrate disparate data formats, protocols and tools. ART-ML proposes a representation, expanding ARTool, for interoperability of the individual resources, creating a standard unified model for the description of data and, consequently, a format for their exchange and representation that is machine independent. More specifically, the ARTool platform incorporates efficient algorithms which are able to perform blood flow simulations and atherosclerotic plaque evolution modelling. Integration of data layers between different modules within ARTool is based upon the interchange of information included in the ART-ML model repository. ART-ML provides a markup representation that enables the representation and management of embedded models within the cardiovascular disease modelling platform, and the storage and interchange of well-defined information. The corresponding ART-ML model incorporates all relevant information regarding geometry, blood flow, plaque progression and stent modelling procedures.
All created models are stored in a model repository database which is accessible to the research community using efficient web interfaces, enabling the interoperability of any cardiovascular disease modelling software models. ART-ML can be used as a reference ML model in multiscale simulations of plaque formation and progression, incorporating all scales of the biological processes.
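The paper's abstract does not publish the ART-ML schema, so the sketch below is hypothetical: it builds and re-parses an XML container with one element per model layer the abstract lists (geometry, blood flow, plaque progression, stent), using Python's standard library. The element and attribute names are assumptions.

```python
# Hypothetical sketch of an ART-ML-style XML container. The element names
# are inferred from the four model layers the abstract describes; they are
# not taken from the real ART-ML schema.
import xml.etree.ElementTree as ET

root = ET.Element("art-ml", version="0.1")
for layer in ("geometry", "blood-flow", "plaque-progression", "stent"):
    section = ET.SubElement(root, layer)
    section.set("source", "ARTool")  # record provenance of the exported model

# Serialize to a machine-independent interchange string...
document = ET.tostring(root, encoding="unicode")

# ...and parse it back, as a consuming modelling tool would.
parsed = ET.fromstring(document)
layers = [child.tag for child in parsed]
```

The round trip (build, serialize, parse) is the interchange property the format is meant to provide: any tool that can read the XML recovers the same layered structure.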
Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit
NASA Technical Reports Server (NTRS)
Penn, John
2014-01-01
This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.
Grid Modernization Laboratory Consortium - Testing and Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroposki, Benjamin; Skare, Paul; Pratt, Rob
This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U.S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U.S. energy goals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.
2000-05-22
Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) are presented for two-phase flow in the vicinity of the repository under disturbed conditions resulting from drilling intrusions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis, and rank transformations are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure and brine flow from the repository to the Culebra Dolomite are potentially the most important in PA for the WIPP. Subsequent to a drilling intrusion, repository pressure was dominated by borehole permeability and was generally below the level (i.e., 8 MPa) that could potentially produce spallings and direct brine releases. Brine flow from the repository to the Culebra Dolomite tended to be small or nonexistent, with its occurrence and size also dominated by borehole permeability.
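The sampling-and-rank-correlation workflow named in this abstract can be illustrated with a toy model. The response function, parameter ranges, sample size, and seed below are invented for the sketch; they are not the WIPP PA's actual models or data.

```python
# Toy sketch of Latin hypercube sampling followed by rank (Spearman)
# correlation, illustrating the sensitivity-analysis workflow the abstract
# names. The response model and parameter ranges are invented.
import numpy as np

def latin_hypercube(n, d, rng):
    """n samples stratified into n equal bins per dimension on [0, 1)."""
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])  # decouple the dimensions
    return u

def rank_corr(x, y):
    """Spearman correlation: Pearson correlation of the ranks."""
    rx, ry = np.argsort(np.argsort(x)), np.argsort(np.argsort(y))
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(42)
u = latin_hypercube(200, 2, rng)
log_perm = -14.0 + 3.0 * u[:, 0]  # hypothetical log10 borehole permeability
gas_rate = u[:, 1]                # hypothetical normalized gas-generation rate

# Toy monotone response: pressure drops as the borehole becomes more
# permeable and rises with gas generation, plus a little noise.
pressure = (8.0 - 0.8 * (log_perm + 14.0) + 1.2 * gas_rate
            + 0.1 * rng.standard_normal(200))

sensitivity = {"log_perm": rank_corr(log_perm, pressure),
               "gas_rate": rank_corr(gas_rate, pressure)}
```

With this toy response, the permeability variable dominates the rank correlations, mirroring (in miniature) the abstract's finding that repository pressure was dominated by borehole permeability.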
10 CFR 60.78 - Material control and accounting records and reports.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Material control and accounting records and reports. 60.78 Section 60.78 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Records, Reports, Tests, and Inspections § 60.78 Material control and...
Implementation of Citrus Shoot Tip Cryopreservation in the USDA-ARS National Plant Germplasm System
USDA-ARS?s Scientific Manuscript database
The USDA-ARS National Plant Germplasm System (NPGS) maintains 540 Citrus cultivars and crop wild relatives as duplicate clones in a screenhouse at the National Clonal Germplasm Repository for Citrus and Dates (NCGRCD) in Riverside, California. These 540 accessions are pathogen-tested and apparently ...
10 CFR 60.72 - Construction records.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 2 2011-01-01 2011-01-01 false Construction records. 60.72 Section 60.72 Energy NUCLEAR..., Reports, Tests, and Inspections § 60.72 Construction records. (a) DOE shall maintain records of construction of the geologic repository operations area in a manner that ensures their useability for future...
10 CFR 60.72 - Construction records.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 2 2012-01-01 2012-01-01 false Construction records. 60.72 Section 60.72 Energy NUCLEAR..., Reports, Tests, and Inspections § 60.72 Construction records. (a) DOE shall maintain records of construction of the geologic repository operations area in a manner that ensures their useability for future...
DEVELOPMENT OF THE U.S. EPA HEALTH EFFECTS RESEARCH LABORATORY FROZEN BLOOD CELL REPOSITORY PROGRAM
In previous efforts, we suggested that proper blood cell freezing and storage is necessary in longitudinal studies with reduced between-test error, for specimen sharing between laboratories, and for convenient scheduling of assays. We continue to develop and upgrade programs for o...
An Economical DNA Test for Genetic Identity Confirmation in Blueberry
USDA-ARS?s Scientific Manuscript database
Blueberry (Vaccinium sp.) cultivation began in the early 20th Century in the U.S. Since then it has become a major crop in North America, South America, Europe, China, Japan, Australia and New Zealand. The United States Department of Agriculture (USDA) National Clonal Germplasm Repository (NCGR) in ...
10 CFR 63.16 - Review of site characterization activities. 2
Code of Federal Regulations, 2012 CFR
2012-01-01
... which such activities are carried out and to observe excavations, borings, and in situ tests, as they... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Preapplication Review § 63.16 Review of site characterization activities. 2 2 In addition to the review of site characterization activities...
10 CFR 63.16 - Review of site characterization activities. 2
Code of Federal Regulations, 2011 CFR
2011-01-01
... which such activities are carried out and to observe excavations, borings, and in situ tests, as they... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Preapplication Review § 63.16 Review of site characterization activities. 2 2 In addition to the review of site characterization activities...
10 CFR 63.16 - Review of site characterization activities. 2
Code of Federal Regulations, 2013 CFR
2013-01-01
... which such activities are carried out and to observe excavations, borings, and in situ tests, as they... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Preapplication Review § 63.16 Review of site characterization activities. 2 2 In addition to the review of site characterization activities...
10 CFR 63.16 - Review of site characterization activities. 2
Code of Federal Regulations, 2010 CFR
2010-01-01
... which such activities are carried out and to observe excavations, borings, and in situ tests, as they... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Preapplication Review § 63.16 Review of site characterization activities. 2 2 In addition to the review of site characterization activities...
10 CFR 63.16 - Review of site characterization activities. 2
Code of Federal Regulations, 2014 CFR
2014-01-01
... which such activities are carried out and to observe excavations, borings, and in situ tests, as they... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Preapplication Review § 63.16 Review of site characterization activities. 2 2 In addition to the review of site characterization activities...
NASA Astrophysics Data System (ADS)
Lee, K.; Buscheck, T. A.; Glascoe, L. G.; Gansemer, J.; Sun, Y.
2002-12-01
In support of the characterization of Yucca Mountain as a potential site for a geologic repository for high-level nuclear waste, the US Department of Energy conducted the Large Block Test (LBT) at nearby Fran Ridge. The LBT was conducted in an excavated 3 x 3 x 4.5 m block of partially saturated, fractured nonlithophysal Topopah Spring tuff, which is one of the host-rock units for the potential repository at Yucca Mountain. The LBT was one of a series of field-scale thermohydrologic tests conducted in the repository host-rock units. The LBT was heated by line heaters installed in five boreholes lying in a horizontal plane 2.75 m below the upper surface of the block. The field-scale thermal tests were designed to help investigators better understand the coupled thermohydrologic-mechanical-chemical processes that would occur in the host rock in response to the radioactive heat of decay from emplaced waste packages. The tests also provide data for the calibration and validation of numerical models used to analyze the thermohydrologic response of the near-field host rock and Engineered Barrier System (EBS). Using the NUFT code and the dual-permeability approach to representing fracture-matrix interaction, we simulated the thermohydrologic response of the block to a heating and cooling cycle. The primary goals of the analysis were to study the heat-flow mechanisms and water redistribution patterns in the boiling and sub-boiling zones, and to compare model results with measured temperature and liquid saturation data, and thereby evaluate two rock property data sets available for modeling thermohydrologic behavior in the rock. Model results were also used for model calibration and validation. We obtained a good to excellent match between model and observed temperatures, and found that the distinct dryout and condensation zones modeled above and below the heater level agreed fairly well with the liquid-saturation measurements.
We identified the best-fit data set by using a statistical analysis to compare model and field temperatures, and found that heat flow in the block was dominated by conduction.
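Because the statistical comparison found conduction-dominated heat flow, the block's qualitative temperature field can be approximated far more simply than with the full dual-permeability NUFT simulation. Below is a minimal 1-D transient-conduction sketch (explicit finite differences) for a line-heater plane inside the block; the thermal properties, heater temperature, and boundary conditions are generic tuff-like assumptions for illustration, not the calibrated LBT data sets.

```python
import numpy as np

def conduct_1d(T0, t_heater, k=2.0, rho=2300.0, cp=900.0,
               L=4.5, nx=46, dt=3600.0, n_steps=720):
    """Explicit FTCS solution of 1-D transient heat conduction.

    Conduction-only sketch of the LBT thermal response; material values
    (k in W/m-K, rho in kg/m^3, cp in J/kg-K) are assumed, not measured.
    """
    dx = L / (nx - 1)
    alpha = k / (rho * cp)                 # thermal diffusivity, m^2/s
    assert alpha * dt / dx**2 < 0.5        # FTCS stability criterion
    T = np.full(nx, float(T0))
    heater = int(2.75 / L * (nx - 1))      # heater plane ~2.75 m depth
    for _ in range(n_steps):               # 720 h of heating
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[heater] = t_heater               # heater plane held hot
        T[0] = T[-1] = T0                  # block faces at ambient
    return T

T = conduct_1d(T0=25.0, t_heater=140.0)
```

The explicit scheme requires the stability check shown; an implicit scheme (or a real simulator such as NUFT) would be needed for coupled fracture-matrix moisture redistribution.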
Making metadata usable in a multi-national research setting.
Ellul, Claire; Foord, Joanna; Mooney, John
2013-11-01
SECOA (Solutions for Environmental Contrasts in Coastal Areas) is a multi-national research project examining the effects of human mobility on urban settlements in fragile coastal environments. This paper describes the setting up of a SECOA metadata repository for non-specialist researchers such as environmental scientists and tourism experts. Conflicting usability requirements of two groups - metadata creators and metadata users - are identified along with associated limitations of current metadata standards. A description is given of a configurable metadata system designed to grow as the project evolves. This work is of relevance for similar projects such as INSPIRE. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
M4FT-16LL080302052-Update to Thermodynamic Database Development and Sorption Database Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zavarin, Mavrik; Wolery, T. J.; Atkins-Duffin, C.
2016-08-16
This progress report (Level 4 Milestone Number M4FT-16LL080302052) summarizes research conducted at Lawrence Livermore National Laboratory (LLNL) within the Argillite Disposal R&D Work Package Number FT-16LL08030205. The focus of this research is the thermodynamic modeling of Engineered Barrier System (EBS) materials and properties and development of thermodynamic databases and models to evaluate the stability of EBS materials and their interactions with fluids at various physico-chemical conditions relevant to subsurface repository environments. The development and implementation of equilibrium thermodynamic models are intended to describe chemical and physical processes such as solubility, sorption, and diffusion.
A Web-Based Data-Querying Tool Based on Ontology-Driven Methodology and Flowchart-Based Model
Ping, Xiao-Ou; Chung, Yufang; Liang, Ja-Der; Yang, Pei-Ming; Huang, Guan-Tarn; Lai, Feipei
2013-01-01
Background Because of the increased adoption rate of electronic medical record (EMR) systems, health care records have been accumulating rapidly in clinical data repositories. Therefore, querying the data stored in these repositories is crucial for retrieving knowledge from such large volumes of clinical data. Objective The aim of this study is to develop a Web-based approach for enriching the capabilities of the data-querying system along the three following considerations: (1) the interface design used for query formulation, (2) the representation of query results, and (3) the models used for formulating query criteria. Methods The Guideline Interchange Format version 3.5 (GLIF3.5), an ontology-driven clinical guideline representation language, was used for formulating the query tasks based on the GLIF3.5 flowchart in the Protégé environment. The flowchart-based data-querying model (FBDQM) query execution engine was developed and implemented for executing queries and presenting the results through a visual and graphical interface. To examine a broad variety of patient data, a clinical data generator was implemented to automatically generate the clinical data in the repository, and the generated data were then employed to evaluate the system. The accuracy and time performance of the system for three medical query tasks relevant to liver cancer were evaluated using the clinical data generator in experiments with varying numbers of patients. Results In this study, a prototype system was developed to test the feasibility of applying a methodology for building a query execution engine using FBDQMs by formulating query tasks using the existing GLIF. The FBDQM-based query execution engine was used to successfully retrieve the clinical data based on the query tasks formatted using the GLIF3.5 in the experiments with varying numbers of patients.
The accuracy of the three queries (i.e., “degree of liver damage,” “degree of liver damage when applying a mutually exclusive setting,” and “treatments for liver cancer”) was 100% for all four experiments (10 patients, 100 patients, 1000 patients, and 10,000 patients). Among the three measured query phases, (1) structured query language operations, (2) criteria verification, and (3) other, the first two had the longest execution times. Conclusions The ontology-driven FBDQM-based approach enriched the capabilities of the data-querying system. The adoption of the GLIF3.5 increased the potential for interoperability, shareability, and reusability of the query tasks. PMID:25600078
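The flowchart-based querying idea can be sketched as a tiny interpreter that walks decision nodes over a patient record until it reaches a leaf result. This is not the GLIF3.5/FBDQM engine itself; the node structure, field names, and clinical thresholds below are invented purely for illustration.

```python
def run_flowchart(nodes, start, patient):
    """Walk a flowchart of yes/no decision nodes over one patient record.

    Each inner node holds a predicate ('test') plus 'yes'/'no' successor
    names; leaves hold a 'result'. A minimal stand-in for a flowchart-based
    query model, with hypothetical lab-value criteria.
    """
    node = start
    while "result" not in nodes[node]:
        step = nodes[node]
        node = step["yes"] if step["test"](patient) else step["no"]
    return nodes[node]["result"]

# hypothetical "degree of liver damage" flowchart
nodes = {
    "alt_check": {"test": lambda p: p["ALT"] > 40,
                  "yes": "bili_check", "no": "normal"},
    "bili_check": {"test": lambda p: p["bilirubin"] > 1.2,
                   "yes": "severe", "no": "mild"},
    "normal": {"result": "no liver damage"},
    "mild": {"result": "mild liver damage"},
    "severe": {"result": "severe liver damage"},
}
result = run_flowchart(nodes, "alt_check", {"ALT": 120, "bilirubin": 2.0})
```

A real engine would additionally translate each predicate into SQL against the repository, which is where the paper reports most of the execution time being spent.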
Dermol, Urška; Kontić, Branko
2011-01-01
The benefits of strategic environmental considerations in the process of siting a repository for low- and intermediate-level radioactive waste (LILW) are presented. The benefits have been explored by analyzing differences between the two site selection processes. One is a so-called official site selection process, which is implemented by the Agency for radwaste management (ARAO); the other is an optimization process suggested by experts working in the area of environmental impact assessment (EIA) and land-use (spatial) planning. The criteria on which the comparison of the results of the two site selection processes has been based are spatial organization, environmental impact, safety in terms of potential exposure of the population to radioactivity released from the repository, and feasibility of the repository from the technical, financial/economic and social point of view (the latter relates to consent by the local community for siting the repository). The site selection processes have been compared with the support of the decision expert system named DEX. The results of the comparison indicate that the sites selected by ARAO meet fewer suitability criteria than those identified by applying strategic environmental considerations in the framework of the optimization process. This result stands when taking into account spatial, environmental, safety and technical feasibility points of view. Acceptability of a site by a local community could not have been tested, since the formal site selection process has not yet been concluded; this remains as an uncertain and open point of the comparison. Copyright © 2010 Elsevier Ltd. All rights reserved.
FY16 Summary Report: Participation in the KOSINA Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matteo, Edward N.; Hansen, Francis D.
Salt formations represent a promising host for disposal of nuclear waste in the United States and Germany; the two countries have developed safety cases for bedded salt and domal salt, respectively. Today, Germany and the United States find themselves in similar positions with respect to salt formations serving as repositories for heat-generating nuclear waste. German research centers are evaluating bedded and pillow salt formations to contrast with their previous safety case made for the Gorleben dome. Sandia National Laboratories is collaborating on this effort as an Associate Partner, and this report summarizes that teamwork. Sandia and German research groups have a long-standing cooperative approach to repository science, engineering, operations, safety assessment, testing, modeling, and other elements comprising the basis for salt disposal. Germany and the United States hold annual bilateral workshops, which cover a spectrum of issues surrounding the viability of salt formations. Notably, recent efforts include development of a database for features, events, and processes applying broadly and generically to bedded and domal salt. Another international teaming activity evaluates salt constitutive models, including hundreds of new experiments conducted on bedded salt from the Waste Isolation Pilot Plant. These extensive collaborations continue to build the scientific basis for salt disposal. Repository deliberations in the United States are revisiting bedded and domal salt for housing a nuclear waste repository. By agreeing to collaborate with German peers, our nation stands to benefit through assurance of scientific positions, exchange of operational concepts, and approaches to elements of the safety case, all reflecting cost and time efficiency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrada, J.J.
This report compiles preliminary information that supports the premise that a repository is needed in Latin America and analyzes the nuclear situation (mainly in Argentina and Brazil) in terms of nuclear capabilities, inventories, and regional spent-fuel repositories. The report is based on several sources and summarizes (1) the nuclear capabilities in Latin America and establishes the framework for the need of a permanent repository, (2) the International Atomic Energy Agency (IAEA) approach for a regional spent-fuel repository and describes the support that international institutions are lending to this issue, (3) the current situation in Argentina in order to analyze the Argentinean willingness to find a location for a deep geological repository, and (4) the issues involved in selecting a location for the repository and identifies a potential location. This report then draws conclusions based on an analysis of this information. The focus of this report is mainly on spent fuel and does not elaborate on other radiological waste sources.
Influence analysis of Github repositories.
Hu, Yan; Zhang, Jun; Bai, Xiaomei; Yu, Shuo; Yang, Zhuo
2016-01-01
With the support of cloud computing techniques, social coding platforms have changed the style of software development. Github is now the most popular social coding platform and project hosting service. Software developers of all levels keep joining Github and use it to host their public and private software projects. The large numbers of software developers and software repositories on Github pose new challenges to the world of software engineering. This paper tackles one of the important problems: analyzing the importance and influence of Github repositories. We propose a HITS-based influence analysis on graphs that represent the star relationship between Github users and repositories. A weighted version of HITS is applied to the overall star graph and generates a different set of top influential repositories than the standard version of the HITS algorithm. We also conduct the influence analysis on per-month star graphs, and study the monthly influence rankings of top repositories.
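The weighted-HITS idea described above can be sketched on a bipartite star graph: users act as hubs, repositories as authorities, and star edges carry weights. The power-iteration sketch below uses a tiny invented user-repository matrix; it is not the authors' implementation, and the edge weights are placeholders for whatever weighting (e.g. star recency) the analysis might use.

```python
import numpy as np

def weighted_hits(adj, n_iter=50):
    """Weighted HITS on a bipartite user -> repository star graph.

    adj: (n_users x n_repos) non-negative matrix; adj[u, r] > 0 if user u
    starred repository r, with the value acting as an edge weight.
    Returns (hub scores for users, authority scores for repositories).
    """
    n_users, n_repos = adj.shape
    hubs = np.ones(n_users)
    auths = np.ones(n_repos)
    for _ in range(n_iter):
        auths = adj.T @ hubs              # authority = weighted sum of hubs
        auths /= np.linalg.norm(auths)
        hubs = adj @ auths                # hub = weighted sum of authorities
        hubs /= np.linalg.norm(hubs)
    return hubs, auths

# toy star graph: 3 users, 2 repositories; repo 0 is starred by everyone
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 0.0]])
hubs, auths = weighted_hits(A)
```

With unit weights this reduces to standard HITS; varying the entries of `A` is what produces the different top-repository rankings the paper observes.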
Creation of Data Repositories to Advance Nursing Science.
Perazzo, Joseph; Rodriguez, Margaret; Currie, Jackson; Salata, Robert; Webel, Allison R
2017-12-01
Data repositories are a strategy in line with precision medicine and big data initiatives, and are an efficient way to maximize data utility and form collaborative research relationships. Nurse researchers are uniquely positioned to make a valuable contribution using this strategy. The purpose of this article is to present a review of the benefits and challenges associated with developing data repositories, and to describe the process we used to develop and maintain a data repository in HIV research. Systematic planning, data collection, synthesis, and data sharing have enabled us to conduct robust cross-sectional and longitudinal analyses with more than 200 people living with HIV. Our repository building has also led to collaboration and training, both in and out of our organization. We present a pragmatic and affordable way that nurse scientists can build and maintain a data repository, helping us continue to add to our understanding of health phenomena.
Wang, Amy Y; Lancaster, William J; Wyatt, Matthew C; Rasmussen, Luke V; Fort, Daniel G; Cimino, James J
2017-01-01
A major challenge in using electronic health record repositories for research is the difficulty matching subject eligibility criteria to query capabilities of the repositories. We propose categories for study criteria corresponding to the effort needed for querying those criteria: "easy" (supporting automated queries), mixed (initial automated querying with manual review), "hard" (fully manual record review), and "impossible" or "point of enrollment" (not typically in health repositories). We obtained a sample of 292 criteria from 20 studies from ClinicalTrials.gov. Six independent reviewers, three each from two academic research institutions, rated criteria according to our four types. We observed high interrater reliability both within and between institutions. The analysis demonstrated typical features of criteria that map with varying levels of difficulty to repositories. We propose using these features to improve enrollment workflow through more standardized study criteria, self-service repository queries, and analyst-mediated retrievals.
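The abstract reports high interrater reliability over the four criterion categories without naming the statistic here. A common choice for two raters over nominal categories such as these is Cohen's kappa; the sketch below uses hypothetical ratings, and the study's six raters would call for a multi-rater variant (e.g. Fleiss' kappa).

```python
from collections import Counter

CATEGORIES = ("easy", "mixed", "hard", "impossible")

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_obs - p_exp) / (1 - p_exp), where p_exp is the agreement
    expected by chance from each rater's marginal category frequencies.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_exp = sum(ca[c] * cb[c] for c in CATEGORIES) / n**2
    return (p_obs - p_exp) / (1 - p_exp)

# hypothetical ratings of eight eligibility criteria by two reviewers
a = ["easy", "easy", "mixed", "hard", "easy", "impossible", "mixed", "easy"]
b = ["easy", "easy", "mixed", "hard", "mixed", "impossible", "mixed", "easy"]
kappa = cohens_kappa(a, b)
```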
Space Telecommunications Radio System (STRS) Application Repository Design and Analysis
NASA Technical Reports Server (NTRS)
Handler, Louis M.
2013-01-01
The Space Telecommunications Radio System (STRS) Application Repository Design and Analysis document describes the STRS application repository for software-defined radio (SDR) applications intended to be compliant with the STRS Architecture Standard. The document provides information about the submission of artifacts to the STRS application repository, informs potential users of that information, and helps the systems engineer understand the requirements, concepts, and approach of the STRS application repository. The STRS application repository is intended to capture knowledge, documents, and other artifacts for each waveform application or other application outside of its project so that when the project ends, the knowledge is retained. The document describes the transfer of technology from mission to mission, capturing lessons learned that are used for continuous improvement across projects and supporting NASA Procedural Requirements (NPRs) for performing software engineering projects and NASA's release process.
[Subject repositories in the strategy of the Open Access initiative].
Soares Guimarães, M C; da Silva, C H; Horsth Noronha, I
2012-11-01
Subject repositories are defined as collections of digital objects resulting from research in a specific disciplinary field, and they occupy a still-restricted space on the discussion agenda of the Open Access movement compared with the breadth of the discussion of Institutional Repositories. Although subject repositories have come to prominence in the field, especially through the success of initiatives such as arXiv, PubMed and E-prints, the literature on the subject is recognized as very limited. Despite their roots in Library and Information Science and their focus on the management of disciplinary collections (subject-area literature), there is little information available about the development and management of subject repositories. The following text offers a brief summary of the topic as a way to present the potential of subject repositories to strengthen the open access initiative.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molecke, M.A.; Sorensen, N.R.; Wicks, G.G.
The three papers in this report were presented at the second international workshop to feature the Waste Isolation Pilot Plant (WIPP) Materials Interface Interactions Test (MIIT). This Workshop on In Situ Tests on Radioactive Waste Forms and Engineered Barriers was held in Corsendonk, Belgium, on October 13--16, 1992, and was sponsored by the Commission of the European Communities (CEC). The Studiecentrum voor Kernenergie/Centre d'Energie Nucleaire (SCK/CEN, Belgium) and the US Department of Energy (via Savannah River) also cosponsored this workshop. Workshop participants from Belgium, France, Germany, Sweden, and the United States gathered to discuss the status, results and overviews of the MIIT program. Nine of the twenty-five total workshop papers were presented on the status and results from the WIPP MIIT program after the five-year in situ conclusion of the program. The total number of published MIIT papers is now up to almost forty. Posttest laboratory analyses are still in progress at multiple participating laboratories. The first MIIT paper in this document, by Wicks and Molecke, provides an overview of the entire test program and focuses on the waste form samples. The second paper, by Molecke and Wicks, concentrates on technical details and repository-relevant observations on the in situ conduct, sampling, and termination operations of the MIIT. The third paper, by Sorensen and Molecke, presents and summarizes the available laboratory posttest corrosion data and results for all of the candidate waste container or overpack metal specimens included in the MIIT program.
FY15 Report on Thermomechanical Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Francis D.; Buchholz, Stuart
2015-08-01
Sandia is participating in the third phase of a United States (US)-German Joint Project that compares constitutive models and simulation procedures on the basis of model calculations of the thermomechanical behavior and healing of rock salt (Salzer et al. 2015). The first goal of the project is to evaluate the ability of numerical modeling tools to correctly describe the relevant deformation phenomena in rock salt under various influences. Among the numerical modeling tools required to address this are constitutive models that are used in computer simulations for the description of the thermal, mechanical, and hydraulic behavior of the host rock under various influences and for the long-term prediction of this behavior. Achieving this goal will lead to increased confidence in the results of numerical simulations related to the secure disposal of radioactive wastes in rock salt. Results of the Joint Project may ultimately be used to make various assertions regarding stability analysis of an underground repository in salt during the operating phase as well as the long-term integrity of the geological barrier in the post-operating phase. A primary evaluation of constitutive model capabilities comes by way of predicting large-scale field tests. The Joint Project partners decided to model Waste Isolation Pilot Plant (WIPP) Rooms B and D, which are full-scale rooms having the same dimensions. Room D deformed under natural, ambient conditions while Room B was thermally driven by an array of waste-simulating heaters (Munson et al. 1988; 1990). Existing laboratory test data for WIPP salt were carefully scrutinized, and the partners decided that additional testing would be needed to help evaluate advanced features of the constitutive models. The German partners performed over 140 laboratory tests on WIPP salt at no charge to the US Department of Energy (DOE).
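Most salt constitutive models of the kind compared in the Joint Project build on a thermally activated secondary-creep term. A minimal sketch of that building block is the Norton power law below; the parameter values are illustrative placeholders, not calibrated WIPP values, and full models (e.g. Munson-Dawson) add transient and healing terms on top of this.

```python
import math

def norton_creep_rate(stress_mpa, temp_k, A=8.1e-5, n=5.0, Q=54_000.0):
    """Steady-state (secondary) creep rate for rock salt, Norton form:

        d(eps)/dt = A * sigma^n * exp(-Q / (R * T))

    A, n, and the activation energy Q (J/mol) are assumed illustrative
    values; calibrated salt models fit them to laboratory creep tests.
    """
    R = 8.314  # gas constant, J/(mol K)
    return A * stress_mpa**n * math.exp(-Q / (R * temp_k))

# ambient room (Room D-like) vs thermally driven room (Room B-like)
ambient = norton_creep_rate(stress_mpa=15.0, temp_k=300.0)
heated = norton_creep_rate(stress_mpa=15.0, temp_k=330.0)
```

The Arrhenius factor is why the heated Room B closes faster than the ambient Room D at the same stress.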
An intelligent content discovery technique for health portal content management.
De Silva, Daswin; Burstein, Frada
2014-04-23
Continuous content management of health information portals is a feature vital for their sustainability and widespread acceptance. Knowledge and experience of a domain expert is essential for content management in the health domain. The rate of generation of online health resources is exponential, and thereby manual examination for relevance to a specific topic and audience is a formidable challenge for domain experts. Intelligent content discovery for effective content management is a less-researched topic. An existing expert-endorsed content repository can provide the necessary leverage to automatically identify relevant resources and evaluate qualitative metrics. This paper reports on the design research towards an intelligent technique for automated content discovery and ranking for health information portals. The proposed technique aims to improve the efficiency of the current, mostly manual process of portal content management by utilising an existing expert-endorsed content repository as a supporting base and a benchmark to evaluate the suitability of new content. A model for content management was established based on a field study of potential users. The proposed technique is integral to this content management model and executes in several phases (i.e., query construction, content search, text analytics and fuzzy multi-criteria ranking). The construction of multi-dimensional search queries with input from Wordnet, the use of multi-word and single-word terms as representative semantics for text analytics, and the use of fuzzy multi-criteria ranking for subjective evaluation of quality metrics are original contributions reported in this paper. The feasibility of the proposed technique was examined with experiments conducted on an actual health information portal, the BCKOnline portal.
Both intermediary and final results generated by the technique are presented in the paper; these help establish the benefits of the technique and its contribution towards effective content management. The prevalence of large numbers of online health resources is a key obstacle for domain experts involved in content management of health information portals and websites. The proposed technique has proven successful at the search and identification of resources and the measurement of their relevance. It can be used to support the domain expert in content management and thereby ensure the health portal is up-to-date and current.
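The fuzzy multi-criteria ranking phase can be sketched as a weighted aggregation of per-criterion membership degrees in [0, 1]. The criterion names, weights, and scores below are invented for illustration; the BCKOnline technique's actual membership functions and criteria are not reproduced here.

```python
def fuzzy_rank(resources, weights):
    """Rank candidate resources by weighted fuzzy criterion scores.

    resources: list of (name, {criterion: degree in [0, 1]}) pairs.
    weights:   {criterion: importance weight}.
    A simple weighted-average aggregation; real fuzzy MCDM methods may use
    other operators (min, ordered weighted averaging, etc.).
    """
    def aggregate(scores):
        total = sum(weights.values())
        return sum(weights[c] * scores[c] for c in weights) / total
    return sorted(resources, key=lambda r: aggregate(r[1]), reverse=True)

# hypothetical candidate resources with fuzzy quality-metric scores
candidates = [
    ("resource-a", {"relevance": 0.9, "currency": 0.4, "credibility": 0.8}),
    ("resource-b", {"relevance": 0.6, "currency": 0.9, "credibility": 0.9}),
    ("resource-c", {"relevance": 0.3, "currency": 0.5, "credibility": 0.4}),
]
ranked = fuzzy_rank(candidates, {"relevance": 0.5, "currency": 0.2, "credibility": 0.3})
```

In a pipeline like the one described, the scores would come from the text-analytics phase rather than being supplied by hand.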
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferreira, Eduardo G.A.; Marumo, Julio T.; Vicente, Roberto
2012-07-01
Portland cement materials are widely used as engineered barriers in repositories for radioactive waste. The capacity of such barriers to prevent disposed radionuclides from entering the biosphere in the long term depends on the service life of those materials. Thus, the performance assessment of structural materials under the environmental conditions prevailing in the environs of repositories is a matter of interest. The durability of cement paste foreseen as backfill in a deep borehole for disposal of disused sealed radioactive sources is investigated in the development of the repository concept. Results are intended to be part of the body of evidence in the safety case of the proposed disposal technology. This paper presents the results of X-Ray Diffraction (XRD) analysis of cement paste exposed to varying temperatures and simulated groundwater after the samples received the radiation dose that the cement paste will accumulate until complete decay of the radioactive sources. The XRD analysis of the cement paste samples performed in this work revealed some differences among specimens subjected to different treatments. Cluster analysis of the results was able to group tested samples according to the applied treatments. Mineralogical differences, however, are tenuous and, apart from ettringite, are hardly observed. The absence of ettringite in all seven specimens kept in dry storage at high temperature is unlikely to have resulted from natural variations in the composition of hydrated cement paste, because ettringite is observed in all other tested specimens. This absence is therefore the result of the treatments and could be explained by the decomposition of ettringite. Although the decomposition temperature is about 110-120 deg. C, ettringite may initially decompose to meta-ettringite, an amorphous compound, above 50 deg. C in the absence of water.
Influence of irradiation on the mineralogical composition was not observed, whether the treatment was analyzed individually or for possible synergistic effects with other treatments. However, the radiation dose to which the specimens were exposed is only a fraction of the dose that cement paste will accumulate until complete decay of some sources. Therefore, in the short term, the conditions deemed to prevail in the repository environment may not influence the properties of cement paste at detectable levels. Under the conditions presented in this work, it is not possible to predict the long-term evolution of these properties. (authors)
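The cluster analysis the paper applies to XRD results can be sketched with a pairwise pattern-distance matrix: patterns whose peak sets match (e.g. both retain an ettringite reflection) sit close together, while treated specimens that lost a phase separate out. The synthetic patterns and peak positions below are illustrative, not the paper's data.

```python
import numpy as np

def correlation_distance_matrix(patterns):
    """Pairwise (1 - Pearson r) distances between XRD intensity patterns.

    patterns: arrays of intensities sampled on a common 2-theta grid.
    A minimal stand-in for clustering XRD results; a full analysis would
    feed this matrix into hierarchical clustering.
    """
    P = np.asarray(patterns, dtype=float)
    return 1.0 - np.corrcoef(P)

rng = np.random.default_rng(0)
two_theta = np.linspace(5, 60, 200)
base = np.exp(-(two_theta - 29.4) ** 2 / 0.5)        # shared peak (assumed position)
ettringite = np.exp(-(two_theta - 9.1) ** 2 / 0.5)   # ettringite-like peak (assumed)
group_wet = [base + ettringite + 0.01 * rng.standard_normal(200) for _ in range(3)]
group_dry = [base + 0.01 * rng.standard_normal(200) for _ in range(3)]
D = correlation_distance_matrix(group_wet + group_dry)
```

Within-group distances stay near zero while the missing-ettringite group separates cleanly, which is the behavior the paper's clustering exploits.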
New Features of the re3data Registry of Research Data Repositories
NASA Astrophysics Data System (ADS)
Elger, K.; Pampel, H.; Vierkant, P.; Witt, M.
2016-12-01
re3data is a registry of research data repositories that lists over 1,600 repositories from around the world, making it the largest and most comprehensive online catalog of data repositories on the web. The registry offers researchers, funding agencies, libraries and publishers a comprehensive overview of the heterogeneous landscape of data repositories. The repositories are described following the "Metadata Schema for the Description of Research Data Repositories". re3data summarises the properties of each repository in a user-friendly icon system, helping users to easily identify an adequate repository for the storage of their datasets. The re3data entries are curated by an international, multi-disciplinary editorial board. An application programming interface (API) enables other information systems to list and fetch metadata for integration and interoperability. Funders like the European Commission (2015) and publishers like Springer Nature (2016) recommend the use of re3data.org in their policies. The original re3data project partners are the GFZ German Research Centre for Geosciences, the Humboldt-Universität zu Berlin, the Purdue University Libraries and the Karlsruhe Institute of Technology (KIT). Since 2015, re3data has been operated as a service of DataCite, a global non-profit organisation that provides persistent identifiers (DOIs) for research data. At the 2016 AGU Fall Meeting we will describe the current status of re3data, give an overview of the major developments and new features, and present our plans to increase the quality of the re3data entries.
NASA Astrophysics Data System (ADS)
Badwe, Sunil
In nuclear repository conditions, the nuclear waste package wall surfaces will be at elevated temperatures because of the heat generated by radioactive decay within the waste. It is anticipated that the ground water may contain varying levels of anions such as chloride, nitrate, and sulfate picked up from the rocks. The ground waters could seep through rock faults and drip onto the waste packages. The dripped water will evaporate due to the heat from the nuclear waste, leaving behind concentrated brine which eventually becomes a dry salt deposit. The multi-ionic salts in the ground water are expected to be hygroscopic in nature. The next drop of water falling at the same place, or the humidity in the repository, will transform the hygroscopic salt deposit into a more concentrated brine. This cycle will continue for years, and eventually a potentially corrosive brine will be formed on the waste package surface. Hence the waste package surface goes through alternate wet-dry cycles. These conditions indicate that the concentration and pH of the environment in the repository vary considerably. Conventional corrosion tests hardly simulate these varying environmental conditions. Hence there has been a need to develop an electrochemical test that could closely simulate the anticipated repository conditions stated above. In this research, a new electrochemical method, called Heated Surface Corrosion Testing (HSCT), has been devised and tested. In conventional testing the electrolyte is heated; in HSCT the working electrode itself is heated. The present study employs a temperature of 80°C, which may be representative of the waste package surface. The new HSCT was validated by testing stainless steel type 304. The HSCT was observed to be more aggressive than the conventional tests.
Initiation of pitting of SS 304 in chloride solution (pH 3) occurred at much shorter exposure times in the HSCT condition than the exposure time required for pitting in conventional testing. The reduced time to pitting demonstrated the capability of HSCT to impose more corrosive repository conditions. The stability of the passive film of stainless alloys under the hygroscopic salt layers could be determined using this technique. Alloy 22, a nickel-base Ni-22Cr-13Mo-3W alloy, has excellent corrosion resistance in oxidizing and reducing environments. Corrosion behavior of Alloy 22 was evaluated using the newly devised HSCT method in simulated acidified water (SAW), simulated concentrated water (SCW), and pure chloride (pH 3 and 8) environments. In this method, the concentration of the environment varied with test duration. Alloy 22 was evaluated in four different heat-treated conditions: (a) mill annealed, (b) 610°C/1 h, representing Cr depletion, (c) 650°C/100 h, representing Mo+Cr depletion, and (d) 800°C/100 h, representing Mo depletion. The corrosion rate of mill-annealed Alloy 22 was not affected by the continuous increase in ionic strength of the SAW (pH 3) environment. Passivation kinetics was faster with increasing concentration of the electrolytes. The major difference between the conventional test and HSCT was the aging characteristics of the passive film of Alloy 22. Cyclic polarization was carried out on Alloy 22 using the conventional ASTM G61 method and the HSCT method for comparison. The electrochemical response of Alloy 22 was the same whether the electrolyte or the electrode was heated. The corrosion behavior of Alloy 22 was investigated in three different aged conditions using the HSCT approach in two different electrolytes. The thermal aging conditions of the specimens introduced depletion of chromium and molybdenum near the grain boundaries/phase boundaries.
Long-term exposure tests (up to 850 h) were conducted in simulated acidified water (SAW, pH 3) and simulated concentrated water (SCW, pH 8) at 80°C. Corrosion potential, corrosion current, and passive current decay exponent were determined at regular intervals. The specimens aged at 610°C/1 h and 800°C/100 h showed almost identical corrosion behaviors in the SAW environment. The specimen aged at 650°C/100 h showed lower corrosion resistance in the SAW environment, indicating the effect of the Mo-depletion profile near the grain boundaries. The specimen aged at 800°C for 100 h showed lower corrosion resistance in the SCW environment because of possible dissolution of the Mo-rich precipitates. Compared to the mill-annealed condition, the aged specimens showed approximately an order of magnitude higher corrosion current in the SAW environment and almost similar corrosion currents in the SCW environment. Results also indicate that the passivity of Alloy 22, both in mill-annealed and in aged conditions, was not hampered during dry-out/rewet cycles. The presence of nitrate and other oxyanions in the SAW environment reduced the charge required to form a stable passive film on the aged Alloy 22 samples as compared to the charge passed in the pure chloride pH 3 environments. The passive film of the aged Alloy 22 specimens exposed to pure chloride solutions showed predominantly n-type semiconducting behavior and the onset of p-type semiconductivity at higher potentials. The charge carrier density of the passive film of Alloy 22 varied in the range 1.5-9.0 x 10^21/cm^3. The predominant charge carriers could be oxygen vacancies. An increase in the charge carrier density was observed in the specimen aged at 800°C/100 h when exposed to the pH 3 solution as compared to exposure in the pH 8 solution. In summary, Alloy 22 sustained the heated surface corrosion test without any appreciable surface attack in the simulated repository environments as well as the more corrosive chloride environments.
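Charge carrier densities of a passive film, such as those quoted above, are typically extracted from a Mott-Schottky plot: for an n-type film, 1/C^2 varies linearly with electrode potential, and the donor density follows from the slope. The sketch below recovers a known density from synthetic data; the dielectric constant and all numbers are illustrative assumptions, not values from this work.

```python
import numpy as np

def donor_density(potential_v, capacitance_f_cm2, eps_r=15.6):
    """Donor density (cm^-3) of an n-type film from Mott-Schottky data.

    Mott-Schottky relation: 1/C^2 = (2 / (e eps_r eps0 N_d)) (E - E_fb - kT/e),
    so N_d = 2 / (e eps_r eps0 slope), with slope from a linear fit of
    1/C^2 vs E. eps_r = 15.6 is a commonly assumed value for Cr-rich
    passive oxides; treat it as an input, not a constant of nature.
    """
    e = 1.602e-19         # elementary charge, C
    eps0 = 8.854e-14      # vacuum permittivity, F/cm
    inv_c2 = 1.0 / np.asarray(capacitance_f_cm2) ** 2
    slope = np.polyfit(potential_v, inv_c2, 1)[0]   # F^-2 cm^4 V^-1
    return 2.0 / (e * eps_r * eps0 * slope)

# synthetic n-type data built from a known density, then recovered
E = np.linspace(0.0, 0.5, 20)
Nd_true = 5e21                                   # cm^-3, inside the quoted range
slope = 2.0 / (1.602e-19 * 15.6 * 8.854e-14 * Nd_true)
C = 1.0 / np.sqrt(slope * (E + 0.2))             # assumed 0.2 V flat-band offset
Nd = donor_density(E, C)
```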
10 CFR 960.5-2-3 - Meteorology.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Preclosure Guidelines Preclosure Radiological Safety § 960.5-2-3 Meteorology. (a) Qualifying condition. The site shall be located such that expected meteorological conditions during repository.... Prevailing meteorological conditions such that any radioactive releases to the atmosphere during repository...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karasaki, K.; Galloway, D.
1991-06-01
The planned high-level nuclear waste repository at Yucca Mountain, Nevada, would exist in unsaturated, fractured welded tuff. One possible contaminant pathway to the accessible environment is transport by groundwater infiltrating to the water table and flowing through the saturated zone. Therefore, an effort to characterize the hydrology of the saturated zone is being undertaken in parallel with that of the unsaturated zone. As a part of the saturated zone investigation, three wells, UE-25c#1, UE-25c#2, and UE-25c#3 (hereafter called the c-holes), were drilled to study hydraulic and transport properties of rock formations underlying the planned waste repository. The location of the c-holes is such that the formations penetrated in the unsaturated zone occur at similar depths and with similar thicknesses as at the planned repository site. In characterizing a highly heterogeneous flow system, several issues emerge. (1) The characterization strategy should allow for the virtual impossibility of enumerating and characterizing all heterogeneities. (2) The methodology to characterize the heterogeneous flow system at the scale of the well tests needs to be established. (3) Tools need to be developed for scaling up the information obtained at the well-test scale to the larger scale of the site. In the present paper, the characterization strategy and the methods under development are discussed with a focus on the design and analysis of the field experiments at the c-holes.
NASA Astrophysics Data System (ADS)
Ward, Dennis W.; Bennett, Kelly W.
2017-05-01
The Sensor Information Testbed COllaborative Research Environment (SITCORE) and the Automated Online Data Repository (AODR) are significant enablers of the U.S. Army Research Laboratory (ARL)'s Open Campus Initiative and together create a highly collaborative research laboratory and testbed environment focused on sensor data and information fusion. SITCORE creates a virtual research development environment allowing collaboration from other locations, including DoD, industry, academia, and coalition facilities. SITCORE combined with AODR provides end-to-end algorithm development, experimentation, demonstration, and validation. The AODR enterprise allows ARL, as well as other government organizations, industry, and academia, to store and disseminate multiple-intelligence (Multi-INT) datasets collected at field exercises and demonstrations, and to facilitate research and development (R and D) and the advancement of analytical tools and algorithms supporting the Intelligence, Surveillance, and Reconnaissance (ISR) community. The AODR provides a potential central repository for standards-compliant datasets to serve as the "go-to" location for lessons learned and reference products. Many of the AODR datasets have associated ground truth and other metadata, which provides a rich and robust data suite for researchers to develop, test, and refine their algorithms. Researchers download the test data to their own environments using a sophisticated web interface. The AODR allows researchers to request copies of stored datasets and the government to process the requests and approvals in an automated fashion. Access to the AODR requires two-factor authentication in the form of a Common Access Card (CAC) or External Certificate Authority (ECA) certificate.
New directions in medical e-curricula and the use of digital repositories.
Fleiszer, David M; Posel, Nancy H; Steacy, Sean P
2004-03-01
Medical educators involved in the growth of multimedia-enhanced e-curricula are increasingly aware of the need for digital repositories to catalogue, store, and ensure access to learning objects that are integrated within their online material. The experience at the Faculty of Medicine at McGill University during initial development of a mainstream electronic curriculum reflects this growing recognition that repositories can facilitate the development of more comprehensive as well as more effective electronic curricula. Digital repositories can also help to ensure efficient utilization of resources through the use, re-use, and reprocessing of multimedia learning objects, addressing the potential for collaboration among repositories and increasing available material exponentially. The authors review different approaches to the development of a digital repository application, as well as global and specific issues that should be examined in the initial requirements definition and development phase, to ensure current initiatives meet long-term requirements. Often, decisions regarding the creation of e-curricula and associated digital repositories are left to interested faculty and their individual development teams. However, the development of e-curricula and digital repositories is not predominantly a technical exercise, but rather one that affects global pedagogical strategies and curricular content and involves a commitment of large-scale resources. Outcomes of these decisions can have long-term consequences and, as such, should involve faculty at the highest levels, including the dean.
Thermal Analysis of a Nuclear Waste Repository in Argillite Host Rock
NASA Astrophysics Data System (ADS)
Hadgu, T.; Gomez, S. P.; Matteo, E. N.
2017-12-01
Disposal of high-level nuclear waste in a geological repository requires analysis of heat distribution as a result of decay heat. Such an analysis supports design of the repository layout to define the repository footprint, as well as providing information of importance to the overall design. The analysis is also used in the study of potential migration of radionuclides to the accessible environment. In this study, thermal analysis for high-level waste and spent nuclear fuel in a generic repository in argillite host rock is presented. The thermal analysis utilized both semi-analytical and numerical modeling in the near field of a repository. The semi-analytical method treats heat transport by conduction in the repository and surroundings. The results of this method are temperature histories at selected radial distances from the waste package. A 3-D thermal-hydrologic numerical simulation was also performed to study fluid and heat distribution in the near field. The thermal analysis assumed a generic geological repository at 500 m depth. For the semi-analytical method, a backfilled closed repository was assumed with basic design and material properties. For the thermal-hydrologic numerical method, a repository layout with disposal in horizontal boreholes was assumed. The 3-D modeling domain covers a limited portion of the repository footprint to enable a detailed thermal analysis. A highly refined unstructured mesh was used, with increased discretization near heat sources and at intersections of different materials. All simulations considered different parameter values for properties of components of the engineered barrier system (i.e., buffer, disturbed rock zone, and host rock) and different surface storage times. Results of the different modeling cases are presented and include temperature and fluid flow profiles in the near field at different simulation times. 
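Semi-analytical near-field methods of the kind described above are typically built from closed-form conduction solutions. As a hedged sketch (not the study's code), the example below evaluates the classical continuous point source in an infinite medium, ΔT(r, t) = Q / (4πkr) · erfc(r / (2√(αt))), to produce temperature histories at selected radial distances; the thermal properties are illustrative assumptions for an argillite-like rock, not values from the paper.

```python
import math

def point_source_dT(Q_watts, r_m, t_s, k=1.7, rho=2450.0, cp=900.0):
    """Temperature rise (K) at radius r_m and time t_s from a constant
    point heat source Q_watts in an infinite conducting medium.

    k   : thermal conductivity, W/(m K)   (assumed)
    rho : rock density, kg/m^3            (assumed)
    cp  : specific heat, J/(kg K)         (assumed)
    """
    alpha = k / (rho * cp)  # thermal diffusivity, m^2/s
    return Q_watts / (4.0 * math.pi * k * r_m) * math.erfc(
        r_m / (2.0 * math.sqrt(alpha * t_s)))

# Temperature rise at selected radial distances after 100 years of heating
# by a 1 kW source (roughly the scale of a waste-package decay heat load).
YEAR = 3.156e7  # seconds per year
for r in (1.0, 5.0, 10.0):
    dT = point_source_dT(Q_watts=1000.0, r_m=r, t_s=100 * YEAR)
    print(f"r = {r:4.1f} m: dT = {dT:6.1f} K")
```

In practice, a decaying source is handled by superposing such solutions in time, and finite waste packages by superposing line or finite-length sources in space.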
Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND2017-8295 A.
The design and implementation of image query system based on color feature
NASA Astrophysics Data System (ADS)
Yao, Xu-Dong; Jia, Da-Chun; Li, Lin
2013-07-01
ASP.NET technology was used to construct a B/S (browser/server) mode image query system. The theory and techniques of database design, color feature extraction from images, and indexing and retrieval in the construction of the image repository were researched. The system was tested in campus LAN and WAN environments. The test results show that the system architecture design meets users' needs for querying related resources.
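The paper does not give its retrieval algorithm, so the following is a minimal, hedged sketch of one standard approach to color-feature image query: each image is reduced to a normalized joint RGB histogram, and query results are ranked by histogram intersection similarity. The repository contents and names are invented for illustration.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Normalized joint RGB histogram of an HxWx3 uint8 image."""
    hist, _ = np.histogramdd(image.reshape(-1, 3),
                             bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    return hist.ravel() / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1.0 means identical color distributions."""
    return np.minimum(h1, h2).sum()

def query(repository, query_image):
    """Return repository keys ranked by color similarity to the query."""
    q = color_histogram(query_image)
    scores = {name: histogram_intersection(q, color_histogram(img))
              for name, img in repository.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Tiny synthetic repository: a red image, a blue image, and random noise.
rng = np.random.default_rng(0)
red = np.zeros((16, 16, 3), np.uint8); red[..., 0] = 250
blue = np.zeros((16, 16, 3), np.uint8); blue[..., 2] = 250
noise = rng.integers(0, 256, (16, 16, 3), dtype=np.uint8)
repo = {"red": red, "blue": blue, "noise": noise}
print(query(repo, red))  # the red image should rank first
```

In a database-backed system like the one described, the histograms would be precomputed and stored as the index, so a query only computes one histogram and compares it against stored feature vectors.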
10 CFR 960.5-2-5 - Environmental quality.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Preclosure Guidelines Environment, Socioeconomics, and Transportation § 960.5-2-5 Environmental... repository siting, construction, operation, closure, and decommissioning, and projected environmental impacts... of the repository or its support facilities on, a component of the National Park System, the National...
10 CFR 60.1 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES General..., special nuclear, and byproduct material at a geologic repository operations area sited, constructed, or... at a geologic repository operations area sited, constructed, or operated at Yucca Mountain, Nevada...
10 CFR 60.31 - Construction authorization.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORIES Licenses Construction Authorization § 60.31 Construction authorization. Upon review and... in a geologic repository operations area of the design proposed without unreasonable risk to the...: (1) DOE has described the proposed geologic repository including but not limited to: (i) The geologic...
10 CFR 960.5-2-6 - Socioeconomic impacts.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Preclosure Guidelines Environment, Socioeconomics, and Transportation § 960.5-2-6 Socioeconomic... and/or economic impacts induced in communities and surrounding regions by repository siting... significant repository-related impacts on community services, housing supply and demand, and the finances of...
10 CFR 60.15 - Site characterization.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses... the geologic repository to the extent practical. (2) The number of exploratory boreholes and shafts... characterization. (3) To the extent practical, exploratory boreholes and shafts in the geologic repository...
10 CFR 960.5-1 - System guidelines.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Preclosure Guidelines § 960.5-1 System guidelines. (a) Qualifying conditions—(1) Preclosure... radioactive materials to restricted and unrestricted areas during repository operation and closure shall meet... repository siting, construction, operation, closure, and decommissioning the public and the environment shall...
GENESI-DR - A single access point to Earth Science data
NASA Astrophysics Data System (ADS)
Cossu, R.; Goncalves, P.; Pacini, F.
2009-04-01
The amount of information being generated about our planet is increasing at an exponential rate, but it must be easily accessible in order to apply it to the global needs relating to the state of the Earth. Currently, information about the state of the Earth, relevant services, analysis results, applications, and tools is accessible in a very scattered and uncoordinated way, often through individual initiatives from Earth Observation mission operators, scientific institutes dealing with ground measurements, service companies, data catalogues, etc. A dedicated infrastructure providing transparent access to all this will support Earth Science communities by allowing them to easily and quickly derive objective information and share knowledge based on all environmentally sensitive domains. The use of high-speed networks (GÉANT) and experimentation with new technologies, like BitTorrent, will also contribute to better services for the Earth Science communities. GENESI-DR (Ground European Network for Earth Science Interoperations - Digital Repositories), an ESA-led, European Commission (EC)-funded two-year project, is taking the lead in providing reliable, easy, long-term access to Earth Science data via the Internet. This project will allow scientists from different Earth Science disciplines located across Europe to locate, access, combine, and integrate historical and fresh Earth-related data from space, airborne, and in-situ sensors archived in large distributed repositories. GENESI-DR builds a federated collection of heterogeneous digital Earth Science repositories to establish a dedicated infrastructure providing transparent access to all this, allowing Earth Science communities to easily and quickly derive objective information and share knowledge based on all environmentally sensitive domains. The federated digital repositories, seen as services and data providers, will share access to their resources (catalogue functions, data access, processing services, etc.) 
and will adhere to a common set of standards, policies, and interfaces. The end-users will be provided with a virtual collection of digital Earth Science data, irrespective of their location in the various single federated repositories. GENESI-DR objectives have led to the identification of the basic GENESI-DR infrastructure requirements: • Capability, for Earth Science users, to discover data from different European Earth Science Digital Repositories through the same interface in a transparent and homogeneous way; • Ease and speed of access to large volumes of coherently maintained distributed data in an effective and timely way; • Capability, for DR owners, to easily make their data available to a significantly increased audience with no need to duplicate them in a different storage system. Data discovery is based on a Central Discovery Service, which allows users and applications to easily query information about data collections and products existing in heterogeneous catalogues at federated DR sites. This service can be accessed by users via a web interface, the GENESI-DR Web Portal, or by external applications via open standardized interfaces exposed by the system. The Central Discovery Service identifies the DRs providing products complying with the user search criteria and returns the corresponding access points to the requester. By taking into consideration different and efficient data transfer technologies such as HTTPS, GridFTP, and BitTorrent, the infrastructure provides ease and speed of access. Conversely, for data publishing, GENESI-DR provides several mechanisms to assist DR owners in producing metadata catalogues. In order to reach its objectives, the GENESI-DR e-Infrastructure will be validated against user needs for accessing and sharing Earth Science data. 
Initially, four specific applications in the land, atmosphere, and marine domains have been selected, including: • Near-real-time orthorectification for agricultural crop monitoring • Urban area mapping in support of emergency response • Data assimilation in GlobModel, addressing major environmental and health issues in Europe, with a particular focus on air quality • SeaDataNet, to aid environmental assessments and to forecast the physical state of the oceans in near real time. Other applications will complement these during the second half of the project. GENESI-DR also aims to develop common approaches to preserve the historical archives and the ability to access the derived user information as both software and hardware transformations occur. Ensuring access to Earth Science data for future generations is of utmost importance because it allows for continuity in the generation and improvement of knowledge. For instance, scientists accessing today's climate change data in 50 years will be able to better understand and detect trends in global warming and apply this knowledge to ongoing natural phenomena. GENESI-DR will work towards harmonising operations and applying approved standards, policies, and interfaces at key Earth Science data repositories. To help with this undertaking, GENESI-DR will establish links with the relevant organisations and programmes, such as space agencies, institutional environmental programmes, international Earth Science programmes, and standardisation bodies.
Bytautas, Jessica P; Gheihman, Galina; Dobrow, Mark J
2017-04-01
Quality improvement (QI) is becoming an important focal point for health systems. There is increasing interest among health system stakeholders to learn from and share experiences on the use of QI methods and approaches in their work. Yet there are few easily accessible, online repositories dedicated to documenting QI activity. We conducted a scoping review of publicly available, web-based QI repositories to (i) identify current approaches to sharing information on QI practices; (ii) categorise these approaches based on hosting, scope and size, content acquisition and eligibility, content format and search, and evaluation and engagement characteristics; and (iii) review evaluations of the design, usefulness and impact of their online QI practice repositories. The search strategy consisted of traditional database and grey literature searches, as well as expert consultation, with the ultimate aim of identifying and describing QI repositories of practices undertaken in a healthcare context. We identified 13 QI repositories and found substantial variation across the five categories. The QI repositories used different terminology (eg, practices vs case studies) and approaches to content acquisition, and varied in terms of primary areas of focus. All provided some means for organising content according to categories or themes and most provided at least rudimentary keyword search functionality. Notably, none of the QI repositories included evaluations of their impact. With growing interest in sharing and spreading best practices and increasing reliance on QI as a key contributor to health system performance, the role of QI repositories is likely to expand. Designing future QI repositories based on knowledge of the range and type of features available is an important starting point for improving their usefulness and impact. Published by the BMJ Publishing Group Limited. 
For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Assessing the utility of eDNA as a tool to survey reef-fish communities in the Red Sea
NASA Astrophysics Data System (ADS)
DiBattista, Joseph D.; Coker, Darren J.; Sinclair-Taylor, Tane H.; Stat, Michael; Berumen, Michael L.; Bunce, Michael
2017-12-01
Relatively small volumes of water may contain sufficient environmental DNA (eDNA) to detect target aquatic organisms via genetic sequencing. We therefore assessed the utility of eDNA to document the diversity of coral reef fishes in the central Red Sea. DNA from seawater samples was extracted, amplified using fish-specific 16S mitochondrial DNA primers, and sequenced using a metabarcoding workflow. DNA sequences were assigned to taxa using available genetic repositories or custom genetic databases generated from reference fishes. Our approach revealed a diversity of conspicuous, cryptobenthic, and commercially relevant reef fish at the genus level, with select genera in the family Labridae over-represented. Our approach, however, failed to capture a significant fraction of the fish fauna known to inhabit the Red Sea, which we attribute to limited spatial sampling, amplification stochasticity, and an apparent lack of sequencing depth. Given an increase in fish species descriptions, completeness of taxonomic checklists, and improvement in species-level assignment with custom genetic databases as shown here, we suggest that the Red Sea region may be ideal for further testing of the eDNA approach.
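The taxonomic-assignment step of the metabarcoding workflow described above can be caricatured as a best-hit search of each amplicon read against a reference database, with a minimum-identity threshold below which reads go unassigned. The sketch below is a hedged toy version (the taxa, sequences, and 95% threshold are invented; real pipelines use alignment tools and curated 16S references, not exact positional comparison).

```python
def identity(a, b):
    """Fraction of matching positions between two sequences (toy metric)."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

def assign(read, references, min_identity=0.95):
    """Best-hit taxonomic assignment of a read against a reference database.

    Returns the best-matching taxon, or None if no reference clears the
    minimum-identity threshold (an unassigned read).
    """
    best_taxon, best_score = None, 0.0
    for taxon, ref_seq in references.items():
        score = identity(read, ref_seq)
        if score > best_score:
            best_taxon, best_score = taxon, score
    return best_taxon if best_score >= min_identity else None

# Invented 16S-like reference fragments for two reef-fish taxa.
refs = {
    "Labridae: Cheilinus":      "ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGT",
    "Pomacentridae: Dascyllus": "TTGCACGTAGGTACGTACGAACGTACCTACGATTGCACGT",
}
read = "ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGA"  # 1 mismatch vs Cheilinus
print(assign(read, refs))
```

The abstract's observation that custom genetic databases improve species-level assignment corresponds here to enlarging and curating `refs`: a read can only be assigned as precisely as the references it is compared against.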
GDA, a web-based tool for Genomics and Drugs integrated analysis.
Caroli, Jimmy; Sorrentino, Giovanni; Forcato, Mattia; Del Sal, Giannino; Bicciato, Silvio
2018-05-25
Several major screenings of genetic profiling and drug testing in cancer cell lines proved that the integration of genomic portraits and compound activities is effective in discovering new genetic markers of drug sensitivity and clinically relevant anticancer compounds. Although most genetic and drug response data are publicly available, the availability of user-friendly tools for their integrative analysis remains limited, thus hampering an effective exploitation of this information. Here, we present GDA, a web-based tool for Genomics and Drugs integrated Analysis that combines drug response data for >50 800 compounds with mutations and gene expression profiles across 73 cancer cell lines. Genomic and pharmacological data are integrated through a modular architecture that allows users to identify compounds active towards cancer cell lines bearing a specific genomic background and, conversely, the mutational or transcriptional status of cells responding or not responding to a specific compound. Results are presented through intuitive graphical representations and supplemented with information obtained from public repositories. As both personalized targeted therapies and drug repurposing are gaining increasing attention, GDA represents a resource to formulate hypotheses on the interplay between genomic traits and drug response in cancer. GDA is freely available at http://gda.unimore.it/.
Establishment and operation of a biorepository for molecular epidemiologic studies in Costa Rica.
Cortés, Bernal; Schiffman, Mark; Herrero, Rolando; Hildesheim, Allan; Jiménez, Silvia; Shea, Katheryn; González, Paula; Porras, Carolina; Fallas, Greivin; Rodríguez, Ana Cecilia
2010-04-01
The Proyecto Epidemiológico Guanacaste (PEG) has conducted several large studies related to human papillomavirus (HPV) and cervical cancer in Guanacaste, Costa Rica in a long-standing collaboration with the U.S. National Cancer Institute. To improve molecular epidemiology efforts and save costs, we have gradually transferred technology to Costa Rica, culminating in state-of-the-art laboratories and a biorepository to support a phase III clinical trial investigating the efficacy of HPV 16/18 vaccine. Here, we describe the rationale and lessons learned in transferring molecular epidemiologic and biorepository technology to a developing country. At the outset of the PEG in the early 1990s, we shipped all specimens to repositories and laboratories in the United States, which created multiple problems. Since then, by intensive personal interactions between experts from the United States and Costa Rica, we have successfully transferred liquid-based cytology, HPV DNA testing and serology, chlamydia and gonorrhea testing, PCR-safe tissue processing, and viable cryopreservation. To accommodate the vaccine trial, a state-of-the-art repository opened in mid-2004. Approximately 15,000 to 50,000 samples are housed in the repository on any given day, and >500,000 specimens have been shipped, many using a custom-made dry shipper that permits exporting >20,000 specimens at a time. Quality control of shipments received by the NCI biorepository has revealed an error rate of <0.2%. Recently, the PEG repository has incorporated other activities; for example, large-scale aliquotting and long-term, cost-efficient storage of frozen specimens returned from the United States. Using Internet-based specimen tracking software has proven to be efficient even across borders. For long-standing collaborations, it makes sense to transfer the molecular epidemiology expertise toward the source of specimens. 
The successes of the PEG molecular epidemiology laboratories and biorepository prove that the physical and informatics infrastructures of a modern biorepository can be transferred to a resource-limited and weather-challenged region. Technology transfer is an important and feasible goal of international collaborations.
Smith, B. Eugene; Johnston, Mark K.; Lücking, Robert
2016-01-01
Accuracy of taxonomic identifications is crucial to data quality in online repositories of species occurrence data, such as the Global Biodiversity Information Facility (GBIF), which have accumulated several hundred million records over the past 15 years. These data serve as the basis for large-scale analyses of macroecological and biogeographic patterns and to document environmental changes over time. However, taxonomic identifications are often unreliable, especially for non-vascular plants and fungi including lichens, which may lack critical revisions of voucher specimens. Due to the scale of the problem, restudy of millions of collections is unrealistic and other strategies are needed. Here we propose to use verified, georeferenced occurrence data of a given species to apply predictive niche modeling that can then be used to evaluate unverified occurrences of that species. Selecting the charismatic lichen fungus, Usnea longissima, as a case study, we used georeferenced occurrence records based on sequenced specimens to model its predicted niche. Our results suggest that the target species is largely restricted to a narrow range of boreal and temperate forest in the Northern Hemisphere and that occurrence records in GBIF from tropical regions and the Southern Hemisphere do not represent this taxon, a prediction tested by comparison with taxonomic revisions of Usnea for these regions. As a novel approach, we employed Principal Component Analysis on the environmental grid data used for predictive modeling to visualize potential ecogeographical barriers for the target species; we found that tropical regions form a strong barrier, explaining why potential niches in the Southern Hemisphere were not colonized by Usnea longissima and instead by morphologically similar species. 
This approach is an example of how data from two of the most important biodiversity repositories, GenBank and GBIF, can be effectively combined to remotely address the problem of inaccuracy of taxonomic identifications in occurrence data repositories and to provide a filtering mechanism which can considerably reduce the number of voucher specimens that need critical revision, in this case from 4,672 to about 100. PMID:26967999
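The PCA step described above, run on environmental grid data, can be sketched as follows. This is a hedged, synthetic illustration (not the study's grids or variables): rows are grid cells, columns are standardized climate variables, and a wide gap between clusters of cells in principal-component space stands in for an ecogeographical barrier.

```python
import numpy as np

def pca_project(X, n_components=2):
    """Standardize the columns of X and project rows onto the top PCs."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # z-score each variable
    _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[:n_components].T             # scores, shape (n, k)

# Synthetic "grid cells" with three climate variables (e.g., mean temperature,
# seasonality, annual precipitation): a boreal-like and a tropical-like cluster.
rng = np.random.default_rng(1)
boreal = rng.normal([0.0, 5.0, 800.0], [1.0, 1.0, 50.0], (100, 3))
tropical = rng.normal([25.0, 1.0, 2000.0], [1.0, 1.0, 50.0], (100, 3))
scores = pca_project(np.vstack([boreal, tropical]))

# A large separation between the clusters along PC1 marks a climatic gap,
# i.e., a candidate barrier between the two regions.
gap = abs(scores[:100, 0].mean() - scores[100:, 0].mean())
print(f"PC1 separation between clusters: {gap:.1f}")
```

In the study's setting, occupied cells (verified Usnea longissima records) and candidate cells from questionable GBIF records would be projected into the same space, so records falling beyond such a gap can be flagged for critical revision.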
Development of Pflotran Code for Waste Isolation Pilot Plant Performance Assessment
NASA Astrophysics Data System (ADS)
Zeitler, T.; Day, B. A.; Frederick, J.; Hammond, G. E.; Kim, S.; Sarathi, R.; Stein, E.
2017-12-01
The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. There is a current effort to enhance WIPP PA capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Benchmark testing of the individual WIPP-specific process models implemented in PFLOTRAN (e.g., gas generation, chemistry, creep closure, actinide transport, and waste form) has been performed, including results comparisons for PFLOTRAN and existing WIPP PA codes. Additionally, enhancements to the subsurface hydrologic flow model have been made. Repository-scale testing has also been performed for the modified PFLOTRAN code, and detailed results will be presented. Ultimately, improvements to the current computational environment will result in greater detail and flexibility in the repository model due to a move from a two-dimensional calculation grid to a three-dimensional representation. The result of the effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future for use in compliance recertification applications (CRAs) submitted to the EPA. 
Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy.SAND2017-8198A.
Migration of the Gaudi and LHCb software repositories from CVS to Subversion
NASA Astrophysics Data System (ADS)
Clemencic, M.; Degaudenzi, H.; LHCb Collaboration
2011-12-01
A common code repository is of primary importance in a distributed development environment such as large HEP experiments. CVS (Concurrent Versions System) has been used in the past years at CERN for the hosting of shared software repositories, among which were the repositories for the Gaudi Framework and the LHCb software projects. Many developers around the world produced alternative systems to share code and revisions among several developers, mainly to overcome the limitations in CVS, and CERN has recently started a new service for code hosting based on the version control system Subversion. The differences between CVS and Subversion and the way the code was organized in Gaudi and LHCb CVS repositories required careful study and planning of the migration. Special care was used to define the organization of the new Subversion repository. To avoid as much as possible disruption in the development cycle, the migration has been gradual with the help of tools developed explicitly to hide the differences between the two systems. The principles guiding the migration steps, the organization of the Subversion repository and the tools developed will be presented, as well as the problems encountered both from the librarian and the user points of view.
ROSA P : The National Transportation Library’s Repository and Open Science Access Portal
DOT National Transportation Integrated Search
2018-01-01
The National Transportation Library (NTL) was founded as an all-digital repository of US DOT research reports, technical publications and data products. NTL's primary public offering is ROSA P, the Repository and Open Science Access Portal. An open...
10 CFR 960.3-3 - Consultation.
Code of Federal Regulations, 2010 CFR
2010-01-01
... ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY..., operation, closure, decommissioning, licensing, or regulation of a repository. Written responses to written... purpose of determining the suitability of such area for the development of a repository, the DOE shall...
10 CFR 60.32 - Conditions of construction authorization.
Code of Federal Regulations, 2010 CFR
2010-01-01
... GEOLOGIC REPOSITORIES Licenses Construction Authorization § 60.32 Conditions of construction authorization... changes to the features of the geologic repository and the procedures authorized. The restrictions that... setting as well as measures related to the design and construction of the geologic repository operations...
An Automated Acquisition System for Media Exploitation
2008-06-01
on the acquisition station, AcqMan will pull out the SHA256 image hash, and the device’s model, serial number, and manufacturer. 2. Query the ADOMEX...Repository Using the data collected above, AcqMan will query the ADOMEX repository. The ADOMEX repository will respond to the query with the SHA256s of...whose SHA256s do not match. The last category will be a list of images that the ADOMEX repository already has and that the acquisition station can
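The hash-based triage described in this excerpt can be sketched as follows. This is an illustrative reconstruction, not AcqMan's actual code; the function and variable names are hypothetical, and only the general idea (classify acquired images by comparing SHA-256 digests against a repository's known set) comes from the text.

```python
import hashlib

def triage(images: dict, repo_hashes: set):
    """Split acquired images into new vs. already-archived, by SHA-256.

    images:      mapping of image name -> raw bytes (illustrative)
    repo_hashes: hex digests the repository already holds
    Returns (new, duplicates) as lists of image names.
    """
    new, dupes = [], []
    for name, data in images.items():
        digest = hashlib.sha256(data).hexdigest()
        (dupes if digest in repo_hashes else new).append(name)
    return new, dupes

# Demo with synthetic data: the repository already holds image "a",
# so only image "b" needs to be transferred.
repo = {hashlib.sha256(b"image-a-bytes").hexdigest()}
new_items, dupes = triage({"a": b"image-a-bytes", "b": b"image-b-bytes"}, repo)
```

Comparing digests rather than payloads is what lets the acquisition station skip re-transferring images the repository already has.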
Case for retrievable high-level nuclear waste disposal
Roseboom, Eugene H.
1994-01-01
Plans for the nation's first high-level nuclear waste repository have called for permanently closing and sealing the repository soon after it is filled. However, the hydrologic environment of the proposed site at Yucca Mountain, Nevada, should allow the repository to be kept open and the waste retrievable indefinitely. This would allow direct monitoring of the repository and maintain the options for future generations to improve upon the disposal methods or use the uranium in the spent fuel as an energy resource.
Trustworthy Digital Repositories: Building Trust the Old Fashion Way, EARNING IT.
NASA Astrophysics Data System (ADS)
Kinkade, D.; Chandler, C. L.; Shepherd, A.; Rauch, S.; Groman, R. C.; Wiebe, P. H.; Glover, D. M.; Allison, M. D.; Copley, N. J.; Ake, H.; York, A.
2016-12-01
There are several drivers increasing the importance of high quality data management and curation in today's research process (e.g., OSTP PARR memo, journal publishers, funders, academic and private institutions), and proper management is necessary throughout the data lifecycle to enable reuse and reproducibility of results. Many digital data repositories are capable of satisfying the basic management needs of an investigator looking to share their data (i.e., publish data in the public domain), but repository services vary greatly and not all provide mature services that facilitate discovery, access, and reuse of research data. Domain-specific repositories play a vital role in the data curation process by working closely with investigators to create robust metadata, perform first order QC, and assemble and publish research data. In addition, they may employ technologies and services that enable increased discovery, access, and long-term archive. However, smaller domain facilities operate in varying states of capacity and curation ability. Within this repository environment, individual investigators (driven by publishers, funders, or institutions) need to find trustworthy repositories for their data; and funders need to direct investigators to quality repositories to ensure return on their investment. So, how can one determine the best home for valuable research data? Metrics can be applied to varying aspects of data curation, and many credentialing organizations offer services that assess and certify the trustworthiness of a given data management facility. Unfortunately, many of these certifications can be inaccessible to a small repository in cost, time, or scope. Are there alternatives? This presentation will discuss methods and approaches used by the Biological and Chemical Oceanography Data Management Office (BCO-DMO; a domain-specific, intermediate digital data repository) to demonstrate trustworthiness in the face of a daunting accreditation landscape.
Data Services and Transnational Access for European Geosciences Multi-Scale Laboratories
NASA Astrophysics Data System (ADS)
Funiciello, Francesca; Rosenau, Matthias; Sagnotti, Leonardo; Scarlato, Piergiorgio; Tesei, Telemaco; Trippanera, Daniele; Spires, Chris; Drury, Martyn; Kan-Parker, Mirjam; Lange, Otto; Willingshofer, Ernst
2016-04-01
The EC policy for research in the new millennium supports the development of European-scale research infrastructures. In this perspective, the existing research infrastructures are to be integrated with the objective of increasing their accessibility and enhancing the usability of their multidisciplinary data. Building up integrated Earth Sciences infrastructures in Europe is the mission of the Implementation Phase (IP) of the European Plate Observing System (EPOS) project (2015-2019). The integration of European multiscale laboratories - analytical, experimental petrology and volcanology, magnetic and analogue laboratories - plays a key role in this context and represents a specific task of EPOS IP. Within EPOS IP work package 16 (WP16), the European geosciences multiscale laboratories are to be linked, merging local infrastructures into a coherent and collaborative network. In particular, EPOS IP WP16-task 4 "Data services" aims to standardize data and data products, both existing and newly produced by the participating laboratories, and to make them available through a new digital platform.
The following data and repositories have been selected for the purpose: 1) analytical and properties data a) on volcanic ash from explosive eruptions, of interest to the aviation industry, meteorological and government institutes, b) on magmas in the context of eruption and lava flow hazard evaluation, and c) on rock systems of key importance in mineral exploration and mining operations; 2) experimental data describing: a) rock and fault properties of importance for modelling and forecasting natural and induced subsidence, seismicity and associated hazards, b) rock and fault properties relevant for modelling the containment capacity of rock systems for CO2, energy sources and wastes, c) crustal and upper mantle rheology as needed for modelling sedimentary basin formation and crustal stress distributions, d) the composition, porosity, permeability, and frackability of reservoir rocks of interest in relation to unconventional resources and geothermal energy; 3) repository of analogue models on tectonic processes, from the plate to the reservoir scale, relevant to the understanding of Earth dynamics, geo-hazards and geo-energy; 4) paleomagnetic data, which are crucial a) for understanding the evolution of sedimentary basins and associated resources, and b) for charting geo-hazard frequency. EPOS IP WP16-task 5 aims to create mechanisms and procedures for easy trans-national access to multiscale laboratory facilities. Moreover, the same task will coordinate all the activities in a pilot phase to test, validate and consolidate the aforementioned services and to provide a proof of concept for what will be offered beyond the completion of the EPOS IP.
10 CFR 60.131 - General design criteria for the geologic repository operations area.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., systems, and components important to safety shall be designed to withstand dynamic effects such as missile... radioactivity areas; and (6) A radiation alarm system to warn of significant increases in radiation levels... system shall be designed with provisions for calibration and for testing its operability. (b) Protection...
USDA-ARS?s Scientific Manuscript database
The elemental content of a soybean seed is determined by both genetic and environmental factors and is an important component of its nutritional value. The elemental content is stable, making the samples stored in germplasm repositories an intriguing source of experimental material. To test the ef...
Unified Database Development Program. Final Report.
ERIC Educational Resources Information Center
Thomas, Everett L., Jr.; Deem, Robert N.
The objective of the unified database (UDB) program was to develop an automated information system that would be useful in the design, development, testing, and support of new Air Force aircraft weapon systems. Primary emphasis was on the development of: (1) a historical logistics data repository system to provide convenient and timely access to…
ERIC Educational Resources Information Center
Park, Sanghoon; McLeod, Kenneth
2018-01-01
Open Educational Resources (OER) can offer educators the necessary flexibility for tailoring educational resources to better fit their educational goals. Although the number of OER repositories is growing fast, few studies have been conducted to empirically test the effectiveness of OER integration in the classroom. Furthermore, very little is…
Pretest characterization of WIPP experimental waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, J.; Davis, H.; Drez, P.E.
The Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, is an underground repository designed for the storage and disposal of transuranic (TRU) wastes from US Department of Energy (DOE) facilities across the country. The Performance Assessment (PA) studies for WIPP address compliance of the repository with applicable regulations, and include full-scale experiments to be performed at the WIPP site. These experiments are the bin-scale and alcove tests to be conducted by Sandia National Laboratories (SNL). Prior to conducting these experiments, the waste to be used in these tests needs to be characterized to provide data on the initial conditions for these experiments. This characterization is referred to as the Pretest Characterization of WIPP Experimental Waste, and is also expected to provide input to other programmatic efforts related to waste characterization. The purpose of this paper is to describe the pretest waste characterization activities currently in progress for the WIPP bin-scale waste, and to discuss the program plan and specific analytical protocols being developed for this characterization. The relationship between different programs and documents related to waste characterization efforts is also highlighted in this paper.
Thermo-hydrological and chemical (THC) modeling to support Field Test Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stauffer, Philip H.; Jordan, Amy B.; Harp, Dylan Robert
This report summarizes ongoing efforts to simulate coupled thermal-hydrological-chemical (THC) processes occurring within a hypothetical high-level waste (HLW) repository in bedded salt. The report includes work completed since the last project deliverable, “Coupled model for heat and water transport in a high level waste repository in salt”, a Level 2 milestone submitted to DOE in September 2013 (Stauffer et al., 2013). Since the last deliverable, there have been code updates to improve the integration of the salt module with the pre-existing code and development of quality assurance (QA) tests of constitutive functions and precipitation/dissolution reactions. Simulations of bench-scale experiments, both historical and currently in the planning stages, have been performed. Additional simulations have also been performed on the drift-scale model that incorporate new processes, such as an evaporation function to estimate water vapor removal from the crushed salt backfill and isotopic fractionation of water isotopes. Finally, a draft of a journal paper on the importance of clay dehydration on water availability is included as Appendix I.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandt, C.A.; Rickard, W.H. Jr.; Biehert, R.W.
1989-01-01
The Basalt Waste Isolation Project (BWIP) was undertaken to environmentally characterize a portion of the US Department of Energy's Hanford Site in Washington State as a potential host for the nation's first mined commercial nuclear waste repository. Studies were terminated by Congress in 1987. Between 1976 and 1987, 72 areas located across the Hanford Site were disturbed by the BWIP. These areas include borehole pads, a large Exploratory Shaft Facility, and the Near Surface Test Facility. Most boreholes were cleared of vegetation, leveled, and stabilized with a thick layer of compacted pit-run gravel and sand. The Near Surface Test Facility consists of three mined adits, a rock-spoils bench, and numerous support facilities. Restoration began in 1988 with the objective of returning sites to pre-existing conditions using native species. The Hanford Site retains some of the last remnants of the shrub-steppe ecosystem in Washington. The primary constraints to restoring native vegetation at Hanford are low precipitation and the presence of cheatgrass, an extremely capable alien competitor. 5 figs.
NASA Astrophysics Data System (ADS)
Altini, V.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Divià, R.; Fuchs, U.; Makhlyueva, I.; Roukoutakis, F.; Schossmaier, K.; Soòs, C.; Vande Vyvre, P.; Von Haller, B.; ALICE Collaboration
2010-04-01
All major experiments need tools that provide a way to keep a record of the events and activities, both during commissioning and operations. In ALICE (A Large Ion Collider Experiment) at CERN, this task is performed by the Alice Electronic Logbook (eLogbook), a custom-made application developed and maintained by the Data-Acquisition group (DAQ). Started as a statistics repository, the eLogbook has evolved to become not only a fully functional electronic logbook, but also a massive information repository used to store the conditions and statistics of the several online systems. It's currently used by more than 600 users in 30 different countries and it plays an important role in the daily ALICE collaboration activities. This paper will describe the LAMP (Linux, Apache, MySQL and PHP) based architecture of the eLogbook, the database schema and the relevance of the information stored in the eLogbook to the different ALICE actors, not only for near real time procedures but also for long term data-mining and analysis. It will also present the web interface, including the different used technologies, the implemented security measures and the current main features. Finally it will present the roadmap for the future, including a migration to the web 2.0 paradigm, the handling of the database ever-increasing data volume and the deployment of data-mining tools.
Cieslewicz, Artur; Dutkiewicz, Jakub; Jedrzejek, Czeslaw
2018-01-01
Information retrieval from biomedical repositories has become a challenging task because of their increasing size and complexity. To facilitate the research aimed at improving the search for relevant documents, various information retrieval challenges have been launched. In this article, we present the improved medical information retrieval systems designed by Poznan University of Technology and Poznan University of Medical Sciences as a contribution to the bioCADDIE 2016 challenge—a task focusing on information retrieval from a collection of 794 992 datasets generated from 20 biomedical repositories. The system developed by our team utilizes the Terrier 4.2 search platform enhanced by a query expansion method using word embeddings. This approach, after post-challenge modifications and improvements (with particular regard to assigning proper weights for original and expanded terms), allowed us to achieve the second best infNDCG measure (0.4539) compared with the challenge results, and an infAP of 0.3978. This demonstrates that proper utilization of word embeddings can be a valuable addition to the information retrieval process. Some analysis is provided on related work involving other bioCADDIE contributions. We discuss the possibility of improving our results by using better word embedding schemes to find candidates for query expansion. Database URL: https://biocaddie.org/benchmark-data PMID:29688372
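The weighted query expansion the abstract describes can be sketched as follows. This is a toy illustration, not the team's Terrier plugin: the tiny two-dimensional vectors stand in for trained word embeddings, and the function and weight values are hypothetical.

```python
import numpy as np

def expand_query(query_terms, embeddings, k=1, weight=0.5):
    """Expand a query with the k nearest embedding neighbors of each term
    (cosine similarity), giving expansion terms a lower weight than the
    originals -- the weighting idea the abstract highlights as important."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    weighted = {t: 1.0 for t in query_terms}  # original terms keep full weight
    for term in query_terms:
        if term not in embeddings:
            continue
        sims = sorted(
            ((cos(embeddings[term], v), w) for w, v in embeddings.items() if w != term),
            reverse=True,
        )
        for _, neighbor in sims[:k]:
            weighted.setdefault(neighbor, weight)
    return weighted

# Toy embedding space: "tumor" lies close to "cancer", "protein" does not.
emb = {
    "cancer": np.array([1.0, 0.1]),
    "tumor": np.array([0.95, 0.2]),
    "protein": np.array([0.1, 1.0]),
}
expanded = expand_query(["cancer"], emb, k=1)
```

The resulting term-weight map would then be fed to the search platform's query scorer; down-weighting expansion terms keeps them from drowning out the user's original query.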
Bialecki, Brian; Park, James; Tilkin, Mike
2016-08-01
The intent of this project was to use object storage and its database, which has the ability to add custom extensible metadata to an imaging object being stored within the system, to harness the power of its search capabilities, and to close the technology gap that healthcare faces. This creates a non-disruptive tool that can be used natively by both legacy systems and the healthcare systems of today which leverage more advanced storage technologies. The base infrastructure can be populated alongside current workflows without any interruption to the delivery of services. In certain use cases, this technology can be seen as a true alternative to the VNA (Vendor Neutral Archive) systems implemented by healthcare today. The scalability, security, and ability to process complex objects makes this more than just storage for image data and a commodity to be consumed by PACS (Picture Archiving and Communication System) and workstations. Object storage is a smart technology that can be leveraged to create vendor independence, standards compliance, and a data repository that can be mined for truly relevant content by adding additional context to search capabilities. This functionality can lead to efficiencies in workflow and a wealth of minable data to improve outcomes into the future.
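The core idea above, attaching custom extensible metadata to stored objects and searching on it without reading the payloads, can be sketched in a few lines. This is an illustrative in-memory stand-in, not any vendor's object-storage API; all class and field names are hypothetical.

```python
class ObjectStore:
    """Minimal sketch of object storage with extensible custom metadata.

    Real object stores expose this through their own query services;
    here a dict suffices to show the search-by-metadata pattern."""

    def __init__(self):
        self._objects = {}  # object_id -> (payload, metadata dict)

    def put(self, object_id: str, payload: bytes, **metadata):
        """Store a payload with arbitrary key/value metadata."""
        self._objects[object_id] = (payload, metadata)

    def search(self, **criteria):
        """Return ids of objects whose metadata matches all criteria,
        without ever touching the stored payloads."""
        return [
            oid for oid, (_, meta) in self._objects.items()
            if all(meta.get(k) == v for k, v in criteria.items())
        ]

store = ObjectStore()
store.put("img-1", b"...", modality="CT", body_part="chest")
store.put("img-2", b"...", modality="MR", body_part="chest")
matches = store.search(modality="CT")
```

Because the metadata is schema-free, new context (study descriptions, outcomes, device details) can be added over time, which is what makes the repository minable for "truly relevant content" as the abstract puts it.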
BioSurfDB: knowledge and algorithms to support biosurfactants and biodegradation studies
Oliveira, Jorge S.; Araújo, Wydemberg; Lopes Sales, Ana Isabela; de Brito Guerra, Alaine; da Silva Araújo, Sinara Carla; de Vasconcelos, Ana Tereza Ribeiro; Agnez-Lima, Lucymara F.; Freitas, Ana Teresa
2015-01-01
Crude oil extraction, transportation and use provoke the contamination of countless ecosystems. Therefore, bioremediation through surfactants mobilization or biodegradation is an important subject, both economically and environmentally. Bioremediation research had a great boost with the recent advances in Metagenomics, as it enabled the sequencing of uncultured microorganisms providing new insights on surfactant-producing and/or oil-degrading bacteria. Many research studies are making available genomic data from unknown organisms obtained from metagenomics analysis of oil-contaminated environmental samples. These new datasets are presently demanding the development of new tools and data repositories tailored for the biological analysis in a context of bioremediation data analysis. This work presents BioSurfDB, www.biosurfdb.org, a curated relational information system integrating data from: (i) metagenomes; (ii) organisms; (iii) biodegradation relevant genes; proteins and their metabolic pathways; (iv) bioremediation experiments results, with specific pollutants treatment efficiencies by surfactant producing organisms; and (v) a biosurfactant-curated list, grouped by producing organism, surfactant name, class and reference. The main goal of this repository is to gather information on the characterization of biological compounds and mechanisms involved in biosurfactant production and/or biodegradation and make it available in a curated way and associated with a number of computational tools to support studies of genomic and metagenomic data. Database URL: www.biosurfdb.org PMID:25833955
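A curated relational system like the one described can be sketched with two of the entity types the abstract lists (organisms and surfactants grouped by producing organism, name, and class). The table and column names below are illustrative, not BioSurfDB's actual schema; the rhamnolipid/Pseudomonas example row is a well-known pairing used only as sample data.

```python
import sqlite3

# In-memory relational sketch: surfactants reference their producing organism.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE organism (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE surfactant (
        id INTEGER PRIMARY KEY,
        name TEXT,
        class TEXT,
        organism_id INTEGER REFERENCES organism(id)
    );
""")
conn.execute("INSERT INTO organism VALUES (1, 'Pseudomonas aeruginosa')")
conn.execute("INSERT INTO surfactant VALUES (1, 'rhamnolipid', 'glycolipid', 1)")

# The "grouped by producing organism" view is a simple join:
rows = conn.execute("""
    SELECT s.name, s.class, o.name
    FROM surfactant s JOIN organism o ON s.organism_id = o.id
""").fetchall()
```

The same pattern extends to the other linked entities (metagenomes, genes, pathways, experiment results), each as a table keyed back to the records it annotates.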
10 CFR 960.4 - Postclosure guidelines.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Postclosure Guidelines § 960.4 Postclosure guidelines. The guidelines in this subpart specify the factors to be considered in evaluating and comparing sites on the basis of expected repository performance... NRC and EPA regulations. These requirements must be met by the repository system, which contains...
10 CFR 60.3 - License required.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES General... byproduct material at a geologic repository operations area except as authorized by a license issued by the Commission pursuant to this part. (b) DOE shall not commence construction of a geologic repository operations...
Asset Reuse of Images from a Repository
ERIC Educational Resources Information Center
Herman, Deirdre
2014-01-01
According to Markus's theory of reuse, when digital repositories are deployed to collect and distribute organizational assets, they supposedly help ensure accountability, extend information exchange, and improve productivity. Such repositories require a large investment due to the continuing costs of hardware, software, user licenses, training,…
10 CFR 60.17 - Contents of site characterization plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... GEOLOGIC REPOSITORIES Licenses Preapplication Review § 60.17 Contents of site characterization plan. The... construction authorization for a geologic repository operations area; (4) Criteria, developed pursuant to... area for the location of a geologic repository; and (5) Any other information which the Commission, by...
10 CFR 960.4-2-2 - Geochemistry.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REPOSITORY Postclosure Guidelines § 960.4-2-2 Geochemistry. (a) Qualifying condition. The present and... future, not affect or would favorably affect the ability of the geologic repository to isolate the waste... subjected to expected repository conditions, would remain unaltered or would alter to mineral assemblages...
17 CFR 49.3 - Procedures for registration.
Code of Federal Regulations, 2014 CFR
2014-04-01
... if the Commission finds that such swap data repository is appropriately organized, and has the...) SWAP DATA REPOSITORIES § 49.3 Procedures for registration. (a) Application procedures. (1) An applicant, person or entity desiring to be registered as a swap data repository shall file electronically an...
The Coalition for Publishing Data in the Earth and Space Sciences
NASA Astrophysics Data System (ADS)
Lehnert, Kerstin; Hanson, Brooks; Cutcher-Gershenfeld, Joel
2015-04-01
Scholarly publishing remains a key high-value point in making data available and will for the foreseeable future be tied to the availability of science data. Data need to be included in or released as part of publications to make the science presented in an article reproducible, and most publishers have statements related to the inclusion of data, recognizing that such release enhances the value and is part of the integrity of the research. Unfortunately, practices for reporting and documenting data in the scientific literature are inconsistent and inadequate, and the vast majority of data submitted along with publications is still in formats and forms of storage that make discovery and reuse difficult or impossible. Leading earth and space science repositories on the other hand are eager and set up to provide persistent homes for these data, and also ensure quality, enhancing their value, access, and reusability. Unfortunately only a small fraction of the data associated with scientific publications makes it to these data facilities. Connecting scholarly publication more firmly with data facilities is essential in meeting the expectations of open, accessible and useful data as aspired by all stakeholders and expressed in position statements, policies, and guidelines. To strengthen these connections, a new initiative was launched in Fall 2014 at a conference that brought together major publishers, data facilities, and consortia in the Earth and space sciences, as well as governmental, association, and foundation funders. The aim of this initiative is to foster consensus and consistency among publishers, editors, funders, and data repositories on how data that are part of scholarly publications should be curated and published, and guide the development of practical resources based on those guidelines that will help authors and publishers support open data policies, facilitate proper data archiving, and support the linking of data to publications. 
The most relevant outcome of the conference is the formation of a working group, the Coalition for Publishing Data in the Earth and Space Sciences, by publishers and data facilities and consortia, which will establish a permanent international coordinating conference on Earth science data publication. Marking the launch of the partnership is a joint statement of commitment (to be released in January 2015), signed by the major Earth and space science publishers and many data facilities, to ensure that Earth science data will, to the greatest extent possible, be stored in community-approved repositories that can provide additional data services. A functional directory of Earth and space science repositories is under development; journals can use it as part of their information to authors, and authors can use it to identify rapidly which repositories are the best homes for specific data types and how to structure such deposition.
The Nevada initiative: A risk communication Fiasco
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flynn, J.; Solvic, P.; Mertz, C.K.
The U.S. Congress has designated Yucca Mountain, Nevada as the only potential site to be studied for the nation's first high-level nuclear waste repository. People in Nevada strongly oppose the program, managed by the U.S. Department of Energy. Survey research shows that the public believes there are great risks from a repository program, in contrast to a majority of scientists who feel the risks are acceptably small. Delays in the repository program resulting in part from public opposition in Nevada have concerned the nuclear power industry, which collects the fees for the federal repository program and believes it needs the repository as a final disposal facility for its high-level nuclear wastes. To assist the repository program, the American Nuclear Energy Council (ANEC), an industry group, sponsored a massive advertising campaign in Nevada. The campaign attempted to assure people that the risks of a repository were small and that the repository studies should proceed. The campaign failed because its managers misunderstood the issues underlying the controversy, attempted a covert manipulation of public opinion that was revealed, and most importantly, lacked the public trust that was necessary to communicate credibly about the risks of a nuclear waste facility. This article describes the advertising campaign and its effects. The manner in which the ANEC campaign itself became a controversial public issue is reviewed. The advertising campaign is discussed as it relates to risk assessment and communication. 29 refs., 2 tabs.
The U.S. Congress has designated Yucca Mountain, Nevada as the only potential site to be studied for the nation`s first high-level nuclear waste repository. People in Nevada strongly oppose the program, managed by the U.S. Department of Energy. Survey research shows that the public believes there are great risks from a repository program, in contrast to a majority of scientists who feel the risks are acceptably small. Delays in the repository program resulting in part from public opposition in Nevada have concerned the nuclear power industry, which collects the fees for the federal repository program and believes it needs themore » repository as a final disposal facility for its high-level nuclear wastes. To assist the repository program, the American Nuclear Energy Council (ANEC), an industry group, sponsored a massive advertising campaign in Nevada. The campaign attempted to assure people that the risks of a repository were small and that the repository studies should proceed. The campaign failed because its managers misunderstood the issues underlying the controversy, attempted a covert manipulation of public opinion that was revealed, and most importantly, lacked the public trust that was necessary to communicate credibly about the risks of a nuclear waste facility. This article describes the advertising campaign and its effects. The manner in which the ANEC campaign itself became a controversial public issue is reviewed. The advertising campaign is discussed as it relates to risk assessment and communication. 29 refs., 2 tabs.« less