Sample records for repository source-term arest

  1. The AREST CF experience in biobanking - More than just tissues, tubes and time.

    PubMed

    Garratt, Luke W; Kicic, Anthony; Robertson, Colin; Ranganathan, Sarath; Sly, Peter D; Stick, Stephen M

    2017-09-01

    Research to further improve outcomes for people with CF is dependent upon well characterised, archived and accessible clinical specimens. The recent article by Beekman et al. published in Journal of Cystic Fibrosis summarised a scientific meeting at the 13th ECFS Basic Science Conference. This meeting discussed how well-annotated, clinical biobanks for CF could be established in Europe to meet the needs of therapeutic development. The Australian Respiratory Early Surveillance Team for Cystic Fibrosis (AREST CF) has conducted biobanking of CF research and clinical specimens since the late 1990s and is custodian of the most comprehensive paediatric CF biobank in the world that focuses on the first years of life. This short communication will describe the approach undertaken by AREST CF in establishing a clinical specimen biobank. Copyright © 2017 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  2. Evolution of cystic fibrosis lung function in the early years.

    PubMed

    Bush, Andrew; Sly, Peter D

    2015-11-01

    Most treatment of newborn screening-diagnosed cystic fibrosis is not evidence-based; there are very few randomized controlled trials (RCTs). Furthermore, the advent of novel molecular therapies, which could be started at diagnosis, mandates performing RCTs in very young children. However, unless the natural history of early cystic fibrosis lung disease is known, RCTs are impossible. Here, we review the results of two large prospective cohorts of these infants: the London Cystic Fibrosis Collaboration (LCFC; London, UK) and the Australian Respiratory Early Surveillance Team for Cystic Fibrosis (AREST-CF; Australia). Nutritional status remained excellent in both cohorts. Both cohorts reported abnormal lung function at age 3 months. AREST-CF, which previously reported rapidly declining preschool lung function, now reports good conventional school-age spirometry. LCFC reported improvement between 3 months and 1 year, and stability in the second year. AREST-CF also reported a high prevalence of high resolution computed tomographic abnormalities related to free neutrophil elastase in bronchoalveolar lavage; LCFC reported high resolution computed tomographic changes at 1 year that were too mild to be scored reproducibly. At least in the first 2 years of life, lung function is not a good end-point for RCTs; routine bronchoalveolar lavage and HRCT cannot be justified. Newborn screening has greatly improved outcomes, but we need better point-of-care biomarkers.

  3. Source term evaluation model for high-level radioactive waste repository with decay chain build-up.

    PubMed

    Chopra, Manish; Sunny, Faby; Oza, R B

    2016-09-18

    A source term model based on a two-component leach flux concept is developed for a high-level radioactive waste repository. The long-lived radionuclides associated with high-level waste may give rise to a build-up of activity because of radioactive decay chains. The ingrowth of progeny is incorporated in the model using the Bateman decay chain build-up equations. The model is applied to different radionuclides present in the high-level radioactive waste that form part of decay chains (the 4n to 4n + 3 series), and the activity of the parent and daughter radionuclides leaching out of the waste matrix is estimated. Two cases are considered: one in which only the parent is initially present in the waste and another in which the daughters are also initially present in the waste matrix. The incorporation of in situ production of daughter radionuclides in the source term is important for realistic estimates. It is shown that the inclusion of decay chain build-up is essential to avoid underestimation in the radiological impact assessment of the repository. The model can be a useful tool for evaluating the source term of the radionuclide transport models used for the radiological impact assessment of high-level radioactive waste repositories.
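
    As a rough illustration of the decay chain build-up this abstract describes, the following Python sketch evaluates the classical Bateman solution for the case where only the parent is initially present. The three-member chain and its decay constants are hypothetical placeholders, not values from the paper.

    ```python
    import numpy as np

    def bateman(t, lam, n_parent0):
        """Atoms of each chain member at time t, with only the parent at t = 0.

        Classical Bateman solution; assumes all decay constants in `lam`
        are distinct (the formula is singular for equal constants).
        """
        n = np.zeros(len(lam))
        for k in range(len(lam)):
            coeff = n_parent0 * np.prod(lam[:k])     # lambda_1 ... lambda_(k-1)
            terms = [np.exp(-lam[i] * t) /
                     np.prod([lam[j] - lam[i] for j in range(k + 1) if j != i])
                     for i in range(k + 1)]
            n[k] = coeff * sum(terms)
        return n

    # Hypothetical 3-member chain (decay constants in 1/yr), 1 mol of parent.
    lam = np.array([1.0e-5, 1.0e-3, 1.0e-2])
    print(bateman(1.0e4, lam, 1.0))        # inventories after 10,000 years
    print(lam * bateman(1.0e4, lam, 1.0))  # corresponding activities
    ```

    Coupling the resulting activities to a leach flux then gives the chain-aware release rates that the paper argues are needed to avoid underestimating the repository's radiological impact.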

  4. Basic repository source term and data sheet report: Lavender Canyon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Lavender Canyon, Utah. 3 refs; 6 tabs.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrada, J.J.

    This report compiles preliminary information that supports the premise that a repository is needed in Latin America and analyzes the nuclear situation (mainly in Argentina and Brazil) in terms of nuclear capabilities, inventories, and regional spent-fuel repositories. The report is based on several sources and summarizes (1) the nuclear capabilities in Latin America and establishes the framework for the need of a permanent repository, (2) the International Atomic Energy Agency (IAEA) approach for a regional spent-fuel repository and describes the support that international institutions are lending to this issue, (3) the current situation in Argentina in order to analyze the Argentinean willingness to find a location for a deep geological repository, and (4) the issues involved in selecting a location for the repository and identifies a potential location. This report then draws conclusions based on an analysis of this information. The focus of this report is mainly on spent fuel and does not elaborate on other radiological waste sources.

  6. An ontology based information system for the management of institutional repository's collections

    NASA Astrophysics Data System (ADS)

    Tsolakidis, A.; Kakoulidis, P.; Skourlas, C.

    2015-02-01

    In this paper we discuss a simple methodological approach to create and customize institutional repositories for the domain of technological education. The use of the open-source software platform DSpace is proposed to build the repository application and provide access to digital resources including research papers, dissertations, administrative documents, educational material, etc. The use of OWL ontologies is also proposed for indexing and accessing the various heterogeneous items stored in the repository. Customization and operation of a platform for the selection and use of terms or parts of similar existing OWL ontologies is also described. This platform could be based on the open-source software Protégé, which supports OWL, is widely used, and also supports visualization, SPARQL, etc. The combined use of the OWL platform and the DSpace repository forms a basis for creating customized ontologies, accommodating the semantic metadata of items and facilitating searching.
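
    As a hedged sketch of the kind of ontology-backed indexing described above (not the authors' actual implementation), the snippet below loads an OWL file with rdflib and runs a SPARQL query over it. The ontology file name is a hypothetical placeholder.

    ```python
    from rdflib import Graph

    # Load a (hypothetical) OWL ontology describing repository items;
    # rdflib guesses the RDF/XML serialization from the .owl extension.
    g = Graph()
    g.parse("edu_repository.owl")

    # List every class defined in the ontology, with its label if present.
    query = """
        SELECT ?cls ?label WHERE {
            ?cls a <http://www.w3.org/2002/07/owl#Class> .
            OPTIONAL { ?cls <http://www.w3.org/2000/01/rdf-schema#label> ?label }
        }
    """
    for cls, label in g.query(query):
        print(cls, label)
    ```

    The same query pattern can drive a faceted index over DSpace items once each item is linked to ontology concepts in its metadata.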

  7. Capture of Aerosols by Iodinated Fiber Media

    DTIC Science & Technology

    2004-09-15

    fibrous media if provided with 70-80% relative humidity and atmospheric dust (Maus et al., 2000). Spore-forming bacteria such as Bacillus anthracis are...States. The anthrax spores sent out during these attacks were classified as being highly concentrated and processed to be disseminated and inhaled...media, and produce more undesirable bioaerosols. This phenomenon has been reported in many studies in heating, ventilation, and air conditioning (HVAC

  8. Southeast Asia Report

    DTIC Science & Technology

    1987-03-10

    Asian zone, the commercial advisory office in Beijing will serve as the zone center. It will be responsible for overseeing things in China, Korea...that remain include the offices in Washington, DC, Geneva, Brussels, Beijing, Jakarta, Tokyo, and London. At the same time, personnel changes are...States fought in Korea and Vietnam because it wanted to limit communism to China and North Vietnam. But after Vietnam fell and the United States withdrew

  9. On Algorithms for Generating Computationally Simple Piecewise Linear Classifiers

    DTIC Science & Technology

    1989-05-01

    suffers. - Waveform classification, e.g. speech recognition, seismic analysis (i.e. discrimination between earthquakes and nuclear explosions), target...assuming Gaussian distributions (B-G) d) Bayes classifier with probability densities estimated with the k-N-N method (B-kNN) e) The nearest neighbour...range of classifiers are chosen including a fast, easily computable and often used classifier (B-G), reliable and complex classifiers (B-kNN and NNR

  10. Long-term retrievability and safeguards for immobilized weapons plutonium in geologic storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, P.F.

    1996-05-01

    If plutonium is not ultimately used as an energy source, the quantity of excess weapons plutonium (w-Pu) that would go into a US repository will be small compared to the quantity of plutonium contained in the commercial spent fuel in the repository, and the US repository(ies) will likely be only one (or two) locations out of many around the world where commercial spent fuel will be stored. Therefore excess weapons plutonium creates a small perturbation to the long-term (over 200,000 yr) global safeguard requirements for spent fuel. There are differences of detail between spent fuel and immobilized w-Pu waste forms (i.e. chemical separation methods, utility for weapons, nuclear testing requirements), but these are sufficiently small to be unlikely to play a significant role in any US political decision to rebuild weapons inventories, or to change the long-term risks of theft by subnational groups.

  11. Translations on Eastern Europe, Political, Sociological, and Military Affairs. Number 1309

    DTIC Science & Technology

    1976-10-19

    lowered, work with the localities was shirked, no efforts were made to study and resolve with them the tasks stemming from the party's..." or "Centrum" department stores the highlights of their visit to the city. "Of Course Everything Is Much Better in Berlin" Food and clothing are...station restrooms, the lack of apples at times when there perhaps are bananas for once or the perpetual "second choice" of all foods for the consumer

  12. Quantum Tomography via Compressed Sensing: Error Bounds, Sample Complexity and Efficient Estimators (Open Access, Publisher’s Version)

    DTIC Science & Technology

    2012-09-27

    we require no entangling gates or ancillary systems for the procedure. In contrast with [19], our method is not restricted to processes that are...states, such as those recently developed for use with permutation-invariant states [60], matrix product states [61] or multi-scale entangled states [62...by adjoining an ancilla, preparing the maximally entangled state |ψ0〉, and applying E); then do compressed quantum state tomography on ρE; see

  13. Research on Geo-information Data Model for Preselected Areas of Geological Disposal of High-level Radioactive Waste

    NASA Astrophysics Data System (ADS)

    Gao, M.; Huang, S. T.; Wang, P.; Zhao, Y. A.; Wang, H. B.

    2016-11-01

    The geological disposal of high-level radioactive waste (hereinafter "geological disposal") is a long-term, complex, and systematic scientific project. The data and information resources produced during its research and development (R&D) provide significant support for the R&D of the geological disposal system and lay a foundation for the long-term stability and safety assessment of the repository site. However, the data related to research and engineering in the siting of geological disposal repositories are complicated (multi-source, multi-dimensional and changeable), and the requirements for data accuracy and comprehensive application have become much higher than before, so the design of data models for the repository geo-information database faces serious challenges. In this paper, the data resources of the pre-selected areas of the repository are comprehensively surveyed and systematically analyzed. Based on a deep understanding of the application requirements, the research work provides a solution for the key technical problems, including a reasonable classification system for multi-source data entities, complex logical relations, and effective physical storage structures. The new solution moves beyond the data classification and conventional spatial data organization models applied in traditional industries, and realizes data organization and integration with data entities and spatial relationships as the basic units, which are independent, complete, and of significant application value in HLW geological disposal. Reasonable, feasible and flexible conceptual, logical and physical data models have been established so as to ensure the effective integration, and to facilitate the application development, of multi-source data in pre-selected areas for geological disposal.

  14. 36 CFR 79.9 - Standards to determine when a repository possesses the capability to provide adequate long-term...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... repository possesses the capability to provide adequate long-term curatorial services. 79.9 Section 79.9... FEDERALLY-OWNED AND ADMINISTERED ARCHAEOLOGICAL COLLECTIONS § 79.9 Standards to determine when a repository... shall determine that a repository has the capability to provide adequate long-term curatorial services...

  15. Implementing the EuroFIR Document and Data Repositories as accessible resources of food composition information.

    PubMed

    Unwin, Ian; Jansen-van der Vliet, Martine; Westenbrink, Susanne; Presser, Karl; Infanger, Esther; Porubska, Janka; Roe, Mark; Finglas, Paul

    2016-02-15

    The EuroFIR Document and Data Repositories are being developed as accessible collections of source documents, including grey literature, and the food composition data reported in them. These Repositories will contain source information available to food composition database compilers when selecting their nutritional data. The Document Repository was implemented as searchable bibliographic records in the Europe PubMed Central database, which links to the documents online. The Data Repository will contain original data from source documents in the Document Repository. Testing confirmed the FoodCASE food database management system as a suitable tool for the input, documentation and quality assessment of Data Repository information. Data management requirements for the input and documentation of reported analytical results were established, including record identification and method documentation specifications. Document access and data preparation using the Repositories will provide information resources for compilers, eliminating duplicated work and supporting unambiguous referencing of data contributing to their compiled data. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Institutional Repositories as Infrastructures for Long-Term Preservation

    ERIC Educational Resources Information Center

    Francke, Helena; Gamalielsson, Jonas; Lundell, Björn

    2017-01-01

    Introduction: The study describes the conditions for long-term preservation of the content of the institutional repositories of Swedish higher education institutions based on an investigation of how deposited files are managed with regards to file format and how representatives of the repositories describe the functions of the repositories.…

  17. XRD Analysis of Cement Paste Samples Exposed to the Simulated Environment of a Deep Repository - 12239

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira, Eduardo G.A.; Marumo, Julio T.; Vicente, Roberto

    2012-07-01

    Portland cement materials are widely used as engineered barriers in repositories for radioactive waste. The capacity of such barriers to prevent disposed radionuclides from entering the biosphere in the long term depends on the service life of those materials. Thus, the performance assessment of structural materials under the environmental conditions prevailing in the environs of repositories is a matter of interest. The durability of the cement paste foreseen as backfill in a deep borehole for disposal of disused sealed radioactive sources is investigated as part of the development of the repository concept. Results are intended to be part of the body of evidence in the safety case of the proposed disposal technology. This paper presents the results of X-Ray Diffraction (XRD) analysis of cement paste exposed to varying temperatures and simulated groundwater after the samples had received the radiation dose that the cement paste will accumulate until complete decay of the radioactive sources. The XRD analysis performed in this work allowed some differences to be observed among cement paste specimens submitted to different treatments. Cluster analysis of the results was able to group tested samples according to the applied treatments. Mineralogical differences, however, are tenuous and, apart from ettringite, are hardly observed. The absence of ettringite in the seven specimens that were kept in dry storage at high temperature can hardly be due to natural variations in the composition of hydrated cement paste, because ettringite is observed in all tested specimens except those seven. This absence is therefore almost certainly the result of the treatments and could be explained by the decomposition of ettringite. Although the temperature of decomposition is about 110-120 deg. C, ettringite may initially decompose to meta-ettringite, an amorphous compound, above 50 deg. C in the absence of water. No influence of irradiation on the mineralogical composition was observed, whether the treatment was analyzed individually or in possible synergy with the other treatments. However, the radiation dose to which the specimens were exposed is only a fraction of the dose the cement paste will accumulate until complete decay of some sources. Therefore, in the short term, the conditions deemed to prevail in the repository environment may not influence the properties of cement paste at detectable levels. Under the conditions presented in this work, it is not possible to predict the long-term evolution of these properties. (authors)

  18. 76 FR 53454 - Privacy Act System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-26

    ... statutory responsibilities of the OIG; and Acting as a repository and source for information necessary to... in matters relating to the statutory responsibilities of the OIG; and 7. Acting as a repository and.... Acting as a repository and source for information necessary to fulfill the reporting requirements of the...

  19. Modelling of processes occurring in deep geological repository - Development of new modules in the GoldSim environment

    NASA Astrophysics Data System (ADS)

    Vopálka, D.; Lukin, D.; Vokál, A.

    2006-01-01

    Three new modules modelling the processes that occur in a deep geological repository have been prepared in the GoldSim computer code environment (using its Transport Module). These modules help to understand the role of selected parameters in the near-field region of the final repository and to prepare one's own comprehensive model of repository behaviour. The source term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer controlled by an output boundary condition formulated with respect to the rate of water flow in the rock. The corrosion module describes corrosion of canisters made of carbon steel and the transport of corrosion products in the near-field region. This module computes balance equations between dissolving species and species transported by diffusion and/or advection from the surface of a solid material. The diffusion module, which also includes a non-linear form of the interaction isotherm, can be used for the evaluation of small-scale diffusion experiments.
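
    To make the source-term logic concrete, here is a minimal forward-Euler sketch of a single nuclide in such a module: first-order matrix degradation, radioactive decay, a solubility cap on the canister-water concentration, and a diffusive flux into the bentonite. All parameter values are hypothetical placeholders; the actual modules handle full decay chains and the GoldSim Transport Module couplings.

    ```python
    # Minimal single-nuclide near-field source term (hypothetical parameters).
    LAM   = 2.9e-5   # 1/yr   decay constant
    K_DEG = 1.0e-7   # 1/yr   first-order fuel-matrix degradation rate
    C_SOL = 1.0e-6   # mol/m^3  solubility limit in canister water
    V_W   = 0.5      # m^3    water volume inside the failed canister
    D_EFF, AREA, THICK = 3.0e-3, 10.0, 0.35  # m^2/yr, m^2, m (bentonite layer)

    n_matrix, n_water, dt = 1.0, 0.0, 10.0   # mol, mol, yr
    for _ in range(100_000):                 # 1 Myr in 10-yr steps
        release = K_DEG * n_matrix * dt      # congruent release from the matrix
        n_matrix -= release + LAM * n_matrix * dt
        n_water  += release - LAM * n_water * dt
        conc = min(n_water / V_W, C_SOL)     # solubility-limited concentration
        flux = D_EFF * AREA * conc / THICK   # Fick's law, zero far-side conc.
        n_water = max(n_water - flux * dt, 0.0)
    print(f"matrix: {n_matrix:.3e} mol, water: {n_water:.3e} mol")
    ```

    The flux term is where a more realistic output boundary condition, tied to the rate of water flow in the rock as the abstract describes, would replace the zero-concentration assumption.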

  20. The role of the Jotello F. Soga Library in the digital preservation of South African veterinary history.

    PubMed

    Breytenbach, Amelia; Lourens, Antoinette; Marsh, Susan

    2013-04-26

    The history of veterinary science in South Africa can only be appreciated, studied, researched and passed on to coming generations if historical sources are readily available. In most countries, material and sources with historical value are often difficult to locate, dispersed over a large area and not part of the conventional book and journal literature. The Faculty of Veterinary Science of the University of Pretoria and its library have access to a large collection of historical sources. The collection consists of photographs, photographic slides, documents, proceedings, posters, audio-visual material, postcards and other memorabilia. Other institutions in the country are also approached if relevant sources are identified in their collections. The University of Pretoria's institutional repository, UPSpace, was launched in 2006. This provided the Jotello F. Soga Library with the opportunity to fill the repository with relevant digitised collections of diverse heritage and learning resources that can contribute to the long-term preservation and accessibility of historical veterinary sources. These collections are available for use not only by historians and researchers in South Africa but also elsewhere in Africa and the rest of the world. Important historical collections such as the Arnold Theiler collection, the Jotello F. Soga collection and collections of the Onderstepoort Journal of Veterinary Research and the Journal of the South African Veterinary Association are highlighted. The benefits of an open access digital repository, the importance of collaboration across the veterinary community, other prerequisites for the sustainability of a digitisation project, and the importance of metadata to enhance accessibility are also covered.

  1. Probabilistic estimation of long-term volcanic hazard under evolving tectonic conditions in a 1 Ma timeframe

    NASA Astrophysics Data System (ADS)

    Jaquet, O.; Lantuéjoul, C.; Goto, J.

    2017-10-01

    Risk assessments in relation to the siting of potential deep geological repositories for radioactive wastes demand the estimation of long-term tectonic hazards such as volcanicity and rock deformation. Owing to their tectonic situation, such evaluations concern many industrial regions around the world. For sites near volcanically active regions, a prevailing source of uncertainty is related to volcanic hazard. For specific situations, in particular in relation to geological repository siting, the requirements for the assessment of volcanic and tectonic hazards have to be expanded to 1 million years. At such time scales, tectonic changes are likely to influence volcanic hazard and therefore a particular stochastic model needs to be developed for the estimation of volcanic hazard. The concepts and theoretical basis of the proposed model are given and a methodological illustration is provided using data from the Tohoku region of Japan.
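
    As an illustration of the general idea (not the authors' model, whose details are in the paper), the probability of at least one volcanic event over a long assessment window under a time-varying Poisson intensity is 1 - exp(-integral of lambda(t) dt). The sketch below evaluates this numerically for a hypothetical intensity that ramps up as tectonic conditions evolve over 1 Ma.

    ```python
    import numpy as np

    # Hypothetical intensity (events/yr): a background rate that doubles
    # linearly over the 1 Ma window as tectonic conditions evolve.
    t = np.linspace(0.0, 1.0e6, 100_001)          # yr
    lam = 1.0e-7 * (1.0 + t / 1.0e6)              # events/yr

    # Trapezoidal integration of the intensity = expected number of events.
    expected = float(np.sum(0.5 * (lam[1:] + lam[:-1]) * np.diff(t)))
    p_at_least_one = 1.0 - np.exp(-expected)
    print(f"E[events] = {expected:.3f}, P(>=1 event) = {p_at_least_one:.3f}")
    ```

    A stochastic model of the kind the abstract describes would additionally condition lambda(t) on spatial data (vent locations, deformation), rather than using a simple prescribed ramp.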

  2. Grid Application Meta-Repository System: Repository Interconnectivity and Cross-domain Application Usage in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen

    Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture, which allows any Grid Application Repository (GAR) to be connected to the system independent of their underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services and other OAI (Open Archive Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VM) and therefore they can be run in their native environments and can easily be deployed on virtualized infrastructures allowing interoperability with new generation technologies such as cloud computing, application-on-demand, automatic service/application deployments and automatic VM generation.

  3. Repositories for Deep, Dark, and Offline Data - Building Grey Literature Repositories and Discoverability

    NASA Astrophysics Data System (ADS)

    Keane, C. M.; Tahirkheli, S.

    2017-12-01

    Data repositories, especially in the geosciences, have been focused on the management of large quantities of born-digital data and facilitating its discovery and use. Unfortunately, born-digital data, even with its immense scale today, represents only the most recent data acquisitions, leaving a large proportion of the historical data record of the science "out in the cold." Additionally, the data record in the peer-reviewed literature, whether captured directly in the literature or through the journal data archive, represents only a fraction of the reliable data collected in the geosciences. Federal and state agencies, state surveys, and private companies collect vast amounts of geoscience information and data that are not only reliable and robust, but often the only data representative of specific spatial and temporal conditions. Likewise, even some academic publications, such as senior theses, are unique sources of data, but generally have neither wide discoverability nor guarantees of longevity. As more of these 'grey' sources of information and data are born digital, they become increasingly at risk of permanent loss, not to mention poor discoverability. Numerous studies have shown that grey literature across all disciplines, including the geosciences, disappears at a rate of about 8% per year. AGI has been working to develop systems to improve both the discoverability and the preservation of the geoscience grey literature by coupling several open source platforms from the information science community. We will detail the rationale, the technical and legal frameworks for these systems, and the long-term strategies for improving access, use, and stability of these critical data sources.

  4. Researcher-library collaborations: Data repositories as a service for researchers.

    PubMed

    Gordon, Andrew S; Millman, David S; Steiger, Lisa; Adolph, Karen E; Gilmore, Rick O

    New interest has arisen in organizing, preserving, and sharing the raw materials, the data and metadata, that undergird the published products of research. Library and information scientists have valuable expertise to bring to bear in the effort to create larger, more diverse, and more widely used data repositories. However, for libraries to be maximally successful in providing the research data management and preservation services required of a successful data repository, librarians must work closely with researchers and learn about their data management workflows. Databrary is a data repository that is closely linked to the needs of a specific scholarly community: researchers who use video as a main source of data to study child development and learning. The project's success to date is a result of its focus on community outreach and providing services for scholarly communication, engaging institutional partners, offering services for data curation with the guidance of closely involved information professionals, and the creation of a strong technical infrastructure. Databrary plans to improve its curation tools that allow researchers to deposit their own data, enhance the user-facing feature set, increase integration with library systems, and implement strategies for long-term sustainability.

  5. Analysis of CERN computing infrastructure and monitoring data

    NASA Astrophysics Data System (ADS)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of the LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal of bringing together data sources from different services and on different abstraction levels, and of implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single service boundaries and the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting an efficient storage format for MapReduce and external access, and will describe the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between the CPU/wall fraction, the latency/throughput constraints of network and disk, and the effective job throughput. In this contribution we will first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.
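
    A toy sketch of the cleaning-and-aggregation step described above, with hypothetical log file and column names; pandas stands in here for the Hadoop-ecosystem tooling the group actually uses.

    ```python
    import pandas as pd

    # Hypothetical newline-delimited JSON monitoring log.
    raw = pd.read_json("batch_monitoring.jsonl", lines=True)

    # Clean: parse timestamps and drop malformed records.
    raw["ts"] = pd.to_datetime(raw["timestamp"], errors="coerce")
    clean = raw.dropna(subset=["ts", "service"])

    # Aggregate per service and day, then store in a columnar format
    # (Parquet; requires pyarrow) suited to both batch jobs and ad hoc access.
    daily = (clean.assign(day=clean["ts"].dt.date)
                  .groupby(["day", "service"])
                  .agg(mean_cpu_wall=("cpu_wall_ratio", "mean"),
                       jobs=("job_id", "count"))
                  .reset_index())
    daily.to_parquet("daily_summary.parquet", index=False)
    ```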

  6. 75 FR 63080 - Interim Final Rule for Reporting Pre-Enactment Swap Transactions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ... registered swap data repository (``SDR'') \\1\\ or to the Commission by the compliance date to be established... to the terms of such swaps. \\1\\ The term ``swap data repository'' is defined in Section 1a(48) of the... the date of the enactment of this subsection shall be reported to a registered swap data repository or...

  7. Scaling an expert system data mart: more facilities in real-time.

    PubMed

    McNamee, L A; Launsby, B D; Frisse, M E; Lehmann, R; Ebker, K

    1998-01-01

    Clinical Data Repositories are being rapidly adopted by large healthcare organizations as a method of centralizing and unifying clinical data currently stored in diverse and isolated information systems. Once stored in a clinical data repository, healthcare organizations seek to use this centralized data to store, analyze, interpret, and influence clinical care, quality and outcomes. A recent trend in the repository field has been the adoption of data marts: specialized subsets of enterprise-wide data taken from a larger repository, designed specifically to answer highly focused questions. A data mart exploits the data stored in the repository, but can use unique structures or summary statistics generated specifically for an area of study. Thus, data marts benefit from the existence of a repository, are less general than a repository, but provide more effective and efficient support for an enterprise-wide data analysis task. In previous work, we described the use of batch processing for populating data marts directly from legacy systems. In this paper, we describe an architecture that uses both primary data sources and an evolving enterprise-wide clinical data repository to create real-time data sources for a clinical data mart to support highly specialized clinical expert systems.

  8. A Prototype Performance Assessment Model for Generic Deep Borehole Repository for High-Level Nuclear Waste - 12132

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Arnold, Bill W.; Swift, Peter N.

    2012-07-01

    A deep borehole repository is one of the four geologic disposal system options currently under study by the U.S. DOE to support the development of a long-term strategy for geologic disposal of commercial used nuclear fuel (UNF) and high-level radioactive waste (HLW). The immediate goal of the generic deep borehole repository study is to develop the necessary modeling tools to evaluate and improve the understanding of the repository system response and processes relevant to long-term disposal of UNF and HLW in a deep borehole. A prototype performance assessment model for a generic deep borehole repository has been developed using the approach for a mined geological repository. The preliminary results from the simplified deep borehole generic repository performance assessment indicate that soluble, non-sorbing (or weakly sorbing) fission product radionuclides, such as I-129, Se-79 and Cl-36, are the likely major dose contributors, and that the annual radiation doses to hypothetical future humans associated with those releases may be extremely small. While much work needs to be done to validate the model assumptions and parameters, these preliminary results highlight the importance of a robust seal design in assuring long-term isolation, and suggest that deep boreholes may be a viable alternative to mined repositories for disposal of both HLW and UNF. (authors)

  9. 77 FR 26709 - Swap Data Repositories: Interpretative Statement Regarding the Confidentiality and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ... COMMODITY FUTURES TRADING COMMISSION 17 CFR Part 49 RIN 3038-AD83 Swap Data Repositories... data repositories (``SDRs''). SDRs are new registered entities created by section 728 of the Dodd-Frank... Act amends section 1a of the CEA to add a definition of the term ``swap data repository.'' Pursuant to...

  10. Web Based Autonomous Geophysical/Hydrological Monitoring of the Gilt Edge Mine Site: Implementation and Results

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Wangerud, K.; Mattson, E.; Ankeny, M.; Richardson, A.; Heath, G.

    2005-05-01

    The Ruby Gulch repository at the Gilt Edge Mine Superfund site is a capped waste rock repository. Early in the system design, EPA and its subcontractor, the Bureau of Reclamation, recognized the need for a long-term monitoring system to provide information on the repository behavior, with the following objectives: (1) provide information on the integrity of the newly constructed surface cover and diversion system; (2) continually assess the waste's hydrological and geochemical behavior, such that rational decisions can be made for the operation of this cover and liner system; (3) give stakeholders easy access to information pertaining to system performance; and (4) integrate a variety of data sources to produce information that could be used to enhance future cover designs. Through discussions between EPA, the Bureau of Reclamation and Idaho National Laboratory, a long-term monitoring system was designed and implemented allowing EPA to meet these objectives. This system was designed to provide a cost-effective way to deal with massive amounts of data and information, subject to the following specifications: (1) data acquisition should occur autonomously and automatically; (2) data management, processing and presentation should be automated as much as possible; (3) users should be able to access all data and information remotely through a web browser. The INL long-term monitoring system integrates the data from a set of 522 resistivity electrodes consisting of 462 surface electrodes and 60 borehole electrodes (in 4 wells with 15 electrodes each), an outflow meter at the toe of the repository, an autonomous, remotely accessible weather station, and four wells (average depths of 250 feet) with thermocouples, pressure transducers and sampling ports for water and air. The monitoring system has been in operation for over a year and has collected data continuously over this period. Results from this system have shown the diurnal variation in rockmass behavior and the movement of water through the waste (allowing residence times to be estimated), and are leading to a comprehensive model of the repository behavior. Due to the sheer volume of data, a user-driven interface allows users to create their own views of the different datasets.

  11. Rolling Deck to Repository (R2R): Linking and Integrating Data for Oceanographic Research

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Chandler, C. L.; Clark, P. D.; Shepherd, A.; Moore, C.

    2012-12-01

    The Rolling Deck to Repository (R2R) program is developing infrastructure to ensure the underway sensor data from NSF-supported oceanographic research vessels are routinely and consistently documented, preserved in long-term archives, and disseminated to the science community. We have published the entire R2R Catalog as a Linked Data collection, making it easily accessible to encourage linking and integration with data at other repositories. We are developing the R2R Linked Data collection with specific goals in mind: 1.) We facilitate data access and reuse by providing the richest possible collection of resources to describe vessels, cruises, instruments, and datasets from the U.S. academic fleet, including data quality assessment results and clean trackline navigation. We are leveraging or adopting existing community-standard concepts and vocabularies, particularly concepts from the Biological and Chemical Oceanography Data Management Office (BCO-DMO) ontology and terms from the pan-European SeaDataNet vocabularies, and continually re-publish resources as new concepts and terms are mapped. 2.) We facilitate data citation through the entire data lifecycle from field acquisition to shoreside archiving to (ultimately) global syntheses and journal articles. We are implementing globally unique and persistent identifiers at the collection, dataset, and granule levels, and encoding these citable identifiers directly into the Linked Data resources. 3.) We facilitate linking and integration with other repositories that publish Linked Data collections for the U.S. academic fleet, such as BCO-DMO and the Index to Marine and Lacustrine Geological Samples (IMLGS). We are initially mapping datasets at the resource level, and plan to eventually implement rule-based mapping at the concept level. We work collaboratively with partner repositories to develop best practices for URI patterns and consensus on shared vocabularies. The R2R Linked Data collection is implemented as a lightweight "virtual RDF graph" generated on-the-fly from our SQL database using the D2RQ (http://d2rq.org) package. In addition to the default SPARQL endpoint for programmatic access, we are developing a Web-based interface from open-source software components that offers user-friendly browse and search.
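
    For example, a published Linked Data collection exposed through a SPARQL endpoint can be queried programmatically. The sketch below uses SPARQLWrapper against a hypothetical endpoint URL and vocabulary; only the access pattern, not the actual R2R schema, is shown.

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON

    # Hypothetical endpoint URL and class IRI, purely illustrative.
    sparql = SPARQLWrapper("https://data.example.org/r2r/sparql")
    sparql.setReturnFormat(JSON)
    sparql.setQuery("""
        PREFIX dcterms: <http://purl.org/dc/terms/>
        SELECT ?cruise ?title WHERE {
            ?cruise a <http://example.org/vocab/Cruise> ;
                    dcterms:title ?title .
        } LIMIT 10
    """)

    # Each binding carries the citable, persistent identifier of a cruise.
    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["cruise"]["value"], row["title"]["value"])
    ```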

  12. 36 CFR 79.7 - Methods to fund curatorial services.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...

  13. 36 CFR 79.7 - Methods to fund curatorial services.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...

  14. 36 CFR 79.7 - Methods to fund curatorial services.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...

  15. 36 CFR 79.7 - Methods to fund curatorial services.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...

  16. 36 CFR 79.7 - Methods to fund curatorial services.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...

  17. 76 FR 81950 - Privacy Act; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... ``Consolidated Data Repository'' (09-90-1000). This system of records is being amended to include records... Repository'' (SORN 09-90-1000). OIG is adding record sources to the system. This system fulfills our..., and investigations of the Medicare and Medicaid programs. SYSTEM NAME: Consolidated Data Repository...

  18. 10 CFR 60.22 - Filing and distribution of application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... GEOLOGIC REPOSITORIES Licenses License Applications § 60.22 Filing and distribution of application. (a) An application for a construction authorization for a high-level radioactive waste repository at a geologic repository operations area, and an application for a license to receive and possess source, special nuclear...

  19. A perspective on the proliferation risks of plutonium mines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyman, E.S.

    1996-05-01

    The program of geologic disposal of spent fuel and other plutonium-containing materials is increasingly becoming the target of criticism by individuals who argue that in the future, repositories may become low-cost sources of fissile material for nuclear weapons. This paper attempts to outline a consistent framework for analyzing the proliferation risks of these so-called "plutonium mines" and putting them into perspective. First, it is emphasized that the attractiveness of plutonium in a repository as a source of weapons material depends on its accessibility relative to other sources of fissile material. Then, the notion of a "material production standard" (MPS) is proposed: namely, that the proliferation risks posed by geologic disposal will be acceptable if one can demonstrate, under a number of reasonable scenarios, that the recovery of plutonium from a repository is likely to be as difficult as new production of fissile material. A preliminary analysis suggests that the range of circumstances under which current mined repository concepts would fail to meet this standard is fairly narrow. Nevertheless, a broad application of the MPS may impose severe restrictions on repository design. In this context, the relationship of repository design parameters to ease of recovery is discussed.

  20. Automating RPM Creation from a Source Code Repository

    DTIC Science & Technology

    2012-02-01

    apps/usr --with-libpq=/apps/postgres
    make
    rm -rf $RPM_BUILD_ROOT
    umask 0077
    mkdir -p $RPM_BUILD_ROOT/usr/local/bin
    mkdir -p $RPM_BUILD_ROOT...from a source code repository.
    %pre
    %prep
    %setup
    %build
    ./autogen.sh ; ./configure --with-db=/apps/db --with-libpq=/apps/postgres
    make

  1. Audit of a Scientific Data Center for Certification as a Trustworthy Digital Repository: A Case Study

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2011-12-01

    Services that preserve and enable future access to scientific data are necessary to ensure that the data that are being collected today will be available for use by future generations of scientists. Many data centers, archives, and other digital repositories are working to improve their ability to serve as long-term stewards of scientific data. Trust in sustainable data management and preservation capabilities of digital repositories can influence decisions to use these services to deposit or obtain scientific data. Building on the Open Archival Information System (OAIS) Reference Model developed by the Consultative Committee for Space Data Systems (CCSDS) and adopted by the International Organization for Standardization as ISO 14721:2003, new standards are being developed to improve long-term data management processes and documentation. The Draft Information Standard ISO/DIS 16363, "Space data and information transfer systems - Audit and certification of trustworthy digital repositories" offers the potential to evaluate digital repositories objectively in terms of their trustworthiness as long-term stewards of digital resources. In conjunction with this, the CCSDS and ISO are developing another draft standard for the auditing and certification process, ISO/DIS 16919, "Space data and information transfer systems - Requirements for bodies providing audit and certification of candidate trustworthy digital repositories". Six test audits were conducted of scientific data centers and archives in Europe and the United States to test the use of these draft standards and identify potential improvements for the standards and for the participating digital repositories. We present a case study of the test audit conducted on the NASA Socioeconomic Data and Applications Center (SEDAC) and describe the preparation, the audit process, recommendations received, and next steps to obtain certification as a trustworthy digital repository, after approval of the ISO/DIS standards.

  2. Numerical Modeling of Thermal-Hydrology in the Near Field of a Generic High-Level Waste Repository

    NASA Astrophysics Data System (ADS)

    Matteo, E. N.; Hadgu, T.; Park, H.

    2016-12-01

    Disposal in a deep geologic repository is one of the preferred options for long-term isolation of high-level nuclear waste. Coupled thermal-hydrologic processes induced by decay heat from the radioactive waste may impact fluid flow and the associated migration of radionuclides. This study looked at the effects of those processes in simulations of thermal-hydrology for the emplacement of U.S. Department of Energy managed high-level waste and spent nuclear fuel. Most of the high-level waste sources have lower thermal output, which would reduce the impact of thermal propagation. In order to quantify the thermal limits, this study concentrated on the higher thermal output sources and on spent nuclear fuel. The study assumed a generic nuclear waste repository at 500 m depth. For the modeling, a representative domain covering a portion of the repository layout was selected in order to conduct a detailed thermal analysis. A highly refined unstructured mesh was utilized, with refinements near heat sources and at intersections of different materials. Simulations looked at different values for the properties of components of the engineered barrier system (i.e. buffer, disturbed rock zone and the host rock). The simulations also looked at the effects of different durations of surface aging of the waste to reduce thermal perturbations. The PFLOTRAN code (Hammond et al., 2014) was used for the simulations. Modeling results for the different options are reported and include temperature and fluid flow profiles in the near field at different simulation times. References: G. E. Hammond, P. C. Lichtner and R. T. Mills, "Evaluating the Performance of Parallel Subsurface Simulators: An Illustrative Example with PFLOTRAN", Water Resources Research, 50, doi:10.1002/2012WR013483 (2014). Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND2016-7510 A

  3. 10 CFR 60.41 - Standards for issuance of a license.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORIES Licenses License Issuance and Amendment § 60.41 Standards for issuance of a license. A license to receive and possess source, special nuclear, or byproduct material at a geologic repository operations area may be issued by the Commission upon finding that: (a) Construction of the geologic repository...

  4. 10 CFR 60.44 - Changes, tests, and experiments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORIES Licenses License Issuance and Amendment § 60.44 Changes, tests, and experiments. (a)(1) Following authorization to receive and possess source, special nuclear, or byproduct material at a geologic repository operations area, the DOE may (i) make changes in the geologic repository operations area as described in the...

  5. Assessment of Effectiveness of Geologic Isolation Systems: REFERENCE SITE INITIAL ASSESSMENT FOR A SALT DOME REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harwell, M. A.; Brandstetter, A.; Benson, G. L.

    1982-06-01

    As a methodology demonstration for the Office of Nuclear Waste Isolation (ONWI), the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program conducted an initial reference site analysis of the long-term effectiveness of a salt dome repository. The Hainesville Salt Dome in Texas was chosen to be representative of the Gulf Coast interior salt domes; however, the Hainesville Site has been eliminated as a possible nuclear waste repository site. The data used for this exercise are not adequate for an actual assessment, nor have all the parametric analyses been made that would adequately characterize the response of the geosystem surrounding the repository. Additionally, because this was the first exercise of the complete AEGIS and Waste Rock Interaction Technology (WRIT) methodology, this report provides the initial opportunity for the methodology, specifically applied to a site, to be reviewed by the community outside the AEGIS. The scenario evaluation, as part of the methodology demonstration, involved consideration of a large variety of potentially disruptive phenomena, which alone or in concert could lead to a breach in a salt dome repository and to a subsequent transport of the radionuclides to the environment. Without waste- and repository-induced effects, no plausible natural geologic events or processes that would compromise the repository integrity could be envisioned over the one-million-year time frame after closure. Near-field (waste- and repository-induced) effects were excluded from consideration in this analysis, but they can be added in future analyses when that methodology development is more complete. The potential for consequential human intrusion into salt domes within a million-year time frame led to the consideration of a solution mining intrusion scenario. The AEGIS staff developed a specific human intrusion scenario at 100 years and 1000 years post-closure, which is one of a whole suite of possible scenarios. This scenario resulted in the delivery of radionuclide-contaminated brine to the surface, where a portion was diverted to culinary salt for direct ingestion by the existing population. Consequence analyses indicated calculated human doses that would be highly deleterious. Additional analyses indicated that doses well above background would occur from such a scenario even if it occurred a million years into the future. The way to preclude such an intrusion is continued control over the repository site, either through direct institutional control or through the effective passive transfer of information. A secondary aspect of the specific human intrusion scenario involved a breach through the side of the salt dome, through which radionuclides migrated via the ground-water system to the accessible environment. This provided a demonstration of the geotransport methodology that AEGIS can use in actual site evaluations, as well as the WRIT program's capabilities with respect to defining the source term and retardation rates of the radionuclides in the repository. This reference site analysis was initially published as a Working Document in December 1979. That version was distributed for a formal peer review by individuals and organizations not involved in its development. The present report represents a revision, based in part on the responses received from the external reviewers. Summaries of the comments from the reviewers and responses to these comments by the AEGIS staff are presented. The exercise of the AEGIS methodology was successful in demonstrating the methodology and thus in providing a basis for substantive peer review, in terms of further development of the AEGIS site-applications capability and in terms of providing insight into the potential for consequential human intrusion into a salt dome repository.

  6. Assessment of Effectiveness of Geologic Isolation Systems: REFERENCE SITE INITIAL ASSESSMENT FOR A SALT DOME REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harwell, M. A.; Brandstetter, A.; Benson, G. L.

    1982-06-01

    As a methodology demonstration for the Office of Nuclear Waste Isolation (ONWI), the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program conducted an initial reference site analysis of the long-term effectiveness of a salt dome repository. The Hainesville Salt Dome in Texas was chosen to be representative of the Gulf Coast interior salt domes; however, the Hainesville Site has been eliminated as a possible nuclear waste repository site. The data used for this exercise are not adequate for an actual assessment, nor have all the parametric analyses been made that would adequately characterize the response of the geosystem surroundingmore » the repository. Additionally, because this was the first exercise of the complete AEGIS and WASTE Rock Interaction Technology (WRIT) methodology, this report provides the initial opportunity for the methodology, specifically applied to a site, to be reviewed by the community outside the AEGIS. The scenario evaluation, as a part of the methodology demonstration, involved consideration of a large variety of potentially disruptive phenomena, which alone or in concert could lead to a breach in a salt dome repository and to a subsequent transport of the radionuclides to the environment. Without waste- and repository-induced effects, no plausible natural geologic events or processes which would compromise the repository integrity could be envisioned over the one-million-year time frame after closure. Near-field (waste- and repository-induced) effects were excluded from consideration in this analysis, but they can be added in future analyses when that methodology development is more complete. The potential for consequential human intrusion into salt domes within a million-year time frame led to the consideration of a solution mining intrusion scenario. The AEGIS staff developed a specific human intrusion scenario at 100 years and 1000 years post-closure, which is one of a whole suite of possible scenarios. This scenario resulted in the delivery of radionuclidecontaminated brine to the surface, where a portion was diverted to culinary salt for direct ingestion by the existing population. Consequence analyses indicated calculated human doses that would be highly deleterious. Additional analyses indicated that doses well above background would occur from such a scenario t even if it occurred a million years into the future. The way to preclude such an intrusion is for continued control over the repository sitet either through direct institutional control or through the effective passive transfer of information. A secondary aspect of the specific human intrusion scenario involved a breach through the side of the salt dome t through which radionuclides migrated via the ground-water system to the accessible environment. This provided a demonstration of the geotransport methodology that AEGIS can use in actual site evaluations, as well as the WRIT program's capabilities with respect to defining the source term and retardation rates of the radionuclides in the repository. This reference site analysis was initially published as a Working Document in December 1979. That version was distributed for a formal peer review by individuals and organizations not involved in its development. The present report represents a revisiont based in part on the responses received from the external reviewers. Summaries of the comments from the reviewers and responses to these comments by the AEGIS staff are presented. 
The exercise of the AEGIS methodology was successful in demonstrating the methodology and, thus, in providing a basis for substantive peer review, in terms of further development of the AEGIS site-applications capability and in terms of providing insight into the potential for consequential human intrusion into a salt dome repository.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yifeng; Xu, Huifang

    Correctly identifying the possible alteration products and accurately predicting their occurrence in a repository-relevant environment are key to the source-term calculation in a repository performance assessment. Uraninite in uranium deposits has long been used as a natural analog to spent fuel in a repository because of their chemical and structural similarity. In this paper, a SEM/AEM investigation has been conducted on a partially altered uraninite sample from the Shinkolobwe uranium ore deposit in Congo. The following mineral formation sequences were identified: uraninite → uranyl hydrates → uranyl silicates → Ca-uranyl silicates, or uraninite → uranyl silicates → Ca-uranyl silicates. Reaction-path calculations were conducted for the oxidative dissolution of spent fuel in a representative Yucca Mountain groundwater. The predicted sequence is in general consistent with the SEM observations. The calculations also show that uranium carbonate minerals are unlikely to become major solubility-controlling mineral phases in a Yucca Mountain environment. Some discrepancies between model predictions and field observations are observed. Those discrepancies may result from poorly constrained thermodynamic data for uranyl silicate minerals.
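
    The consistency check between a predicted paragenesis and an observed alteration sequence reduces to an ordered-subsequence test. A minimal Python sketch, with mineral names taken from the abstract and the test logic an illustrative assumption rather than the study's method:

        # Minimal sketch: check whether a SEM-observed alteration sequence is
        # consistent with a reaction-path prediction, i.e. appears as an
        # ordered subsequence of the predicted paragenesis.

        def is_ordered_subsequence(observed, predicted):
            """True if every observed phase occurs in the predicted
            sequence in the same relative order."""
            it = iter(predicted)           # membership tests consume the iterator,
            return all(p in it for p in observed)  # enforcing the ordering

        predicted = ["uraninite", "uranyl hydrates",
                     "uranyl silicates", "Ca-uranyl silicates"]
        observed = ["uraninite", "uranyl silicates", "Ca-uranyl silicates"]

        print(is_ordered_subsequence(observed, predicted))  # True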

  8. ODMedit: uniform semantic annotation for data integration in medicine based on a public metadata repository.

    PubMed

    Dugas, Martin; Meidt, Alexandra; Neuhaus, Philipp; Storck, Michael; Varghese, Julian

    2016-06-01

    The volume and complexity of patient data - especially in personalised medicine - are steadily increasing, both regarding clinical data and genomic profiles: Typically more than 1,000 items (e.g., laboratory values, vital signs, diagnostic tests) are collected per patient in clinical trials. In oncology, hundreds of mutations can potentially be detected for each patient by genomic profiling. Therefore, data integration from multiple sources constitutes a key challenge for medical research and healthcare. Semantic annotation of data elements can facilitate the identification of matching data elements in different sources and thereby support data integration. Millions of different annotations are required due to the semantic richness of patient data. These annotations should be uniform, i.e., two matching data elements shall contain the same annotations. However, large terminologies like SNOMED CT or UMLS do not provide uniform coding. It is proposed to develop semantic annotations of medical data elements based on a large-scale public metadata repository. To achieve uniform codes, semantic annotations shall be re-used if a matching data element is available in the metadata repository. A web-based tool called ODMedit ( https://odmeditor.uni-muenster.de/ ) was developed to create data models with uniform semantic annotations. It contains ~800,000 terms with semantic annotations which were derived from ~5,800 models from the portal of medical data models (MDM). The tool was successfully applied to manually annotate 22 forms with 292 data items from CDISC and to update 1,495 data models of the MDM portal. Uniform manual semantic annotation of data models is feasible in principle, but requires a large-scale collaborative effort due to the semantic richness of patient data. A web-based tool for these annotations is available, which is linked to a public metadata repository.
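
    The annotation-reuse rule described above (reuse existing codes when a matching element is already in the repository, otherwise curate and store) can be sketched as follows; the field names, codes, and lookup logic are illustrative assumptions, not ODMedit's actual API:

        # Minimal sketch of annotation reuse against a metadata repository:
        # matching data elements receive identical semantic codes.

        repository = {
            "systolic blood pressure": ["UMLS:C0871470"],   # illustrative codes
            "body weight": ["UMLS:C0005910"],
        }

        def manual_annotation(key):
            # Placeholder for the human curation step.
            return [f"UMLS:UNASSIGNED({key})"]

        def annotate(item_name, repository):
            key = item_name.strip().lower()
            if key in repository:
                return repository[key]      # reuse existing annotation
            codes = manual_annotation(key)  # otherwise annotate manually...
            repository[key] = codes         # ...and store for future reuse
            return codes

        print(annotate("Body Weight", repository))  # ['UMLS:C0005910']
        print(annotate("heart rate", repository))   # ['UMLS:UNASSIGNED(heart rate)']

    The point of the lookup-before-curation order is uniformity: two forms containing "Body Weight" can never end up with diverging codes.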

  9. Terminology development towards harmonizing multiple clinical neuroimaging research repositories.

    PubMed

    Turner, Jessica A; Pasquerello, Danielle; Turner, Matthew D; Keator, David B; Alpert, Kathryn; King, Margaret; Landis, Drew; Calhoun, Vince D; Potkin, Steven G; Tallis, Marcelo; Ambite, Jose Luis; Wang, Lei

    2015-07-01

    Data sharing and mediation across disparate neuroimaging repositories require extensive effort to ensure that the different domains of data types are referred to by commonly agreed upon terms. Within the SchizConnect project, which enables querying across decentralized databases of neuroimaging, clinical, and cognitive data from various studies of schizophrenia, we developed a model for each data domain, identified common usable terms that could be agreed upon across the repositories, and linked them to standard ontological terms where possible. We had the goal of facilitating both the current user experience in querying and future automated computations and reasoning regarding the data. We found that existing terminologies are incomplete for these purposes, even with the history of neuroimaging data sharing in the field, and we provide a model for efforts focused on querying multiple clinical neuroimaging repositories.

  10. Terminology development towards harmonizing multiple clinical neuroimaging research repositories

    PubMed Central

    Turner, Jessica A.; Pasquerello, Danielle; Turner, Matthew D.; Keator, David B.; Alpert, Kathryn; King, Margaret; Landis, Drew; Calhoun, Vince D.; Potkin, Steven G.; Tallis, Marcelo; Ambite, Jose Luis; Wang, Lei

    2015-01-01

    Data sharing and mediation across disparate neuroimaging repositories require extensive effort to ensure that the different domains of data types are referred to by commonly agreed upon terms. Within the SchizConnect project, which enables querying across decentralized databases of neuroimaging, clinical, and cognitive data from various studies of schizophrenia, we developed a model for each data domain, identified common usable terms that could be agreed upon across the repositories, and linked them to standard ontological terms where possible. We had the goal of facilitating both the current user experience in querying and future automated computations and reasoning regarding the data. We found that existing terminologies are incomplete for these purposes, even with the history of neuroimaging data sharing in the field, and we provide a model for efforts focused on querying multiple clinical neuroimaging repositories. PMID:26688838
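
    A minimal sketch of the mediation idea described in the two records above: each repository's local field names map to an agreed common term, which links to a standard ontology identifier where one exists. All mappings shown are illustrative assumptions, not the actual SchizConnect model:

        # Minimal sketch of terminology mediation across repositories.

        COMMON_TERMS = {
            "diagnosis": "NCIT:C15220",   # linked ontology ID (illustrative)
            "t1_weighted_mri": None,      # no suitable standard term found
        }

        LOCAL_TO_COMMON = {
            ("repoA", "dx_code"): "diagnosis",
            ("repoB", "DIAGNOSIS"): "diagnosis",
            ("repoA", "T1"): "t1_weighted_mri",
        }

        def mediate(source, local_field):
            common = LOCAL_TO_COMMON.get((source, local_field))
            ontology = COMMON_TERMS.get(common) if common else None
            return common, ontology

        print(mediate("repoB", "DIAGNOSIS"))  # ('diagnosis', 'NCIT:C15220')
        print(mediate("repoA", "T1"))         # ('t1_weighted_mri', None)

    The None entry mirrors the records' finding that existing terminologies are incomplete: some common terms have no standard ontological anchor.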

  11. Source and long-term behavior of transuranic aerosols in the WIPP environment.

    PubMed

    Thakur, P; Lemons, B G

    2016-10-01

    The source and long-term behavior of transuranic aerosols ((239+240)Pu, (238)Pu, and (241)Am) in ambient air samples collected at and near the Waste Isolation Pilot Plant (WIPP) deep geologic repository site were investigated using historical data from an independent monitoring program conducted by the Carlsbad Environmental Monitoring and Research Center and an oversight monitoring program conducted by the management and operating contractor for WIPP at and near the facility. An analysis of historical data indicates frequent detections of (239+240)Pu and (241)Am, whereas (238)Pu is detected infrequently. Peaks in (239+240)Pu and (241)Am concentrations in ambient air generally occur in the March-to-June timeframe, when strong and gusty winds in the area frequently give rise to blowing dust. Long-term measurements of plutonium isotopes (1985-2015) in the WIPP environment suggest that the resuspension of previously contaminated soils is likely the primary source of plutonium in the ambient air samples from WIPP and its vicinity. There is no evidence that WIPP is a source of environmental contamination that can be considered significant by any health-based standard.

  12. Building Specialized Multilingual Lexical Graphs Using Community Resources

    NASA Astrophysics Data System (ADS)

    Daoud, Mohammad; Boitet, Christian; Kageura, Kyo; Kitamoto, Asanobu; Mangeot, Mathieu; Daoud, Daoud

    We describe methods for compiling domain-dedicated multilingual terminological data from various resources. We focus on collecting data from online community users as a main source; therefore, our approach depends on acquiring contributions from volunteers (explicit approach) and on analyzing users' behaviors to extract interesting patterns and facts (implicit approach). As a generic repository that can handle the collected multilingual terminological data, we describe the concept of dedicated Multilingual Preterminological Graphs (MPGs) and some automatic approaches for constructing them by analyzing the behavior of online community users. A Multilingual Preterminological Graph is a special lexical resource that contains a massive number of terms related to a special domain. We call it preterminological because it is a raw material that can be used to build a standardized terminological repository. Building such a graph is difficult using traditional approaches, as it needs huge efforts by domain specialists and terminologists. In our approach, we build such a graph by analyzing the access log files of the website of the community, finding the important terms that have been used to search in that website, and recording their associations with each other. We aim to make this graph a seed repository to which multilingual volunteers can contribute. We are experimenting with this approach in the Digital Silk Road Project. We have used its access log files since its beginning in 2003 and obtained an initial graph of around 116,000 terms. As an application, we used this graph to obtain a preterminological multilingual database that serves a CLIR system for the DSR project.
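
    A minimal sketch of the graph-construction step described above: search terms extracted from access logs are linked when they co-occur, with edge weights counting co-occurrences. The log format and session grouping are simplifying assumptions, not the project's actual pipeline:

        # Minimal sketch: build a preterminological graph from search logs.
        from collections import defaultdict
        from itertools import combinations

        sessions = [                      # terms extracted per user session
            ["silk road", "caravanserai", "dunhuang"],
            ["silk road", "dunhuang"],
        ]

        edges = defaultdict(int)
        for terms in sessions:
            for a, b in combinations(sorted(set(terms)), 2):
                edges[(a, b)] += 1        # association strength between terms

        for (a, b), w in sorted(edges.items(), key=lambda kv: -kv[1]):
            print(f"{a} -- {b}: {w}")

    Frequently co-searched term pairs surface first, which is the "important terms and their associations" signal the abstract refers to.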

  13. Organizing Scientific Data Sets: Studying Similarities and Differences in Metadata and Subject Term Creation

    ERIC Educational Resources Information Center

    White, Hollie C.

    2012-01-01

    Background: According to Salo (2010), the metadata entered into repositories are "disorganized" and metadata schemes underlying repositories are "arcane". This creates a challenging repository environment in regards to personal information management (PIM) and knowledge organization systems (KOSs). This dissertation research is…

  14. TRAC Searchable Research Library

    DTIC Science & Technology

    2016-05-01

    network-accessible document repository for technical documents and similar document artifacts. We used a model-based approach using the Vector... demonstration and model refinement. Subject terms: Knowledge Management, Document Repository, Digital Library, Vector Directional Data Model...

  15. Core Certification of Data Repositories: Trustworthiness and Long-Term Stewardship

    NASA Astrophysics Data System (ADS)

    de Sherbinin, A. M.; Mokrane, M.; Hugo, W.; Sorvari, S.; Harrison, S.

    2017-12-01

    Scientific integrity and norms dictate that data created and used by scientists should be managed, curated, and archived in trustworthy data repositories, thus ensuring that science is verifiable and reproducible while preserving the initial investment in collecting data. Research stakeholders including researchers, science funders, librarians, and publishers must also be able to establish the trustworthiness of data repositories they use to confirm that the data they submit and use remain useful and meaningful in the long term. Data repositories are increasingly recognized as a key element of the global research infrastructure and the importance of establishing their trustworthiness is recognised as a prerequisite for efficient scientific research and data sharing. The Core Trustworthy Data Repository Requirements are a set of universal requirements for certification of data repositories at the core level (see: https://goo.gl/PYsygW). They were developed by the ICSU World Data System (WDS: www.icsu-wds.org) and the Data Seal of Approval (DSA: www.datasealofapproval.org)—the two authoritative organizations responsible for the development and implementation of this standard, which will be further developed under the CoreTrustSeal branding. CoreTrustSeal certification of data repositories involves a minimally intensive process whereby repositories supply evidence that they are sustainable and trustworthy. Repositories conduct a self-assessment, which is then reviewed by community peers. Based on this review, CoreTrustSeal certification is granted by the CoreTrustSeal Standards and Certification Board. Certification helps data communities—producers, repositories, and consumers—to improve the quality and transparency of their processes, and to increase awareness of and compliance with established standards. This presentation will introduce the CoreTrustSeal certification requirements for repositories and offer an opportunity to discuss ways to improve the contribution of certified data repositories to sustain open data for open scientific research.

  16. Spent fuel radionuclide source-term model for assessing spent fuel performance in geological disposal. Part I: Assessment of the instant release fraction

    NASA Astrophysics Data System (ADS)

    Johnson, Lawrence; Ferry, Cécile; Poinssot, Christophe; Lovera, Patrick

    2005-11-01

    A source-term model for the short-term release of radionuclides from spent nuclear fuel (SNF) has been developed. It provides quantitative estimates of the fraction of various radionuclides that are expected to be released rapidly (the instant release fraction, or IRF) when water contacts the UO2 or MOX fuel after container breaching in a geological repository. The estimates are based on correlation of leaching data for radionuclides with fuel burnup and fission gas release. Extrapolation of the data to higher fuel burnup values is based on examination of data on fuel restructuring, such as rim development, and on fission gas release data, which permits bounding IRF values to be estimated assuming that radionuclide releases will be less than fission gas release. The consideration of long-term solid-state changes influencing the IRF prior to canister breaching is addressed by evaluating alpha self-irradiation enhanced diffusion, which may gradually increase the accumulation of fission products at grain boundaries.
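
    A worked illustration of the bounding argument above: the IRF of a fission product is taken as no larger than the fission gas release (FGR), which itself scales with burnup. The linear FGR-burnup relation and its coefficients are illustrative assumptions, not the published correlation:

        # Minimal sketch: bound the instant release fraction by fission gas
        # release, with FGR an assumed linear function of burnup.

        def fgr_fraction(burnup_GWd_per_tHM, slope=0.002, intercept=0.01):
            """Illustrative FGR (fraction of inventory) vs. burnup."""
            return intercept + slope * burnup_GWd_per_tHM

        def bounding_irf(burnup_GWd_per_tHM):
            """Bounding IRF: radionuclide release assumed <= FGR."""
            return min(1.0, fgr_fraction(burnup_GWd_per_tHM))

        for bu in (40, 60, 75):
            print(f"burnup {bu} GWd/tHM -> bounding IRF {bounding_irf(bu):.3f}")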

  17. The Athabasca University eduSource Project: Building an Accessible Learning Object Repository

    ERIC Educational Resources Information Center

    Cleveland-Innes, Martha; McGreal, Rory; Anderson, Terry; Friesen, Norm; Ally, Mohamed; Tin, Tony; Graham, Rodger; Moisey, Susan; Petrinjak, Anita; Schafer, Steve

    2005-01-01

    Athabasca University--Canada's Open University (AU) made the commitment to put all of its courses online as part of its Strategic University Plan. In pursuit of this goal, AU participated in the eduSource project, a pan-Canadian effort to build the infrastructure for an interoperable network of learning object repositories. AU acted as a leader in…

  18. Enriching text with images and colored light

    NASA Astrophysics Data System (ADS)

    Sekulovski, Dragan; Geleijnse, Gijs; Kater, Bram; Korst, Jan; Pauws, Steffen; Clout, Ramon

    2008-01-01

    We present an unsupervised method to enrich textual applications with relevant images and colors. The images are collected by querying large image repositories and subsequently the colors are computed using image processing. A prototype system based on this method is presented where the method is applied to song lyrics. In combination with a lyrics synchronization algorithm, the system produces a rich multimedia experience. In order to identify terms within the text that may be associated with images and colors, we select noun phrases using a part-of-speech tagger. Large image repositories are queried with these terms. Per term, representative colors are extracted from the collected images. To this end, we use either a histogram-based or a mean-shift-based algorithm. The representative color extraction uses the non-uniform distribution of the colors found in the large repositories. The images that are ranked best by the search engine are displayed on a screen, while the extracted representative colors are rendered on controllable lighting devices in the living room. We evaluate our method by comparing the computed colors to standard color representations of a set of English color terms. A second evaluation focuses on the distance in color between a queried term in English and its translation in a foreign language. Based on results from three sets of terms, a measure of suitability of a term for color extraction based on KL Divergence is proposed. Finally, we compare the performance of the algorithm using either the automatically indexed repository of Google Images or the manually annotated Flickr.com. Based on the results of these experiments, we conclude that using the presented method we can compute the relevant color for a term using a large image repository and image processing.
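
    A minimal sketch of the histogram-based step and the KL-based suitability idea described above: pool the pixels of images retrieved for a term, histogram them in a coarse color space, take the modal bin as the representative color, and score suitability by divergence from a uniform color distribution. The pixel data are synthetic stand-ins for repository images, and the exact scoring is an assumption, not the paper's formula:

        # Minimal sketch: representative color via a coarse histogram, plus a
        # KL-from-uniform suitability score.
        import math
        from collections import Counter

        def quantize(rgb, bins=4):
            return tuple(min(c * bins // 256, bins - 1) for c in rgb)

        def color_histogram(pixels, bins=4):
            counts = Counter(quantize(p, bins) for p in pixels)
            total = sum(counts.values())
            return {k: v / total for k, v in counts.items()}

        def kl_from_uniform(hist, bins=4):
            uniform = 1.0 / bins**3
            return sum(p * math.log(p / uniform) for p in hist.values())

        # Synthetic "red" term: mostly red pixels, a few green outliers.
        pixels = [(220, 30, 30)] * 80 + [(200, 60, 40)] * 15 + [(10, 200, 10)] * 5
        hist = color_histogram(pixels)
        print("representative bin:", max(hist, key=hist.get))
        print("suitability (KL from uniform): %.2f" % kl_from_uniform(hist))

    A term whose retrieved images concentrate in few color bins (high KL) is a better candidate for color extraction than one with a near-uniform distribution.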

  19. New directions in medical e-curricula and the use of digital repositories.

    PubMed

    Fleiszer, David M; Posel, Nancy H; Steacy, Sean P

    2004-03-01

    Medical educators involved in the growth of multimedia-enhanced e-curricula are increasingly aware of the need for digital repositories to catalogue, store and ensure access to learning objects that are integrated within their online material. The experience at the Faculty of Medicine at McGill University during initial development of a mainstream electronic curriculum reflects this growing recognition that repositories can facilitate the development of a more comprehensive and effective electronic curriculum. Also, digital repositories can help to ensure efficient utilization of resources through the use, re-use, and reprocessing of multimedia learning objects, addressing the potential for collaboration among repositories and increasing available material exponentially. The authors review different approaches to the development of a digital repository application, as well as global and specific issues that should be examined in the initial requirements definition and development phase, to ensure current initiatives meet long-term requirements. Often, decisions regarding creation of e-curricula and associated digital repositories are left to interested faculty and their individual development teams. However, the development of an e-curriculum and digital repository is not predominantly a technical exercise, but rather one that affects global pedagogical strategies and curricular content and involves a commitment of large-scale resources. Outcomes of these decisions can have long-term consequences and, as such, should involve faculty at the highest levels, including the dean.

  20. From a Content Delivery Portal to a Knowledge Management System for Standardized Cancer Documentation.

    PubMed

    Schlue, Danijela; Mate, Sebastian; Haier, Jörg; Kadioglu, Dennis; Prokosch, Hans-Ulrich; Breil, Bernhard

    2017-01-01

    Heterogeneous tumor documentation and the attendant challenges in interpreting medical terms lead to problems in analyzing data from clinical and epidemiological cancer registries. The objective of this project was to design, implement and improve a national content delivery portal for oncological terms. Data elements of existing handbooks and documentation sources were analyzed, combined and summarized by medical experts from different comprehensive cancer centers. Informatics experts created a generic data model based on an existing metadata repository. In order to establish a national knowledge management system for standardized cancer documentation, a prototypical tumor wiki was designed and implemented. Requirements engineering techniques were applied to optimize this platform. It targets user groups such as documentation officers, physicians and patients. Links to other information sources such as PubMed and MeSH were also implemented.

  1. 75 FR 78892 - Reporting Certain Post-Enactment Swap Transactions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-17

    ...)--to a registered swap data repository (``SDR'') or to the Commission. Each category of data is subject... expired by that date. \\5\\ The term ``swap data repository'' is defined in Section 1a(48) of the CEA to... date of the enactment of this subsection shall be reported to a registered swap data repository or the...

  2. AdaNET phase 0 support for the AdaNET Dynamic Software Inventory (DSI) management system prototype. Catalog of available reusable software components

    NASA Technical Reports Server (NTRS)

    Hanley, Lionel

    1989-01-01

    The Ada Software Repository is a public-domain collection of Ada software and information. The Ada Software Repository is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and it has been available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories. These directories are organized by topic; their names and brief overviews of their topics are provided. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.

  3. Digital Repositories and the Question of Data Usefulness

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Downs, R. R.

    2017-12-01

    The advent of ISO standards for trustworthy long-term digital repositories provides both a set of principles to develop long-term data repositories and the instruments to assess them for trustworthiness. Such mandatory high-level requirements are broad enough to be achievable, to some extent, by many scientific data centers, archives, and other repositories. But the requirement that the data be useful in the future, the requirement that is usually considered to be most relevant to the value of the repository for its user communities, largely remains subject to various interpretations and misunderstandings. However, current and future users will be relying on repositories to preserve and disseminate the data and information needed to discover, understand, and utilize these resources to support their research, learning, and decision-making objectives. Therefore, further study is needed to determine the approaches that can be adopted by repositories to make data useful to future communities of users. This presentation will describe approaches for enabling scientific data and related information, such as software, to be useful for current and potential future user communities and will present the methodology chosen to make one science discipline's data useful for both current and future users. The method uses an ontology-based information model to define and capture the information necessary to make the data useful for contemporary and future users.

  4. Random Variable Read Me File

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Sankararaman, Shankar; Cullo, Aiden

    2017-01-01

    Readme for the Random Variable Toolbox. GitHub is a Web-based Git version control repository hosting service. It is mostly used for computer code. It offers all of the distributed version control and source code management (SCM) functionality of Git as well as adding its own features. It provides access control and several collaboration features such as bug tracking, feature requests, task management, and wikis for every project. GitHub offers both plans for private and free repositories on the same account, which are commonly used to host open-source software projects. As of April 2017, GitHub reports having almost 20 million users and 57 million repositories, making it the largest host of source code in the world. GitHub has a mascot called Octocat, a cat with five tentacles and a human-like face.

  5. Repository contributions to Rubus research

    USDA-ARS?s Scientific Manuscript database

    The USDA National Plant Germplasm System is a nation-wide source for global genetic resources. The National Clonal Germplasm Repository (NCGR) in Corvallis, OR, maintains crops and crop wild relatives for the Willamette Valley including pear, raspberry and blackberry, strawberry, blueberry, gooseber...

  6. SeisCode: A seismological software repository for discovery and collaboration

    NASA Astrophysics Data System (ADS)

    Trabant, C.; Reyes, C. G.; Clark, A.; Karstens, R.

    2012-12-01

    SeisCode is a community repository for software used in seismological and related fields. The repository is intended to increase discoverability of such software and to provide a long-term home for software projects. Other places exist where seismological software may be found, but none meet the requirements necessary for an always current, easy to search, well documented, and citable resource for projects. Organizations such as IRIS, ORFEUS, and the USGS have websites with lists of available or contributed seismological software. Since the authors themselves often do not maintain these lists, the documentation often consists of a sentence or paragraph, and the available software may be outdated. Repositories such as GoogleCode and SourceForge, which are directly maintained by the authors, provide version control and issue tracking but do not provide a unified way of locating geophysical software scattered in and among countless unrelated projects. Additionally, projects are hosted at language-specific sites such as Mathworks and PyPI, in FTP directories, and in websites strewn across the Web. Search engines are only partially effective discovery tools, as the desired software is often hidden deep within the results. SeisCode provides software authors a place to present their software, codes, scripts, tutorials, and examples to the seismological community. Authors can choose their own level of involvement. At one end of the spectrum, the author might simply create a web page that points to an existing site. At the other extreme, an author may choose to leverage the many tools provided by SeisCode, such as a source code management tool with integrated issue tracking, forums, news feeds, downloads, wikis, and more. For software development projects with multiple authors, SeisCode can also be used as a central site for collaboration. SeisCode provides the community with an easy way to discover software, while providing authors a way to build a community around their software packages. IRIS invites the seismological community to browse and to submit projects to https://seiscode.iris.washington.edu/

  7. 75 FR 33312 - Indexing Structured Product Labeling for Human Prescription Drug and Biological Products; Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-11

    ... for use as terms to search repositories of approved prescription medical product structured product... access repository. Considering FDA's available resources, we have instituted a phased implementation of...

  8. The Tropical and Subtropical Germplasm Repositories of The National Germplasm System

    USDA-ARS?s Scientific Manuscript database

    Germplasm collections are viewed as a source of genetic diversity to support crop improvement, agricultural research, and germplasm conservation efforts. The United States Department of Agriculture's National Plant Germplasm System (NPGS) is responsible for administering plant genetic ...

  9. Making research data repositories visible: the re3data.org Registry.

    PubMed

    Pampel, Heinz; Vierkant, Paul; Scholze, Frank; Bertelmann, Roland; Kindling, Maxi; Klump, Jens; Goebelbecker, Hans-Jürgen; Gundlach, Jens; Schirmbacher, Peter; Dierolf, Uwe

    2013-01-01

    Researchers require infrastructures that ensure a maximum of accessibility, stability and reliability to facilitate working with and sharing of research data. Such infrastructures are being increasingly summarized under the term Research Data Repositories (RDR). The project re3data.org–Registry of Research Data Repositories–began indexing research data repositories in 2012 and offers researchers, funding organizations, libraries and publishers an overview of the heterogeneous research data repository landscape. As of July 2013, re3data.org lists 400 research data repositories and counting. 288 of these are described in detail using the re3data.org vocabulary. Information icons help researchers to easily identify an adequate repository for the storage and reuse of their data. This article describes the heterogeneous RDR landscape and presents a typology of institutional, disciplinary, multidisciplinary and project-specific RDR. Further, the article outlines the features of re3data.org and shows how this registry helps to identify appropriate repositories for the storage and search of research data.

  10. Traits and types of health data repositories.

    PubMed

    Wade, Ted D

    2014-01-01

    We review traits of reusable clinical data and offer a typology of clinical repositories with a range of known examples. Sources of clinical data suitable for research can be classified into types reflecting the data's institutional origin, original purpose, level of integration, and governance. Primary data nearly always come from research studies and electronic medical records. Registries collect data on focused populations primarily to track outcomes, often using observational research methods. Warehouses are institutional information utilities repackaging clinical care data. Collections organize data from more organizations than a data warehouse, and from more original data sources than a registry; therefore, even if they are heavily curated, their level of internal integration, and thus ease of use, can be lower than that of other types. Federations are like collections except that physical control over data is distributed among donor organizations. Federations sometimes federate, giving a second level of organization. While the size, in number of patients, varies widely within each type of data source, populations over 10,000 are relatively numerous, and much larger populations can be seen in warehouses and federations. One imagined ideal structure for research progress has been called an "Information Commons". It would have longitudinal, multi-leveled (environmental through molecular) data on a large population of identified, consenting individuals. These are qualities whose achievement would require long-term commitment on the part of many data donors, including a willingness to make their data public.

  11. Adapting a Clinical Data Repository to ICD-10-CM through the use of a Terminology Repository

    PubMed Central

    Cimino, James J.; Remennick, Lyubov

    2014-01-01

    Clinical data repositories frequently contain patient diagnoses coded with the International Classification of Diseases, Ninth Revision (ICD-9-CM). These repositories now need to accommodate data coded with the Tenth Revision (ICD-10-CM). Database users wish to retrieve relevant data regardless of the system by which they are coded. We demonstrate how a terminology repository (the Research Entities Dictionary or RED) serves as an ontology relating terms of both ICD versions to each other to support seamless version-independent retrieval from the Biomedical Translational Research Information System (BTRIS) at the National Institutes of Health. We make use of the Centers for Medicare and Medicaid Services’ General Equivalence Mappings (GEMs) to reduce the modeling effort required to determine whether ICD-10-CM terms should be added to the RED as new concepts or as synonyms of existing concepts. A divide-and-conquer approach is used to develop integration heuristics that offer a satisfactory interim solution and facilitate additional refinement of the integration as time and resources allow. PMID:25954344
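
    A minimal sketch of version-independent retrieval as described above: codes from both ICD revisions are mapped (e.g., via equivalence mappings) to a shared concept, and queries run against concepts rather than codes. The concept identifiers and mappings are illustrative, not RED or GEMs content:

        # Minimal sketch: retrieve records by concept, regardless of the ICD
        # version in which the diagnosis was originally coded.

        CODE_TO_CONCEPT = {
            ("ICD9", "250.00"): "RED:diabetes_mellitus_type2",   # illustrative
            ("ICD10", "E11.9"): "RED:diabetes_mellitus_type2",
        }

        records = [
            {"patient": 1, "system": "ICD9", "code": "250.00"},
            {"patient": 2, "system": "ICD10", "code": "E11.9"},
        ]

        def retrieve(concept, records):
            return [r["patient"] for r in records
                    if CODE_TO_CONCEPT.get((r["system"], r["code"])) == concept]

        print(retrieve("RED:diabetes_mellitus_type2", records))  # [1, 2]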

  12. Personalized reminiscence therapy M-health application for patients living with dementia: Innovating using open source code repository.

    PubMed

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

    Dementia is known to be an illness that brings marked disability among elderly individuals. At times, patients living with dementia also experience non-cognitive symptoms, which include hallucinations, delusional beliefs, emotional lability, sexualized behaviours and aggression. According to the National Institute of Clinical Excellence (NICE) guidelines, non-pharmacological techniques are typically the first-line option prior to the consideration of adjuvant pharmacological options. Reminiscence and music therapy are thus viable options. Lazar et al. [3] previously performed a systematic review of the use of technology to deliver reminiscence-based therapy to individuals living with dementia and highlighted that technology does have benefits in the delivery of reminiscence therapy. However, to date, there has been a paucity of M-health innovations in this area. In addition, most current innovations are not personalized for each person living with dementia. Prior research has highlighted the utility of open source repositories in bioinformatics research. The authors explain how they tapped into and made use of an open source repository in the development of a personalized M-health reminiscence therapy innovation for patients living with dementia. The availability of open source code repositories has changed the way healthcare professionals and developers develop smartphone applications today. Conventionally, a long iterative process is needed in the development of a native application, mainly because of the need for native programming and coding, especially if the application needs interactive features or features that can be personalized. Such repositories enable the rapid and cost-effective development of applications. Moreover, developers are also able to innovate further, as less time is spent in the iterative process.

  13. Food entries in a large allergy data repository

    PubMed Central

    Plasek, Joseph M.; Goss, Foster R.; Lai, Kenneth H.; Lau, Jason J.; Seger, Diane L.; Blumenthal, Kimberly G.; Wickner, Paige G.; Slight, Sarah P.; Chang, Frank Y.; Topaz, Maxim; Bates, David W.

    2016-01-01

    Objective Accurate food adverse sensitivity documentation in electronic health records (EHRs) is crucial to patient safety. This study examined, encoded, and grouped foods that caused any adverse sensitivity in a large allergy repository using natural language processing and standard terminologies. Methods Using the Medical Text Extraction, Reasoning, and Mapping System (MTERMS), we processed both structured and free-text entries stored in an enterprise-wide allergy repository (Partners’ Enterprise-wide Allergy Repository), normalized diverse food allergen terms into concepts, and encoded these concepts using the Systematized Nomenclature of Medicine – Clinical Terms (SNOMED-CT) and Unique Ingredient Identifiers (UNII) terminologies. Concept coverage also was assessed for these two terminologies. We further categorized allergen concepts into groups and calculated the frequencies of these concepts by group. Finally, we conducted an external validation of MTERMS’s performance when identifying food allergen terms, using a randomized sample from a different institution. Results We identified 158 552 food allergen records (2140 unique terms) in the Partners repository, corresponding to 672 food allergen concepts. High-frequency groups included shellfish (19.3%), fruits or vegetables (18.4%), dairy (9.0%), peanuts (8.5%), tree nuts (8.5%), eggs (6.0%), grains (5.1%), and additives (4.7%). Ambiguous, generic concepts such as “nuts” and “seafood” accounted for 8.8% of the records. SNOMED-CT covered more concepts than UNII in terms of exact (81.7% vs 68.0%) and partial (14.3% vs 9.7%) matches. Discussion Adverse sensitivities to food are diverse, and existing standard terminologies have gaps in their coverage of the breadth of allergy concepts. Conclusion New strategies are needed to represent and standardize food adverse sensitivity concepts, to improve documentation in EHRs. PMID:26384406

  14. Food entries in a large allergy data repository.

    PubMed

    Plasek, Joseph M; Goss, Foster R; Lai, Kenneth H; Lau, Jason J; Seger, Diane L; Blumenthal, Kimberly G; Wickner, Paige G; Slight, Sarah P; Chang, Frank Y; Topaz, Maxim; Bates, David W; Zhou, Li

    2016-04-01

    Accurate food adverse sensitivity documentation in electronic health records (EHRs) is crucial to patient safety. This study examined, encoded, and grouped foods that caused any adverse sensitivity in a large allergy repository using natural language processing and standard terminologies. Using the Medical Text Extraction, Reasoning, and Mapping System (MTERMS), we processed both structured and free-text entries stored in an enterprise-wide allergy repository (Partners' Enterprise-wide Allergy Repository), normalized diverse food allergen terms into concepts, and encoded these concepts using the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED-CT) and Unique Ingredient Identifiers (UNII) terminologies. Concept coverage also was assessed for these two terminologies. We further categorized allergen concepts into groups and calculated the frequencies of these concepts by group. Finally, we conducted an external validation of MTERMS's performance when identifying food allergen terms, using a randomized sample from a different institution. We identified 158 552 food allergen records (2140 unique terms) in the Partners repository, corresponding to 672 food allergen concepts. High-frequency groups included shellfish (19.3%), fruits or vegetables (18.4%), dairy (9.0%), peanuts (8.5%), tree nuts (8.5%), eggs (6.0%), grains (5.1%), and additives (4.7%). Ambiguous, generic concepts such as "nuts" and "seafood" accounted for 8.8% of the records. SNOMED-CT covered more concepts than UNII in terms of exact (81.7% vs 68.0%) and partial (14.3% vs 9.7%) matches. Adverse sensitivities to food are diverse, and existing standard terminologies have gaps in their coverage of the breadth of allergy concepts. New strategies are needed to represent and standardize food adverse sensitivity concepts, to improve documentation in EHRs. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
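
    A minimal sketch of the normalize-encode-group pipeline described in the two records above: free-text allergen entries are normalized to concepts, concepts are mapped to a terminology code where coverage exists, and group frequencies are tallied. The synonym, code, and group tables are tiny illustrative stand-ins, not MTERMS resources:

        # Minimal sketch: normalize allergen terms, check code coverage, and
        # compute group frequencies.
        from collections import Counter

        SYNONYMS = {"peanuts": "peanut", "ground nut": "peanut"}
        CONCEPT_CODES = {"peanut": "SNOMED:<peanut-code>"}   # placeholder code
        CONCEPT_GROUPS = {"peanut": "peanuts", "shrimp": "shellfish"}

        entries = ["Peanuts", "ground nut", "SHRIMP", "shrimp"]

        concepts = [SYNONYMS.get(e.strip().lower(), e.strip().lower())
                    for e in entries]
        freq = Counter(CONCEPT_GROUPS.get(c, "other") for c in concepts)

        for group, n in freq.most_common():
            print(f"{group}: {100 * n / len(concepts):.1f}%")
        print("uncoded concepts:",
              [c for c in set(concepts) if c not in CONCEPT_CODES])

    The "uncoded concepts" list mirrors the papers' coverage analysis: terms that normalize cleanly may still lack an exact code in a given terminology.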

  15. Monte Carlo simulations for generic granite repository studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, Shaoping; Lee, Joon H; Wang, Yifeng

    In a collaborative study between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL) for the DOE-NE Office of Fuel Cycle Technologies Used Fuel Disposition (UFD) Campaign project, we have conducted preliminary system-level analyses to support the development of a long-term strategy for geologic disposal of high-level radioactive waste. A general modeling framework consisting of a near- and a far-field submodel for a granite GDSE was developed. A representative far-field transport model for a generic granite repository was merged with an integrated systems (GoldSim) near-field model. Integrated Monte Carlo model runs with the combined near- and far-field transport models were performed, and the parameter sensitivities were evaluated for the combined system. In addition, a subset of radionuclides that are potentially important to repository performance were identified and evaluated for a series of model runs. The analyses were conducted with different waste inventory scenarios. Analyses were also conducted for different repository radionuclide release scenarios. While the results to date are for a generic granite repository, the work establishes the method to be used in the future to provide guidance on the development of a strategy for long-term disposal of high-level radioactive waste in a granite repository.
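
    A minimal sketch of the integrated Monte Carlo idea: sample uncertain parameters, push each realization through a transport approximation, and inspect the spread of the output. The single-exponential transfer model, the parameter ranges, and the Tc-99-like half-life are assumptions for illustration only, not the LANL/SNL GoldSim model:

        # Minimal sketch: Monte Carlo over sampled repository parameters.
        import math
        import random

        random.seed(1)

        def peak_release(leach_rate, travel_time, half_life):
            # fraction surviving decay in transit, released at the leach rate
            return leach_rate * math.exp(-math.log(2) * travel_time / half_life)

        samples = []
        for _ in range(10_000):
            leach = 10 ** random.uniform(-6, -4)    # 1/yr, log-uniform
            travel = random.uniform(1e3, 1e5)       # yr, far-field travel time
            samples.append(peak_release(leach, travel, half_life=2.1e5))

        outputs = sorted(samples)
        print("median release metric:", outputs[len(outputs) // 2])
        print("95th percentile:", outputs[int(0.95 * len(outputs))])

    Parameter sensitivities can then be read off by correlating each sampled input with the output across realizations.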

  16. Genome resource banking for wildlife research, management, and conservation.

    PubMed

    Wildt, D E

    2000-01-01

    Cryobiology offers an important opportunity to assist in the management and study of wildlife, including endangered species. The benefits of developing genome resource banks for wildlife are profound, perhaps more so than for traditional uses in livestock and human fertility. In addition to preserving heterozygosity and assisting in the genetic management of rare populations held in captivity, frozen repositories help insure wild populations against natural and human-induced catastrophes. Such banks also are an invaluable source of new knowledge (for basic and applied research) from thousands of species that have yet to be studied. However, it is crucial that genome resource banks for wildlife species be developed in a coordinated fashion that first benefits the conservation of biodiversity. Spurious collections will be of no advantage to genuine conservation. The Conservation Breeding Specialist Group (CBSG, part of the International Union for the Conservation of Nature and Natural Resources' Species Survival Commission) has promoted international dialogue on this topic. CBSG working groups have recommended that such repositories be developed according to specific scientific guidelines consistent with an international standard that ensures practicality, high ethical standards, and cost-effectiveness. Areas requiring priority attention are also reviewed, including the need for more basic research, advocacy, and support for developing organized repositories of biomaterials representing the world's diverse biota.

  17. Modeling of irradiated graphite (14)C transfer through engineered barriers of a generic geological repository in crystalline rocks.

    PubMed

    Poskas, Povilas; Grigaliuniene, Dalia; Narkuniene, Asta; Kilda, Raimondas; Justinavicius, Darius

    2016-11-01

    There are two RBMK-1500 type graphite-moderated reactors at the Ignalina nuclear power plant in Lithuania, and they are now being decommissioned. The graphite cannot be disposed of in a near-surface repository because of its large (14)C content. Therefore, disposal of the graphite in a geological repository is a reasonable solution. This study presents an evaluation of (14)C transfer by the groundwater pathway into the geosphere from irradiated graphite in a generic geological repository in crystalline rocks, and demonstrates the role of the different components of the engineered barrier system through local sensitivity analysis. The speciation of the released (14)C into organic and inorganic compounds, as well as the most recent information on the (14)C source term, was taken into account. Two alternatives were considered in the analysis: disposal of graphite in containers with encapsulant and without it. The maximal fractional flux of inorganic (14)C into the geosphere was evaluated to vary from 10^-11 y^-1 (for non-encapsulated graphite) to 10^-12 y^-1 (for encapsulated graphite), while for organic (14)C it was about 10^-3 y^-1 of its inventory. Such a difference demonstrates that investigations of the (14)C inventory, and of the chemical form in which it is released, are especially important. The parameter with the highest influence on the maximal flux into the geosphere was the sorption coefficient in the backfill for inorganic (14)C transfer, and the backfill hydraulic conductivity for organic (14)C transfer. Copyright © 2016 Elsevier B.V. All rights reserved.
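
    A minimal sketch of a local (one-at-a-time) sensitivity calculation like the one described above: perturb one engineered-barrier parameter at a time and record the relative change in a peak-flux metric. The retarded-travel-time flux model is a simplified assumption, not the study's actual code:

        # Minimal sketch: local sensitivity of a peak-flux metric to barrier
        # parameters, using linear sorption retardation and radioactive decay.
        import math

        def peak_flux(kd, porosity=0.3, density=2000.0, travel_time=1e3,
                      half_life=5730.0):
            retardation = 1 + density * kd / porosity   # linear sorption
            arrival = travel_time * retardation         # retarded travel time
            return math.exp(-math.log(2) * arrival / half_life)

        base = peak_flux(kd=1e-3)
        cases = [("Kd +10%", {"kd": 1.1e-3}),
                 ("porosity +10%", {"kd": 1e-3, "porosity": 0.33})]
        for name, kwargs in cases:
            s = (peak_flux(**kwargs) - base) / base
            print(f"{name}: relative change in peak flux = {s:+.2%}")

    With these illustrative values, the sorption coefficient and porosity pull the peak flux in opposite directions, which is the kind of ranking a local sensitivity analysis produces.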

  18. Making Research Data Repositories Visible: The re3data.org Registry

    PubMed Central

    Pampel, Heinz; Vierkant, Paul; Scholze, Frank; Bertelmann, Roland; Kindling, Maxi; Klump, Jens; Goebelbecker, Hans-Jürgen; Gundlach, Jens; Schirmbacher, Peter; Dierolf, Uwe

    2013-01-01

    Researchers require infrastructures that ensure a maximum of accessibility, stability and reliability to facilitate working with and sharing of research data. Such infrastructures are being increasingly summarized under the term Research Data Repositories (RDR). The project re3data.org–Registry of Research Data Repositories–began indexing research data repositories in 2012 and offers researchers, funding organizations, libraries and publishers an overview of the heterogeneous research data repository landscape. As of July 2013, re3data.org lists 400 research data repositories and counting. 288 of these are described in detail using the re3data.org vocabulary. Information icons help researchers to easily identify an adequate repository for the storage and reuse of their data. This article describes the heterogeneous RDR landscape and presents a typology of institutional, disciplinary, multidisciplinary and project-specific RDR. Further, the article outlines the features of re3data.org and shows how this registry helps to identify appropriate repositories for the storage and search of research data. PMID:24223762

  19. 3D numerical modelling of the thermal state of deep geological nuclear waste repositories

    NASA Astrophysics Data System (ADS)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, Yu. N.

    2017-09-01

    One of the important aspects of high-level radioactive waste (HLW) disposal in deep geological repositories is ensuring the integrity of the engineered barriers, which is, among other phenomena, considerably influenced by thermal loads. As HLW produces a significant amount of heat, the design of the repository should maintain a balance between the cost-effectiveness of the construction and the sufficiency of the safety margins, including those imposed on the thermal conditions of the barriers. The 3D finite-element computer code FENIA was developed as a tool for simulation of thermal processes in deep geological repositories. Models for mechanical phenomena and groundwater hydraulics will be added later, resulting in a fully coupled thermo-hydro-mechanical (THM) solution. Long-term simulations of the thermal state were performed for two possible layouts of the repository: one based on the proposed design of the Russian repository, and another featuring a larger HLW amount within the same space. The obtained results describe the spatial and temporal evolution of the temperature field inside the repository and in the surrounding rock over 3500 years. They show that practically all generated heat is ultimately absorbed by the host rock without any significant temperature increase. Still, in the short term, even for the smaller HLW amount the temperature maximum exceeds 100 °C, and for the larger amount the local temperature remains above 100 °C for a considerable time. Thus, substantiating the long-term stability of the repository will require an extensive study of material properties and behaviour in order to remove excessive conservatism from the simulations and to reduce the uncertainty of the input data.
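
    The following is a minimal 1D sketch of the kind of transient conduction problem described above, using an explicit finite-difference scheme and an exponentially decaying heat source at the canister node. FENIA itself is a 3D finite-element code; the geometry, material properties, and source strength here are illustrative assumptions:

        # Minimal sketch: 1D transient heat conduction with a decaying source.
        import math

        n, dx = 100, 0.5                 # nodes, node spacing (m)
        alpha = 1.2e-6                   # rock thermal diffusivity (m^2/s)
        k = 2.5                          # thermal conductivity (W/(m K))
        dt = 0.4 * dx * dx / alpha       # stable explicit time step
        q0, tau = 500.0, 30 * 3.15e7     # W/m^3 at node 0, ~30 yr decay time

        T = [0.0] * n                    # temperature rise above ambient (K)
        t = 0.0
        while t < 100 * 3.15e7:          # simulate 100 years
            q = q0 * math.exp(-t / tau)  # decaying heat generation
            Tn = T[:]
            for i in range(1, n - 1):    # interior diffusion update
                Tn[i] = T[i] + alpha * dt / dx**2 * (T[i+1] - 2*T[i] + T[i-1])
            Tn[0] = Tn[1] + q * dx * dx / (2 * k)   # heated boundary node
            T, t = Tn, t + dt            # far boundary T[n-1] held at ambient

        peak = max(T)
        print(f"peak temperature rise after 100 yr: {peak:.1f} K "
              f"at node {T.index(peak)}")

    The factor 0.4 keeps the explicit scheme below its stability limit of alpha*dt/dx^2 <= 0.5; a production THM code would use an implicit or finite-element formulation instead.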

  20. 17 CFR 49.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... data repository. (10) Position. The term “position” means the gross and net notional amounts of open... Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION (CONTINUED) SWAP DATA REPOSITORIES... directly, or indirectly, controls, is controlled by, or is under common control with, the swap data...

  1. NATIONAL GEOSCIENCE DATA REPOSITORY SYSTEM PHASE III: IMPLEMENTATION AND OPERATION ON THE REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcus Milling

    2001-10-01

    The NGDRS has attained 72% of its targeted goal for core and cuttings transfers; over 12 million linear feet of cores and cuttings, in addition to large numbers of paleontological samples, are now available for public use. Additionally, large-scale transfers of seismic data have been evaluated, but based on the recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of the seismic data problem relative to the available funding. The rapidly changing industry conditions have required that the primary core and cuttings preservation strategy evolve as well. Additionally, the NGDRS clearinghouse is evaluating the viability of transferring seismic data covering the western shelf of the Florida Gulf Coast. AGI remained actively involved in assisting the National Research Council with background materials and presentations for their panel convened to study the data preservation issue. A final report of the panel is expected in early 2002. GeoTrek has been ported to Linux and MySQL, ensuring a purely open-source version of the software. This effort is key in ensuring the long-term viability of the software so that it can continue basic operation regardless of specific funding levels. Work has commenced on a major revision of GeoTrek, using the open-source MapServer project and its related MapScript language. This effort will address a number of key technology issues that appear to be arising for 2002, including the discontinuation of the use of Java in future Microsoft operating systems. Discussions have been held regarding establishing potential new public data repositories, with hope for final determination in 2002.

  2. NATIONAL GEOSCIENCE DATA REPOSITORY SYSTEM PHASE III: IMPLEMENTATION AND OPERATION OF THE REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcus Milling

    2003-04-01

    The NGDRS has facilitated the transfer to the public sector of 85% of the cores, cuttings, and other data identified as available. Over 12 million linear feet of cores and cuttings, in addition to large numbers of paleontological samples, are now available for public use. To date, with industry contributions for program operations and data transfers, the NGDRS project has realized a 6.5 to 1 return on investment to Department of Energy funds. Large-scale transfers of seismic data have been evaluated, but based on the recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of the seismic data problem relative to the available funding. The rapidly changing industry conditions have required that the primary core and cuttings preservation strategy evolve as well. Additionally, the NGDRS clearinghouse is evaluating the viability of transferring seismic data covering the western shelf of the Florida Gulf Coast. AGI remains actively involved in working to realize the vision of the National Research Council's report on geoscience data preservation. GeoTrek has been ported to Linux and MySQL, ensuring a purely open-source version of the software. This effort is key in ensuring the long-term viability of the software so that it can continue basic operation regardless of specific funding levels. Work has commenced on a major revision of GeoTrek, using the open-source MapServer project and its related MapScript language. This effort will address a number of key technology issues that appear to be arising for 2002, including the discontinuation of the use of Java in future Microsoft operating systems. Discussions have been held regarding establishing potential new public data repositories, with hope for final determination in 2002.

  3. NATIONAL GEOSCIENCE DATA REPOSITORY SYSTEM PHASE III: IMPLEMENTATION AND OPERATION OF THE REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcus Milling

    2002-10-01

    The NGDRS has facilitated the transfer to the public sector of 85% of the cores, cuttings, and other data identified as available. Over 12 million linear feet of cores and cuttings, in addition to large numbers of paleontological samples, are now available for public use. To date, with industry contributions for program operations and data transfers, the NGDRS project has realized a 6.5 to 1 return on investment to Department of Energy funds. Large-scale transfers of seismic data have been evaluated, but based on the recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of the seismic data problem relative to the available funding. The rapidly changing industry conditions have required that the primary core and cuttings preservation strategy evolve as well. Additionally, the NGDRS clearinghouse is evaluating the viability of transferring seismic data covering the western shelf of the Florida Gulf Coast. AGI remains actively involved in working to realize the vision of the National Research Council's report on geoscience data preservation. GeoTrek has been ported to Linux and MySQL, ensuring a purely open-source version of the software. This effort is key in ensuring the long-term viability of the software so that it can continue basic operation regardless of specific funding levels. Work has commenced on a major revision of GeoTrek, using the open-source MapServer project and its related MapScript language. This effort will address a number of key technology issues that appear to be arising for 2002, including the discontinuation of the use of Java in future Microsoft operating systems. Discussions have been held regarding establishing potential new public data repositories, with hope for final determination in 2002.

  4. SNOMED CT module-driven clinical archetype management.

    PubMed

    Allones, J L; Taboada, M; Martinez, D; Lozano, R; Sobrido, M J

    2013-06-01

    To explore semantic search to improve management and user navigation in clinical archetype repositories. In order to support semantic searches across archetypes, an automated method based on SNOMED CT modularization is implemented to transform clinical archetypes into SNOMED CT extracts. Concurrently, query terms are converted into SNOMED CT concepts using the search engine Lucene. Retrieval is then carried out by matching query concepts with the corresponding SNOMED CT segments. A test collection of 16 clinical archetypes, comprising over 250 terms, and a subset of 55 clinical terms from two medical dictionaries, MediLexicon and MedlinePlus, were used to test our method. The keyword-based service supported by the OpenEHR repository offered a benchmark against which to evaluate the improvement in performance. In total, our approach reached 97.4% precision and 69.1% recall, a substantial improvement in recall (more than 70%) compared to the benchmark. Exploiting medical domain knowledge from ontologies such as SNOMED CT may overcome some limitations of keyword-based systems and thus improve the search experience of repository users. An automated approach based on ontology segmentation is an efficient and feasible way to support modeling, management and user navigation in clinical archetype repositories. Copyright © 2013 Elsevier Inc. All rights reserved.
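
    A minimal sketch of the retrieval step and its evaluation: each archetype is represented as a set of SNOMED CT concepts (its module extract), a query concept matches any archetype whose extract contains it, and precision/recall are computed against a gold standard. The concept identifiers and the gold standard are illustrative assumptions, not the paper's test collection:

        # Minimal sketch: concept-based archetype retrieval with a
        # precision/recall check.

        archetypes = {                         # illustrative concept extracts
            "blood_pressure": {"75367002", "271649006"},
            "body_weight": {"27113001"},
        }

        def search(query_concepts):
            return {name for name, extract in archetypes.items()
                    if query_concepts & extract}

        retrieved = search({"271649006"})      # query already mapped to a concept
        relevant = {"blood_pressure"}          # gold standard for this query
        precision = len(retrieved & relevant) / len(retrieved)
        recall = len(retrieved & relevant) / len(relevant)
        print(retrieved, f"precision={precision:.2f}", f"recall={recall:.2f}")

    Matching on concept extracts rather than literal keywords is what lets a query phrased with a synonym still reach the right archetype.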

  5. Lingering radioactivity at the Bikini and Enewetak Atolls.

    PubMed

    Buesseler, Ken O; Charette, Matthew A; Pike, Steven M; Henderson, Paul B; Kipp, Lauren E

    2018-04-15

    We made an assessment of the levels of radionuclides in the ocean waters, seafloor and groundwater at Bikini and Enewetak Atolls, where the US conducted nuclear weapons tests in the 1940s and 50s. This included the first estimates of submarine groundwater discharge (SGD) derived from radium isotopes, which can be used here to calculate radionuclide fluxes into the lagoon waters. While there is significant variability between sites and sample types, levels of plutonium (239,240Pu) remain several orders of magnitude higher in lagoon seawater and sediments than what is found in the rest of the world's oceans. In contrast, levels of cesium-137 (137Cs), while relatively elevated in brackish groundwater, are only slightly higher in the lagoon water relative to North Pacific surface waters. Of special interest was the Runit dome, a nuclear waste repository created in the 1970s within the Enewetak Atoll. Low seawater ratios of 240Pu/239Pu suggest that this area is the source of about half of the Pu in the Enewetak lagoon water column, yet radium isotopes suggest that SGD from below the dome is not a significant Pu source. SGD fluxes of Pu and Cs at Bikini were also relatively low. Thus radioactivity associated with seafloor sediments remains the largest source and long-term repository for radioactive contamination. Overall, Bikini and Enewetak Atolls are an ongoing source of Pu and Cs to the North Pacific, but at annual rates that are orders of magnitude smaller than those delivered via close-in fallout to the same area. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
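
    A minimal sketch of the flux bookkeeping implied above: a radium-derived SGD estimate times a groundwater radionuclide concentration gives a flux into the lagoon, which can then be compared with other source terms. All numbers are placeholders, not the study's measured values:

        # Minimal sketch: SGD-derived radionuclide flux into a lagoon.

        sgd_m3_per_day = 1.0e4                     # SGD estimate (placeholder)
        conc_Bq_per_m3 = {                         # groundwater concentrations
            "239,240Pu": 0.05,                     # (placeholders)
            "137Cs": 2.0,
        }

        for nuclide, c in conc_Bq_per_m3.items():
            flux = sgd_m3_per_day * c * 365.0      # Bq per year into the lagoon
            print(f"{nuclide}: {flux:.2e} Bq/yr via SGD")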

  6. Disposal of disused sealed radiation sources in Boreholes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vicente, R.

    2007-07-01

    This paper gives a description of the concept of a geological repository for disposal of disused sealed radiation sources (DSRS) under development at the Institute of Energy and Nuclear Research (IPEN), in Brazil. DSRS represent a significant fraction of the total activity of radioactive wastes to be managed. Most DSRS are collected and temporarily stored at IPEN. As of 2006, the total collected activity is 800 TBq in 7,508 industrial gauge or radiotherapy sources, 7.2 TBq in about 72,000 Americium-241 sources detached from lightning rods, and about 0.5 GBq in 20,857 sources from smoke detectors. The estimated inventory of sealed sources in the country is about 270,000 sources with a total activity of 26 PBq. The proposed repository is designed to receive the total inventory of sealed sources. A description of the pre-disposal facilities at IPEN is also presented. (authors)

  7. High Integrity Can Design Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaber, E.L.

    1998-08-01

    The National Spent Nuclear Fuel Program is chartered with facilitating the disposition of DOE-owned spent nuclear fuel (SNF) to allow disposal at a geologic repository. This is done through coordination with the repository program and by assisting DOE Site owners of SNF with needed information, standardized requirements, packaging approaches, etc. The High Integrity Can (HIC) will be manufactured to provide a substitute or barrier enhancement for normal fuel geometry and cladding. The can would be nested inside the DOE standardized canister, which is designed to interface with the repository waste package. The HIC approach may provide the following benefits over typical canning approaches for DOE SNF. (a) It allows ready calculation and management of criticality issues for miscellaneous SNF. (b) It segments and further isolates damaged or otherwise problem materials from normal SNF in the repository package. (c) It provides a very long term corrosion barrier. (d) It provides an extra internal pressure barrier for particulates, gaseous fission products, hydrogen, and water vapor. (e) It delays any potential release of fission products to the repository environment. (f) It maintains an additional level of fuel geometry control during design basis accidents, rock-fall, and seismic events. (g) When seal welded, it could provide the additional containment required for shipments involving plutonium content in excess of 20 Ci (10 CFR 71.63.b) if integrated with an appropriate cask design. Long term corrosion protection is central to the HIC concept. The material selected for the HIC (Hastelloy C-22) has undergone extensive testing for repository service. The most severe theoretical interactions between iron, repository water containing chlorides, and other repository construction materials have been tested. These expected chemical species have not been shown capable of corroding the selected HIC material. Therefore, the HIC should provide a significant barrier to DOE SNF dispersal long after most commercial SNF has degraded and begun moving into the repository environment.

  8. Determination of Uncertainties for +III and +IV Actinide Solubilities in the WIPP Geochemistry Model for the 2009 Compliance Recertification Application

    NASA Astrophysics Data System (ADS)

    Ismail, A. E.; Xiong, Y.; Nowak, E. J.; Brush, L. H.

    2009-12-01

    The Waste Isolation Pilot Plant (WIPP) is a U.S. Department of Energy (DOE) repository in southeast New Mexico for defense-related transuranic (TRU) waste. Every five years, the DOE is required to submit an application to the Environmental Protection Agency (EPA) demonstrating the WIPP’s continuing compliance with the applicable EPA regulations governing the repository. Part of this recertification effort involves a performance assessment: a probabilistic evaluation of the repository performance with respect to regulatory limits on the amount of releases from the repository to the accessible environment. One of the models used as part of the performance assessment process is a geochemistry model, which predicts solubilities of the radionuclides in the brines that may enter the repository in the different scenarios considered by the performance assessment. The dissolved actinide source term comprises actinide solubilities, which are input parameters for modeling the transport of radionuclides as a result of brine flow through and from the repository. During a performance assessment, the solubilities are modeled as the product of a “base” solubility determined from calculations based on the chemical conditions expected in the repository, and an uncertainty factor that describes the potential deviations of the model from expected behavior. We will focus here on a discussion of the uncertainties. To compute a cumulative distribution function (CDF) for the uncertainties, we compare published, experimentally measured solubility data to predictions made using the established WIPP geochemistry model. The differences between the solubilities observed for a given experiment and the calculated solubilities from the model are used to form the overall CDF, which is then sampled as part of the performance assessment. We will discuss the methodology used to update the CDFs for the +III actinides, obtained from data for Nd, Am, and Cm, and the +IV actinides, obtained from data for Th, and present results for the calculations of the updated CDFs. We compare the CDFs to the distributions computed for the previous recertification, and discuss the potential impact of the changes on the geochemistry model. This research is funded by WIPP programs administered by the U.S. Department of Energy. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.
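
    The uncertainty-factor construction can be sketched as follows, with invented numbers and an assumed (log-difference) form purely for illustration:

    ```python
    # Sketch of the uncertainty-factor idea: build an empirical CDF from
    # differences between measured and modeled solubilities, then sample it
    # during performance assessment. All data values here are invented.
    import numpy as np

    rng = np.random.default_rng(42)

    # log10(observed / modeled) for a set of published solubility experiments
    # (hypothetical numbers for illustration only).
    log_diffs = np.array([-0.8, -0.3, -0.1, 0.0, 0.2, 0.4, 0.9])

    def sample_uncertainty_factor(n):
        """Sample multiplicative uncertainty factors from the empirical CDF
        by inverting it at uniform random probabilities (with interpolation)."""
        u = rng.uniform(0.0, 1.0, size=n)
        quantiles = np.quantile(log_diffs, u)  # inverse empirical CDF
        return 10.0 ** quantiles

    base_solubility = 5.64e-8  # M, e.g. a +IV base value from the abstract
    sampled = base_solubility * sample_uncertainty_factor(5)
    print(sampled)
    ```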

  9. DSpace and customized controlled vocabularies

    NASA Astrophysics Data System (ADS)

    Skourlas, C.; Tsolakidis, A.; Kakoulidis, P.; Giannakopoulos, G.

    2015-02-01

    The open source platform DSpace could be defined as a repository application used to provide access to digital resources. DSpace is installed and used by more than 1000 organizations worldwide. A predefined taxonomy of keywords, called a Controlled Vocabulary, can be used for describing and accessing the information items stored in the repository. In this paper, we describe how users can create and customize their own vocabularies. Various heterogeneous items, such as research papers, videos, articles and educational material of the repository, can be indexed in order to provide advanced search functionality using new controlled vocabularies.
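
    As an illustration of the underlying idea, the sketch below (plain Python, not DSpace's actual vocabulary file format or API) models a hierarchical vocabulary and expands a broad term to its narrower terms, which is what makes indexing and advanced search against a taxonomy possible.

    ```python
    # Illustrative sketch only: a hierarchical controlled vocabulary as a
    # nested dict, with query expansion to narrower terms.
    vocabulary = {
        "Engineering": {
            "Computer Science": {"Databases": {}, "Information Retrieval": {}},
            "Electrical Engineering": {},
        }
    }

    def narrower_terms(tree, term):
        """Return the term plus all of its descendants, or an empty set."""
        for label, children in tree.items():
            if label == term:
                out = {label}
                stack = [children]
                while stack:
                    node = stack.pop()
                    for child_label, grandchildren in node.items():
                        out.add(child_label)
                        stack.append(grandchildren)
                return out
            found = narrower_terms(children, term)
            if found:
                return found
        return set()

    # Searching on a broad term matches items indexed with any narrower term.
    print(narrower_terms(vocabulary, "Computer Science"))
    ```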

  10. Thermoelastic analysis of spent fuel and high level radioactive waste repositories in salt. A semi-analytical solution. [JUDITH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    St. John, C.M.

    1977-04-01

    An underground repository containing heat-generating High Level Waste or Spent Unreprocessed Fuel may be approximated as a finite number of heat sources distributed across the plane of the repository. The resulting temperature, displacement and stress changes may be calculated using analytical solutions, provided linear thermoelasticity is assumed. This report documents a computer program based on this approach and gives results that form the basis for a comparison between the effects of disposing of High Level Waste and Spent Unreprocessed Fuel.
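
    The analytical building block behind such superposition schemes is the classical continuous point-source conduction solution (Carslaw and Jaeger); a standard form, not necessarily the exact kernel used by the program documented here, is:

    ```latex
    % Temperature rise at distance r_i from source i of constant power q_i in
    % an infinite medium of conductivity k and thermal diffusivity \alpha.
    % Time-decaying sources are handled by convolving this kernel in time.
    \Delta T(\mathbf{x}, t) = \sum_i \frac{q_i}{4 \pi k \, r_i}
      \,\mathrm{erfc}\!\left(\frac{r_i}{2\sqrt{\alpha t}}\right)
    ```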

  11. 17 CFR 49.22 - Chief compliance officer.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ....22 Section 49.22 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP DATA... 49, the term “board of directors” means the board of directors of a registered swap data repository, or for those swap data repositories whose organizational structure does not include a board of...

  12. 17 CFR 49.22 - Chief compliance officer.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ....22 Section 49.22 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP DATA... 49, the term “board of directors” means the board of directors of a registered swap data repository, or for those swap data repositories whose organizational structure does not include a board of...

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huff, Kathryn D.

    Component level and system level abstraction of detailed computational geologic repository models have resulted in four rapid computational models of hydrologic radionuclide transport at varying levels of detail. Those models are described, as is their implementation in Cyder, a software library of interchangeable radionuclide transport models appropriate for representing natural and engineered barrier components of generic geology repository concepts. A proof of principle demonstration was also conducted in which these models were used to represent the natural and engineered barrier components of a repository concept in a reducing, homogeneous, generic geology. This base case demonstrates integration of the Cyder open source library with the Cyclus computational fuel cycle systems analysis platform to facilitate calculation of repository performance metrics with respect to fuel cycle choices. (authors)
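
    The design idea of interchangeable radionuclide transport models can be sketched as a common interface with swappable implementations; the names and physics simplifications below are hypothetical, not Cyder's actual API.

    ```python
    # Hedged sketch: two release models at different levels of detail behind
    # one interface, so a barrier component can swap models freely.
    from abc import ABC, abstractmethod

    class NuclideModel(ABC):
        @abstractmethod
        def release_rate(self, inventory_mol: float, t_yr: float) -> float:
            """Radionuclide release rate [mol/yr] at time t."""

    class CongruentDegradation(NuclideModel):
        def __init__(self, frac_per_yr: float):
            self.frac = frac_per_yr
        def release_rate(self, inventory_mol, t_yr):
            return self.frac * inventory_mol  # matrix degrades at a fixed rate

    class SolubilityLimited(NuclideModel):
        def __init__(self, solubility_mol_m3: float, water_flux_m3_yr: float):
            self.sol, self.flux = solubility_mol_m3, water_flux_m3_yr
        def release_rate(self, inventory_mol, t_yr):
            # release capped by how much dissolved nuclide the water can carry
            return min(inventory_mol, self.sol * self.flux)

    for model in (CongruentDegradation(1e-5), SolubilityLimited(1e-4, 10.0)):
        print(type(model).__name__, model.release_rate(100.0, 0.0))
    ```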

  14. The Community as a Source of Pragmatic Input for Learners of Italian: The Multimedia Repository LIRA

    ERIC Educational Resources Information Center

    Zanoni, Greta

    2016-01-01

    This paper focuses on community participation within the LIRA project--Lingua/Cultura Italiana in Rete per l'Apprendimento (Italian language and culture for online learning). LIRA is a multimedia repository of e-learning materials aiming at recovering, preserving and developing the linguistic, pragmatic and cultural competences of second and third…

  15. Use of Digital Repositories by Chemistry Researchers: Results of a Survey

    ERIC Educational Resources Information Center

    Polydoratou, Panayiota

    2007-01-01

    Purpose: This paper aims to present findings from a survey that aimed to identify the issues around the use and linkage of source and output repositories and the chemistry researchers' expectations about their use. Design/methodology/approach: This survey was performed by means of an online questionnaire and structured interviews with academic and…

  16. Linking Big and Small Data Across the Social, Engineering, and Earth Sciences

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; de Sherbinin, A. M.; Levy, M. A.; Downs, R. R.

    2014-12-01

    The challenges of sustainable development cut across the social, health, ecological, engineering, and Earth sciences, across a wide range of spatial and temporal scales, and across the spectrum from basic to applied research and decision making. The rapidly increasing availability of data and information in digital form from a variety of data repositories, networks, and other sources provides new opportunities to link and integrate both traditional data holdings as well as emerging "big data" resources in ways that enable interdisciplinary research and facilitate the use of objective scientific data and information in society. Taking advantage of these opportunities not only requires improved technical and scientific data interoperability across disciplines, scales, and data types, but also concerted efforts to bridge gaps and barriers between key communities, institutions, and networks. Given the long time perspectives required in planning sustainable approaches to development, it is also imperative to address user requirements for long-term data continuity and stewardship by trustworthy repositories. We report here on lessons learned by CIESIN working on a range of sustainable development issues to integrate data across multiple repositories and networks. This includes CIESIN's roles in developing policy-relevant climate and environmental indicators, soil data for African agriculture, and exposure and risk measures for hazards, disease, and conflict, as well as CIESIN's participation in a range of national and international initiatives related both to sustainable development and to open data access, interoperability, and stewardship.

  17. Finite element code FENIA verification and application for 3D modelling of thermal state of radioactive waste deep geological repository

    NASA Astrophysics Data System (ADS)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, U. N.

    2017-11-01

    The verification of the FENIA finite element code on some problems and an example of its application are presented in the paper. The code is being developed for 3D modelling of thermal, mechanical and hydrodynamical (THM) problems related to the functioning of deep geological repositories. Verification of the code for two analytical problems has been performed. The first is a point heat source with exponentially decreasing heat output; the second is a linear heat source with similar behavior. Analytical solutions have been obtained by the authors. The problems have been chosen because they reflect the processes influencing the thermal state of a deep geological repository of radioactive waste. Verification was performed for several meshes with different resolution. Good convergence between analytical and numerical solutions was achieved. The application of the FENIA code is illustrated by 3D modelling of the thermal state of a prototypic deep geological repository of radioactive waste. The repository is designed for disposal of radioactive waste in rock at a depth of several hundred meters with no intention of later retrieval. Vitrified radioactive waste is placed in containers, which are placed in vertical boreholes. The residual decay heat of the radioactive waste leads to heating of the containers, engineered safety barriers and host rock. Maximum temperatures and the corresponding times at which they are established have been determined.
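
    This kind of benchmark is easy to reproduce numerically. The Python sketch below integrates the standard instantaneous point-source kernel for an exponentially decaying source and checks the constant-source limit against its closed form; the material parameters are assumed values, and this is not FENIA's actual test definition.

    ```python
    # Point source with power q0*exp(-beta*t') in an infinite medium: the
    # temperature rise is a time convolution of the instantaneous kernel.
    import math
    from scipy.integrate import quad

    rho_c = 2.0e6      # volumetric heat capacity [J/(m^3 K)] (assumed value)
    alpha = 1.5e-6     # thermal diffusivity [m^2/s] (assumed value)
    k = rho_c * alpha  # thermal conductivity [W/(m K)]
    q0 = 1000.0        # initial source power [W] (assumed value)

    def dT(r, t, beta):
        """Temperature rise via convolution of the point-source kernel."""
        def kernel(tp):
            tau = t - tp
            return (q0 * math.exp(-beta * tp) / rho_c
                    * math.exp(-r ** 2 / (4 * alpha * tau))
                    / (4 * math.pi * alpha * tau) ** 1.5)
        return quad(kernel, 0.0, t)[0]

    # Verification idea: for beta = 0 the convolution must match the
    # closed-form solution q0/(4*pi*k*r) * erfc(r / (2*sqrt(alpha*t))).
    r, t = 5.0, 3.15e7  # 5 m from the source after ~1 year
    closed = q0 / (4 * math.pi * k * r) * math.erfc(r / (2 * math.sqrt(alpha * t)))
    print(dT(r, t, beta=0.0), closed)          # should agree closely
    print(dT(r, t, beta=math.log(2) / 9.4e8))  # decaying source, ~30 yr half-life
    ```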

  18. DataUp: Helping manage and archive data within the researcher's workflow

    NASA Astrophysics Data System (ADS)

    Strasser, C.

    2012-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant is a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. We have developed an open-source add-in for Excel and an open-source web application intended to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. The researcher does not need a prior relationship with a data repository to use DataUp; the newly implemented ONEShare repository, a DataONE member node, is available for any researcher to archive and share their data. By meeting researchers where they already work, in spreadsheets, DataUp becomes part of the researcher's workflow, and data management and sharing become easier. Future enhancement of DataUp will rely on members of the community adopting and adapting the DataUp tools to meet their unique needs, including connecting to analytical tools, adding new metadata schemas, and expanding the list of connected data repositories. DataUp is a collaborative project between Microsoft Research Connections, the University of California's California Digital Library, the Gordon and Betty Moore Foundation, and DataONE.
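
    A minimal sketch of the first two steps, assuming nothing about DataUp's actual implementation: a heuristic CSV-compatibility check with Python's standard csv module, plus a stub of the minimal metadata one might capture before deposit.

    ```python
    # Illustrative sketch only (not the DataUp code).
    import csv
    import io

    def is_csv_compatible(text: str) -> bool:
        """Heuristic: a dialect can be sniffed and all rows have equal width."""
        try:
            dialect = csv.Sniffer().sniff(text)
        except csv.Error:
            return False
        rows = list(csv.reader(io.StringIO(text), dialect))
        return len(rows) > 1 and len({len(r) for r in rows}) == 1

    sample = "site,date,temp_c\nA,2012-06-01,14.2\nB,2012-06-02,15.1\n"
    print(is_csv_compatible(sample))

    # Minimal metadata one might capture before deposit (fields illustrative;
    # the DOI below is a placeholder, not a real identifier).
    metadata = {"title": "Soil temperature, sites A-B",
                "creator": "J. Researcher",
                "identifier": "doi:10.xxxx/example"}
    ```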

  19. Design and Development of an Institutional Repository at the Indian Institute of Technology Kharagpur

    ERIC Educational Resources Information Center

    Sutradhar, B.

    2006-01-01

    Purpose: To describe how an institutional repository (IR) was set up, using open source software, at the Indian Institute of Technology (IIT) in Kharagpur. Members of the IIT can publish their research documents in the IR for online access as well as digital preservation. Material in this IR includes instructional materials, records, data sets,…

  20. Semantic Interoperability Almost Without Using The Same Vocabulary: Is It Possible?

    NASA Astrophysics Data System (ADS)

    Krisnadhi, A. A.

    2016-12-01

    Semantic interoperability, which is a key requirement in realizing cross-repository data integration, is often understood as using the same ontology or vocabulary. Consequently, within a particular domain, one can easily assume that there has to be one unifying domain ontology covering as many vocabulary terms in the domain as possible in order to realize any form of data integration across multiple data sources. Furthermore, the desire to provide very precise definitions of those many terms led to the development of huge foundational and domain ontologies that are comprehensive, but too complicated, restrictive, monolithic, and difficult to use and reuse, which causes common data providers to avoid using them. This problem is especially acute in a domain as diverse as the geosciences, as it is virtually impossible to reach agreement on the semantics of many terms (e.g., there are hundreds of definitions of forest used throughout the world). To overcome this challenge, a modular ontology architecture has emerged in recent years, fueled, among other things, by advances in ontology design pattern research. Each ontology pattern models only one key notion. It can act as a small module of a larger ontology. Such a module is developed in such a way that it is largely independent of how other notions in the same domain are modeled. This leads to increased reusability. Furthermore, an ontology formed out of such modules has improved understandability over large, monolithic ontologies. Semantic interoperability in this architecture is not achieved by enforcing the use of the same vocabulary, but rather by promoting alignment to the same ontology patterns. In this work, we elaborate on how this architecture realizes the above idea. In particular, we describe how multiple data sources with differing perspectives and vocabularies can interoperate through this architecture. Building the solution upon semantic technologies such as Linked Data and the Web Ontology Language (OWL), we demonstrate how a data integration solution based on this idea can be realized over different data repositories.
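
    A small example of pattern-based alignment (with hypothetical URIs throughout) can be given in Python with rdflib: two providers use different class names, each is aligned to a shared pattern class, and one query over the pattern spans both sources.

    ```python
    # Sketch: align differing vocabularies to a shared ontology pattern.
    from rdflib import Graph, Namespace, RDF, RDFS

    PAT = Namespace("http://example.org/pattern/")  # shared pattern (assumed)
    A = Namespace("http://example.org/sourceA/")
    B = Namespace("http://example.org/sourceB/")

    g = Graph()
    g.add((A.Forest, RDFS.subClassOf, PAT.VegetatedLandCover))
    g.add((B.WoodedArea, RDFS.subClassOf, PAT.VegetatedLandCover))
    g.add((A.plot1, RDF.type, A.Forest))
    g.add((B.site9, RDF.type, B.WoodedArea))

    # One SPARQL query over the shared pattern retrieves data from both sources.
    q = """
    PREFIX pat: <http://example.org/pattern/>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?x WHERE { ?x a ?cls . ?cls rdfs:subClassOf* pat:VegetatedLandCover . }
    """
    for row in g.query(q):
        print(row.x)
    ```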

  1. Revision history aware repositories of computational models of biological systems.

    PubMed

    Miller, Andrew K; Yu, Tommy; Britten, Randall; Cooling, Mike T; Lawson, James; Cowan, Dougal; Garny, Alan; Halstead, Matt D B; Hunter, Peter J; Nickerson, David P; Nunns, Geo; Wimalaratne, Sarala M; Nielsen, Poul M F

    2011-01-14

    Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. We have extended the Physiome Model Repository software to be fully revision history aware, by building it on top of Mercurial, an existing DVCS. We have demonstrated the utility of this approach, when used in conjunction with the model composition facilities in CellML, to build and understand more complex models. We have also demonstrated the ability of the repository software to present version history to casual users over the web, and to highlight specific versions which are likely to be useful to users. Providing facilities for maintaining and using revision history information is an important part of building a useful repository of computational models, as this information is useful both for understanding the source of and justification for parts of a model, and to facilitate automated processes such as merges. The availability of fully revision history aware repositories, and associated tools, will therefore be of significant benefit to the community.
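
    A minimal sketch of the underlying idea, driving Mercurial's command-line interface directly (the repository layout and file names are invented; this is not the Physiome Model Repository code): two groups modify an ancestral model, and the merge is recorded as a changeset with two parents rather than as an opaque new version.

    ```python
    # Sketch: record a model merge as a true DVCS merge changeset.
    import subprocess, pathlib

    def hg(*args, cwd="model-repo"):
        subprocess.run(["hg", *args], cwd=cwd, check=True)

    pathlib.Path("model-repo").mkdir(exist_ok=True)
    hg("init")

    pathlib.Path("model-repo/model.cellml").write_text("<model/>  <!-- v1 -->")
    hg("commit", "-A", "-m", "ancestral model")

    # Two groups modify the ancestral model independently...
    pathlib.Path("model-repo/model.cellml").write_text("<model/>  <!-- group A -->")
    hg("commit", "-m", "group A variant")
    hg("update", "-r", "0")
    pathlib.Path("model-repo/variant_b.cellml").write_text("<model/>  <!-- group B -->")
    hg("commit", "-A", "-m", "group B variant")

    # ...and the merge changeset preserves both lines of history.
    hg("merge")
    hg("commit", "-m", "merge group A changes into group B model")
    hg("log", "--graph")
    ```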

  2. One-loop effective actions and higher spins. Part II

    NASA Astrophysics Data System (ADS)

    Bonora, L.; Cvitan, M.; Prester, P. Dominis; Giaccari, S.; Štemberga, T.

    2018-01-01

    In this paper we continue and improve the analysis of the effective actions obtained by integrating out a scalar and a fermion field coupled to external symmetric sources, started in the previous paper. The first subject we study is the geometrization of the results obtained there, that is we express them in terms of covariant Jacobi tensors. The second subject concerns the treatment of tadpoles and seagull terms in order to implement off-shell covariance in the initial model. The last and by far largest part of the paper is a repository of results concerning all two point correlators (including mixed ones) of symmetric currents of any spin up to 5 and in any dimensions between 3 and 6. In the massless case we also provide formulas for any spin in any dimension.

  3. IMPLEMENTATION AND OPERATION OF THE REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcus Milling

    2003-10-01

    The NGDRS has facilitated the transfer to the public sector of 85% of the cores, cuttings, and other data identified as available for transfer. Over 12 million linear feet of cores and cuttings, in addition to large numbers of paleontological samples, are now available for public use. To date, with industry contributions for program operations and data transfers, the NGDRS project has realized a 6.5 to 1 return on investment to Department of Energy funds. Large-scale transfers of seismic data have been evaluated, but based on the recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of the seismic data problem relative to the available funding. The rapidly changing industry conditions have required that the primary core and cuttings preservation strategy evolve as well. Additionally, the NGDRS clearinghouse is evaluating the viability of transferring seismic data covering the western shelf of the Florida Gulf Coast. AGI remains actively involved in working to realize the vision of the National Research Council's report on geoscience data preservation. GeoTrek has been ported to Linux and MySQL, ensuring a purely open-source version of the software. This effort is key in ensuring the long-term viability of the software so that it can continue basic operation regardless of specific funding levels. Work has continued on a major revision of GeoTrek, using the open-source MapServer project and its related MapScript language. This effort will address a number of key technology issues that appear to be arising for 2003, including the discontinuation of the use of Java in future Microsoft operating systems. The recent donation of BPAmoco's Houston core facility to the Texas Bureau of Economic Geology has provided substantial short-term relief of the space constraints for public repository space.

  4. YUCCA MOUNTAIN PROJECT - A BRIEFING --

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NA

    2003-08-05

    This report has the following articles: Nuclear waste--a long-term national problem; Spent nuclear fuel; High-level radioactive waste; Radioactivity and the environment; Current storage methods; Disposal options; U.S. policy on nuclear waste; The focus on Yucca Mountain; The purpose and scope of the Yucca Mountain Project; The approach for permanently disposing of waste; The scientific studies at Yucca Mountain; The proposed design for a repository at Yucca Mountain; Natural and engineered barriers would work together to isolate waste; Meticulous science and technology to protect people and the environment; Licensing a repository; Transporting waste to a permanent repository; The Environmental Impact Statement for a repository; Current status of the Yucca Mountain Project; and Further information available on the Internet.

  5. Criteria for the evaluation and certification of long-term digital archives in the earth sciences

    NASA Astrophysics Data System (ADS)

    Klump, Jens

    2010-05-01

    Digital information has become an indispensable part of our cultural and scientific heritage. Scientific findings, historical documents and cultural achievements are to a rapidly increasing extent being presented in electronic form - in many cases exclusively so. However, besides the invaluable advantages offered by this form, it also carries a serious disadvantage: users need to invest a great deal of technical effort in accessing the information. Also, the underlying technology is still undergoing further development at an exceptionally fast pace. The rapid obsolescence of the technology required to read the information, combined with the frequently imperceptible physical decay of the media themselves, represents a serious threat to preservation of the information content. Many data sets in earth science research are from observations that cannot be repeated. This makes these digital assets particularly valuable. Therefore, these data should be kept and made available for re-use long after the end of the project from which they originated. Since research projects only run for a relatively short period of time, it is advisable to shift the burden of responsibility for long-term data curation from the individual researcher to a trusted data repository or archive. But what makes a trusted data repository? Each trusted digital repository has its own targets and specifications. The trustworthiness of digital repositories can be tested and assessed on the basis of a criteria catalogue. This is the main focus of the work of the nestor working group "Trusted repositories - Certification". It identifies criteria which permit the trustworthiness of a digital repository to be evaluated, both at the organisational and technical levels. The criteria are defined in close collaboration with a wide range of different memory organisations, producers of information, experts and other interested parties. This open approach ensures a high degree of universal validity, suitability for daily practical use and also broad-based acceptance of the results. The criteria catalogue is also intended to present the option of documenting trustworthiness by means of certification in a standardised national or international process. The criteria catalogue is based on the Reference Model for an Open Archival Information System (OAIS, ISO 14721:2003). With its broad approach, the nestor criteria catalogue for trusted digital repositories has to remain at a high level of abstraction. For application in the earth sciences, the evaluation criteria need to be transferred into the context of earth science data and their designated user community. This presentation offers a brief introduction to the problems surrounding the long-term preservation of digital objects. This introduction is followed by a proposed application of the criteria catalogue for trusted digital repositories to the context of earth science data and their long-term preservation.

  6. SPECIATE--EPA'S DATABASE OF SPECIATED EMISSION PROFILES

    EPA Science Inventory

    SPECIATE is EPA's repository of Total Organic Compound and Particulate Matter speciated profiles for a wide variety of sources. The profiles in this system are provided for air quality dispersion modeling and as a library for source-receptor and source apportionment type models. ...

  7. SATORI: a system for ontology-guided visual exploration of biomedical data repositories.

    PubMed

    Lekschas, Fritz; Gehlenborg, Nils

    2018-04-01

    The ever-increasing number of biomedical datasets provides tremendous opportunities for re-use, but current data repositories provide limited means of exploration apart from text-based search. Ontological metadata annotations provide context by semantically relating datasets. Visualizing this rich network of relationships can improve the explorability of large data repositories and help researchers find datasets of interest. We developed SATORI, an integrative search and visual exploration interface for the exploration of biomedical data repositories. The design is informed by a requirements analysis through a series of semi-structured interviews. We evaluated the implementation of SATORI in a field study on a real-world data collection. SATORI enables researchers to seamlessly search, browse and semantically query data repositories via two visualizations that are highly interconnected with a powerful search interface. SATORI is an open-source web application, which is freely available at http://satori.refinery-platform.org and integrated into the Refinery Platform. nils@hms.harvard.edu. Supplementary data are available at Bioinformatics online.

  8. International Approaches for Nuclear Waste Disposal in Geological Formations: Report on Fifth Worldwide Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faybishenko, Boris; Birkholzer, Jens; Persoff, Peter

    2016-09-01

    The goal of the Fifth Worldwide Review is to document evolution in the state of the art of approaches for nuclear waste disposal in geological formations since the Fourth Worldwide Review, which was released in 2006. The ten years since the previous Worldwide Review have seen major developments in a number of nations throughout the world pursuing geological disposal programs, both in preparing and in reviewing safety cases for the operational and long-term safety of proposed and operating repositories. The countries that are approaching implementation of geological disposal will increasingly focus on the feasibility of safely constructing and operating their repositories in the short and long term on the basis of existing regulations. The WWR-5 will also address a number of specific technical issues in safety case development, along with the interplay among stakeholder concerns, technical feasibility, engineering design issues, and operational and post-closure safety. Preparation and publication of the Fifth Worldwide Review on nuclear waste disposal facilitates assessing the lessons learned and developing future cooperation between the countries. The Report provides scientific and technical experiences on preparing for and developing scientific and technical bases for nuclear waste disposal in deep geologic repositories in terms of requirements, societal expectations and the adequacy of cases for long-term repository safety. The Chapters include potential issues that may arise as repository programs mature, and identify techniques that demonstrate the safety cases and aid in promoting and gaining societal confidence. The report will also be used to exchange experience with other fields of industry and technology, in which concepts similar to the design and safety cases are applied, as well as to facilitate the public perception and understanding of the safety of the disposal approaches relative to risks that may increase over long time frames in the absence of a successful implementation of final dispositioning.

  9. The preliminary design and feasibility study of the spent fuel and high level waste repository in the Czech Republic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valvoda, Z.; Holub, J.; Kucerka, M.

    1996-12-31

    The Program of Development of the Spent Fuel and High Level Waste Repository in the Conditions of the Czech Republic began in 1993. During the first phase, the basic concept and structure of the Program were developed, and the basic design criteria and requirements were prepared. In the conditions of the Czech Republic, only an underground repository in a deep geological formation is acceptable. The expected depth is between 500 and 1000 meters, and the host rock will be granite. A preliminary variant design study, realized in 1994, analyzed the radioactive waste and spent fuel flow from NPPs to the repository and various possibilities for transportation according to the various concepts of spent fuel conditioning and transportation to the underground structures. Conditioning and encapsulation of spent fuel and/or radioactive waste is proposed on the repository site. Underground disposal structures are proposed on a single underground level. The repository will have reserve capacity for radioactive waste from NPP decommissioning and for waste not acceptable at other repositories. Vertical disposal of unshielded canisters in boreholes and/or horizontal disposal of shielded canisters is studied. The year 2035 has been established as the base date for the start-up of repository operation, and from this date a preliminary time schedule of the Project has been developed. A method of calculating levelized and discounted costs over the repository lifetime, for each of the five selected variants, was used for the economic calculations. Preliminary expected parametric costs of the repository are about 0.1 Kc ($0.004) per MWh produced in the Czech NPPs. In 1995, the design and feasibility study went into more detail on the technical concept of repository construction and proposed technologies, as well as on the operational phase of the repository. The paper describes the results of the 1995 design work and presents the program of repository development for the next period.

  10. Handling glacially induced faults in the assessment of the long term safety of a repository for spent nuclear fuel at Forsmark, Sweden

    NASA Astrophysics Data System (ADS)

    Munier, R.

    2011-12-01

    Located deep within the Baltic Shield, far from active plate boundaries and volcanism, Swedish bedrock is characterised by a low frequency of earthquakes of small magnitudes. Yet faults, predominantly in the Lapland region, offsetting the Quaternary regolith by ten meters or more, reveal that Swedish bedrock suffered substantial earthquake activity in connection with the retreat of the latest continental ice sheet, the Weichselian. Storage of nuclear wastes, hazardous for hundreds of thousands of years, requires, firstly, isolation of radionuclides and, secondly, retardation of the nuclides should the barriers fail. Swedish regulations require that safety is demonstrated for a period of a million years. Consequently, the repository must be designed to resist the impact of several continental glaciations. Large, glacially induced earthquakes near the repository have the potential of triggering slip along fractures across the canisters containing the nuclear wastes, thereby simultaneously jeopardising isolation, retardation and, hence, long-term safety. It has therefore been crucial to assess the impact of such intraplate earthquakes upon the primary functions of the repository. We conclude that, by appropriate design of the repository, the negative impact of earthquakes on long-term safety can be considerably lessened. We were, additionally, able to demonstrate compliance with Swedish regulations in our safety assessment, SR-Site, submitted to the authorities earlier this year. However, the assessment required a number of critical assumptions, e.g. concerning the strain rate and the fracture properties of the rock, many of which are subjects of current research in the geoscientific community. Through a conservative approach, though, we judge that we have adequately propagated critical uncertainties through the assessment and bounded the uncertainty space.

  11. Assessment of the long-term durability of concrete in radioactive waste repositories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, A.; Goult, D.J.; Hearne, J.A.

    1986-01-01

    A preliminary assessment of the long-term durability of concrete in a repository sited in clay is presented. The assessment is based on recorded experience of concrete structures and both field and laboratory studies. It is also supported by results of the examination of a concrete sample which had been buried in clay for 43 years. The engineering lifetime of a 1 m thick reinforced concrete slab, with one face in contact with clay, and the way in which pH in the repository as a whole is likely to vary with time have both been estimated from available data. The estimates indicate that engineering lifetimes of about 10³ years are expected (provided that sulfate-resisting cement is used) and that pH is likely to remain above 10.5 for about 10⁶ years.

  12. CellBase, a comprehensive collection of RESTful web services for retrieving relevant biological information from heterogeneous sources.

    PubMed

    Bleda, Marta; Tarraga, Joaquin; de Maria, Alejandro; Salavert, Francisco; Garcia-Alonso, Luz; Celma, Matilde; Martin, Ainoha; Dopazo, Joaquin; Medina, Ignacio

    2012-07-01

    During the past years, the advances in high-throughput technologies have produced an unprecedented growth in the number and size of repositories and databases storing relevant biological data. Today, there is more biological information than ever but, unfortunately, the current status of many of these repositories is far from optimal. Some of the most common problems are that the information is spread out over many small databases, that standards frequently differ among repositories, and that some databases are no longer supported or contain information that is too specific and unconnected. In addition, data size is increasingly becoming an obstacle when accessing or storing biological data. All these issues make it very difficult to extract and integrate information from different sources, to analyze experiments or to access and query this information in a programmatic way. CellBase provides a solution to the growing need for integration by easing access to biological data. CellBase implements a set of RESTful web services that query a centralized database containing the most relevant biological data sources. The database is hosted on our servers and is regularly updated. CellBase documentation can be found at http://docs.bioinfo.cipf.es/projects/cellbase.
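
    Consuming such RESTful services is typically a one-liner per query; the sketch below is illustrative only, with a placeholder host and an assumed endpoint layout rather than CellBase's documented API.

    ```python
    # Hedged sketch of querying a REST service for gene annotation.
    import requests

    BASE = "http://example.org/cellbase/webservices/rest"  # placeholder URL

    def get_gene_info(gene_symbol, species="hsapiens"):
        """Query a hypothetical gene endpoint and return parsed JSON."""
        url = f"{BASE}/{species}/feature/gene/{gene_symbol}/info"
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        return resp.json()

    # print(get_gene_info("BRCA2"))  # requires a live server
    ```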

  13. 2012 best practices for repositories collection, storage, retrieval, and distribution of biological materials for research international society for biological and environmental repositories.

    PubMed

    2012-04-01

    Third Edition. Printed with permission from the International Society for Biological and Environmental Repositories (ISBER) © 2011 ISBER All Rights Reserved Editor-in-Chief Lori D. Campbell, PhD Associate Editors Fay Betsou, PhD Debra Leiolani Garcia, MPA Judith G. Giri, PhD Karen E. Pitt, PhD Rebecca S. Pugh, MS Katherine C. Sexton, MBA Amy P.N. Skubitz, PhD Stella B. Somiari, PhD Individual Contributors to the Third Edition Jonas Astrin, Susan Baker, Thomas J. Barr, Erica Benson, Mark Cada, Lori Campbell, Antonio Hugo Jose Froes Marques Campos, David Carpentieri, Omoshile Clement, Domenico Coppola, Yvonne De Souza, Paul Fearn, Kelly Feil, Debra Garcia, Judith Giri, William E. Grizzle, Kathleen Groover, Keith Harding, Edward Kaercher, Joseph Kessler, Sarah Loud, Hannah Maynor, Kevin McCluskey, Kevin Meagher, Cheryl Michels, Lisa Miranda, Judy Muller-Cohn, Rolf Muller, James O'Sullivan, Karen Pitt, Rebecca Pugh, Rivka Ravid, Katherine Sexton, Ricardo Luis A. Silva, Frank Simione, Amy Skubitz, Stella Somiari, Frans van der Horst, Gavin Welch, Andy Zaayenga 2012 Best Practices for Repositories: Collection, Storage, Retrieval and Distribution of Biological Materials for Research INTERNATIONAL SOCIETY FOR BIOLOGICAL AND ENVIRONMENTAL REPOSITORIES (ISBER) INTRODUCTION The availability of high quality biological and environmental specimens for research purposes requires the development of standardized methods for collection, long-term storage, retrieval and distribution of specimens that will enable their future use. Sharing successful strategies for accomplishing this goal is one of the driving forces for the International Society for Biological and Environmental Repositories (ISBER). For more information about ISBER see www.isber.org. ISBER's Best Practices for Repositories (Best Practices) reflect the collective experience of its members and have received broad input from other repository professionals. Throughout this document, effective practices are presented for the management of specimen collections and repositories. The term "Best Practice" is used in cases where a level of operation is indicated that is above the basic recommended practice or more specifically designates the most effective practice. It is understood that repositories in certain locations or with particular financial constraints may not be able to adhere to each of the items designated as "Best Practices". Repositories fitting into either of these categories will need to decide how they might best adhere to these recommendations within their particular circumstances. While adherence to ISBER Best Practices is strictly on a voluntary basis, it is important to note that some aspects of specimen management are governed by national/federal, regional and local regulations. The reader should refer directly to regulations for their national/federal, regional and local requirements, as appropriate. ISBER has strived to include terminology appropriate to the various specimen types covered under these practices, but here too, the reader should take steps to ensure the appropriateness of the recommendations to their particular repository type prior to the implementation of any new approaches. Important terms within the document are italicized when first used in a section and defined in the glossary. The ISBER Best Practices are periodically reviewed and revised to reflect advances in research and technology.
The third edition of the Best Practices builds on the foundation established in the first and second editions which were published in 2005 and 2008, respectively.

  14. Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources.

    PubMed

    Tsay, Ming-Yueh; Wu, Tai-Luan; Tseng, Ling-Li

    2017-01-01

    This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001-2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.
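
    The comparison step can be illustrated with a short Python sketch (our own illustration, not the authors' program): normalize records, then compute each system's completeness against the union of all records and the pairwise overlap between systems.

    ```python
    # Toy completeness/overlap computation over bibliographic records.
    from itertools import combinations

    def normalize(title):
        return " ".join(title.lower().split())

    systems = {  # toy records; real input would be downloaded bibliographies
        "Google Scholar": {"Paper A", "Paper B", "Paper C"},
        "arXiv.org": {"paper a", "Paper C", "Paper D"},
    }
    norm = {s: {normalize(t) for t in recs} for s, recs in systems.items()}
    union = set().union(*norm.values())

    for s, recs in norm.items():
        print(f"{s}: completeness {len(recs) / len(union):.0%}")
    for (s1, r1), (s2, r2) in combinations(norm.items(), 2):
        overlap = len(r1 & r2) / len(r1 | r2)
        print(f"{s1} vs {s2}: overlap {overlap:.0%}")
    ```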

  15. Predictions of Actinide Solubilities under Near-Field Conditions Expected in the WIPP

    NASA Astrophysics Data System (ADS)

    Brush, L. H.; Xiong, Y.

    2009-12-01

    The Waste Isolation Pilot Plant (WIPP) is a U.S. Department of Energy (DOE) repository in southeast New Mexico for defense-related transuranic (TRU) waste. The repository, which opened in March 1999, is located at a subsurface depth of 655 m (2150 ft) in the Salado Fm., a Permian bedded-salt formation. The repository will eventually contain the equivalent of 844,000 208-L (55-gal) drums of TRU waste. After filling the rooms and access drifts and installing panel closures, creep closure of the salt will crush the steel waste containers in most cases and encapsulate the waste. The WIPP actinide source term model used for long-term performance assessment (PA) of the repository comprises dissolved and suspended submodels (solubilities and colloids). This presentation will describe the solubilities. From the standpoint of long-term PA, the order of importance of the radioelements in the TRU waste to be emplaced in the WIPP is Pu ~ Am >> U > Th >> Np ~ Cm and fission products. The DOE has included all of these actinides, but not fission products, in the WIPP Actinide Source Term Program (ASTP). Anoxic corrosion of Fe- and Al-base metals and microbial consumption of cellulosic, plastic, and rubber materials will produce gas and create strongly reducing conditions in the WIPP after closure. The use of MgO as an engineered barrier to consume microbially produced CO₂ will result in low fCO₂ and basic pH. Under these conditions, Th, U, Np, Pu, and Am will speciate essentially entirely as Th(IV), U(IV), Np(IV), Pu(III), and Am(III); or Th(IV), U(VI), Np(V), Pu(IV), and Am(III). The DOE has developed thermodynamic speciation-and-solubility models for +III, +IV, and +V actinides in brines. Experimental data for Nd, Am, and Cm species were used to parameterize the +III Pitzer activity-coefficient model; data for Th species were used for the +IV model; and data for Np(V) species were used for the +V model. These models include the effects of the organic ligands acetate, citrate, EDTA, and oxalate in TRU waste. The oxidation-state analogy was then used to extend the +III model to Pu(III), and the +IV model to Pu(IV), U(IV), and Np(IV). The solubility of U(VI) was estimated. For the recent WIPP Compliance Recertification Application PA Baseline Calculations, we calculated actinide solubilities with fCO₂ buffered at 3.14 × 10⁻⁶ atm by the brucite-hydromagnesite carbonation reaction, with pH maintained at ~9 by the brucite dissolution-precipitation reaction, and with estimated concentrations of the organic ligands in brines from the Salado and the Castile Fm., which underlies the Salado. The calculated +III, +IV, and +V solubilities are 1.56 × 10⁻⁶, 5.64 × 10⁻⁸, and 4.07 × 10⁻⁷ M, respectively, in Salado brine; and 1.51 × 10⁻⁶, 6.98 × 10⁻⁸, and 8.75 × 10⁻⁷ M in Castile brine. The U(VI) solubility estimated for both brines is 1 × 10⁻³ M. This research is funded by WIPP programs administered by the U.S. Department of Energy. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.

  16. Avoidable Waste in Ophthalmic Epidemiology: A Review of Blindness Prevalence Surveys in Low and Middle Income Countries 2000-2014.

    PubMed

    Ramke, Jacqueline; Kuper, Hannah; Limburg, Hans; Kinloch, Jennifer; Zhu, Wenhui; Lansingh, Van C; Congdon, Nathan; Foster, Allen; Gilbert, Clare E

    2018-02-01

    Sources of avoidable waste in ophthalmic epidemiology include duplication of effort, and survey reports remaining unpublished, gaining publication after a long delay, or being incomplete or of poor quality. The aim of this review was to assess these sources of avoidable waste by examining blindness prevalence surveys undertaken in low and middle income countries (LMICs) between 2000 and 2014. On December 1, 2016, we searched the MEDLINE, EMBASE and Web of Science databases for cross-sectional blindness prevalence surveys undertaken in LMICs between 2000 and 2014. All surveys listed on the Rapid Assessment of Avoidable Blindness (RAAB) Repository website ("the Repository") were also considered. For each survey we assessed (1) availability of a scientific publication, survey report, summary results tables and/or datasets; (2) time to publication from year of survey completion and journal attributes; (3) extent of blindness information reported; and (4) rigour when information was available from two sources (i.e. whether it matched). Of the 279 included surveys (from 68 countries), 186 (67%) used RAAB methodology; 146 (52%) were published in a scientific journal, 57 (20%) were published in a journal and on the Repository, and 76 (27%) were on the Repository only (8% had tables; 19% had no information available beyond registration). Datasets were available for 50 RAABs (18% of included surveys). Time to publication ranged from <1 to 11 years (mean ± standard deviation, 2.8 ± 1.8 years). The extent of blindness information reported within studies varied (e.g. presenting and best-corrected, unilateral and bilateral); those with both a published report and Repository tables were most complete. For surveys published and with RAAB tables available, discrepancies were found in the reporting of participant numbers (14% of studies) and blindness prevalence (15%). Strategies are needed to improve the availability, consistency, and quality of information reported from blindness prevalence surveys, and hence reduce avoidable waste.

  17. Interoperability Across the Stewardship Spectrum in the DataONE Repository Federation

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Vieglais, D.; Wilson, B. E.

    2016-12-01

    Thousands of earth and environmental science repositories serve many researchers and communities, each with their own community and legal mandates, sustainability models, and historical infrastructure. These repositories span the stewardship spectrum from highly curated collections that employ large numbers of staff members to review and improve data, to small, minimal budget repositories that accept data caveat emptor and where all responsibility for quality lies with the submitter. Each repository fills a niche, providing services that meet the stewardship tradeoffs of one or more communities. We have reviewed these stewardship tradeoffs for several DataONE member repositories ranging from minimally (KNB) to highly curated (Arctic Data Center), as well as general purpose (Dryad) to highly discipline or project specific (NEON). The rationale behind different levels of stewardship reflect resolution of these tradeoffs. Some repositories aim to encourage extensive uptake by keeping processes simple and minimizing the amount of information collected, but this limits the long-term utility of the data and the search, discovery, and integration systems that are possible. Other repositories require extensive metadata input, review, and assessment, allowing for excellent preservation, discovery, and integration but at the cost of significant time for submitters and expense for curatorial staff. DataONE recognizes these different levels of curation, and attempts to embrace them to create a federation that is useful across the stewardship spectrum. DataONE provides a tiered model for repositories with growing utility of DataONE services at higher tiers of curation. The lowest tier supports read-only access to data and requires little more than title and contact metadata. Repositories can gradually phase in support for higher levels of metadata and services as needed. These tiered capabilities are possible through flexible support for multiple metadata standards and services, where repositories can incrementally increase their requirements as they want to satisfy more use cases. Within DataONE, metadata search services support minimal metadata models, but significantly expanded precision and recall become possible when repositories provide more extensively curated metadata.

  18. Optimizing Resources for Trustworthiness and Scientific Impact of Domain Repositories

    NASA Astrophysics Data System (ADS)

    Lehnert, K.

    2017-12-01

    Domain repositories, i.e. data archives tied to specific scientific communities, are widely recognized and trusted by their user communities for ensuring a high level of data quality, enhancing data value, access, and reuse through a unique combination of disciplinary and digital curation expertise. Their data services are guided by the practices and values of the specific community they serve and designed to support the advancement of their science. Domain repositories need to meet user expectations for scientific utility in order to be successful, but they also need to fulfill the requirements for trustworthy repository services to be acknowledged by scientists, funders, and publishers as a reliable facility that curates and preserves data following international standards. Domain repositories therefore need to carefully plan and balance investments to optimize the scientific impact of their data services and user satisfaction on the one hand, while maintaining a reliable and robust operation of the repository infrastructure on the other hand. Staying abreast of evolving repository standards to certify as a trustworthy repository and conducting a regular self-assessment and certification alone requires resources that compete with the demands for improving data holdings or usability of systems. The Interdisciplinary Earth Data Alliance (IEDA), a data facility funded by the US National Science Foundation, operates repositories for geochemical, marine geoscience, and Antarctic research data, while also maintaining data products (global syntheses) and data visualization and analysis tools that are of high value for the science community and have demonstrated considerable scientific impact. Balancing the investments in the growth and utility of the syntheses with the resources required for certification of IEDA's repository services has been challenging, and a major self-assessment effort has been difficult to accommodate. IEDA is exploring a partnership model to share generic repository functions (e.g. metadata registration, long-term archiving) with other repositories. This could substantially reduce the effort of certification and allow effort to focus on the domain-specific data curation and value-added services.

  19. Geochemical Results of Lysimeter Sampling at the Manning Canyon Repository in the Mercur Mining District, Utah

    USGS Publications Warehouse

    Earle, John; Choate, LaDonna

    2010-01-01

    This report presents chemical characteristics of transient unsaturated-zone water collected by lysimeter from the Manning Canyon repository site in Utah. Data collected by U.S. Geological Survey and U.S. Department of the Interior, Bureau of Land Management scientists under an intragovernmental order comprise the existing body of hydrochemical information on unsaturated-zone conditions at the site and represent the first effort to characterize the chemistry of the soil pore water surrounding the repository. Analyzed samples showed elevated levels of arsenic, barium, chromium, and strontium, which are typical of acidic mine drainage. The range of major-ion concentrations generally showed expected soil values. Although subsequent sampling is necessary to determine long-term effects of the repository, current results provide initial data concerning reactive processes of precipitation on the mine tailings and waste rock stored at the site and provide information on the effectiveness of reclamation operations at the Manning Canyon repository.

  20. Next-Generation Search Engines for Information Retrieval

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devarakonda, Ranjeet; Hook, Leslie A; Palanisamy, Giri

    In recent years there have been significant advancements in scientific data management and retrieval techniques, particularly in standards and protocols for archiving data and metadata. Scientific data are rich but spread across many locations; integrating the pieces requires a data archive with associated metadata. Data should be stored in a format that is retrievable and, more importantly, that will remain accessible as technology changes, such as XML. While general-purpose search engines (such as Google or Bing) are useful for finding many things on the Internet, they are often of limited usefulness for locating Earth Science data relevant, for example, to a specific spatiotemporal extent. By contrast, tools that search repositories of structured metadata can locate relevant datasets with fairly high precision, but the search is limited to that particular repository. Federated searches (such as Z39.50) have been used, but they can be slow, and their comprehensiveness can be limited by downtime in any search partner. An alternative approach to improving comprehensiveness is for a repository to harvest metadata from other repositories, possibly with limits based on subject matter or access permissions. Searches through harvested metadata can be extremely responsive, and the search tool can be customized with semantic augmentation appropriate to the community of practice being served. One such system is Mercury, a metadata harvesting, data discovery, and access system built for researchers to search, share, and obtain spatiotemporal data used across a range of climate and ecological sciences. Mercury is an open-source toolset with a backend built in Java; its search capability is supported by popular open-source search libraries such as Solr and Lucene. Mercury harvests structured metadata and key data from several data-providing servers around the world and builds a centralized index. The harvested files are indexed consistently through the Solr search API, enabling simple, fielded, spatial, and temporal searches across projects spanning land, atmosphere, and ocean ecology. Mercury also provides data-sharing capabilities using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH). In this paper we discuss best practices for archiving data and metadata, new search techniques, efficient ways of data retrieval, and information display.
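
    To make the harvesting step concrete, the sketch below walks an OAI-PMH ListRecords response and follows resumption tokens using only the Python standard library; the endpoint URL is a placeholder, and the code is illustrative rather than Mercury's actual implementation.

    ```python
    # Minimal OAI-PMH harvester sketch (illustrative only; not Mercury's code).
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    DC = "{http://purl.org/dc/elements/1.1/}"

    def harvest(base_url):
        """Yield (identifier, title) pairs via OAI-PMH ListRecords."""
        params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
        while True:
            url = base_url + "?" + urllib.parse.urlencode(params)
            with urllib.request.urlopen(url) as resp:
                root = ET.fromstring(resp.read())
            for rec in root.iter(OAI + "record"):
                ident = rec.findtext(f"{OAI}header/{OAI}identifier")
                title = rec.findtext(f".//{DC}title")
                yield ident, title
            # Follow the resumption token until the provider is exhausted.
            token = root.findtext(f"{OAI}ListRecords/{OAI}resumptionToken")
            if not token:
                break
            params = {"verb": "ListRecords", "resumptionToken": token}

    # for ident, title in harvest("https://example.org/oai"):  # placeholder URL
    #     print(ident, title)
    ```

    The harvested records would then be normalized and fed to a central index (Solr, in Mercury's case) to support the fielded, spatial, and temporal searches described above.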

  1. Long-term oxygen depletion from infiltrating groundwaters: Model development and application to intra-glaciation and glaciation conditions

    NASA Astrophysics Data System (ADS)

    Sidborn, M.; Neretnieks, I.

    2008-08-01

    Processes that control the redox conditions in deep groundwaters have been studied. Understanding such processes over the long term is important for the safety assessment of a deep geological repository for high-level nuclear waste. An oxidising environment at repository depth would increase the solubility and mobility of many radionuclides and increase the potential risk of radioactive contamination at the ground surface. Proposed repository concepts also include engineered barriers such as copper canisters, whose corrosion increases considerably in an oxidising environment compared with the prevailing reducing conditions. Swedish granitic rocks are typically relatively sparsely fractured and are best treated as a dual-porosity medium, with fast-flowing channels through fractures in the rock surrounded by a porous matrix whose pores are accessible from the fracture by diffusive transport. Highly simplified problems have been explored with the aim of gaining an understanding of the underlying transport processes, thermodynamics, and chemical reaction kinetics. The degree of complexity is increased successively, and mechanisms and processes identified as being of key importance are included in a model framework. For highly complex models, analytical expressions cannot fully describe the processes involved, and in such cases solutions are obtained by numerical calculation. Deep in the rock, the main source of reducing capacity is identified as reducing minerals. Such minerals are found inside the porous rock matrix and as infill particles or coatings in fractures. The model formulation also allows for different flow modes, such as flow along discrete fractures in sparsely fractured rock and along flowpaths in a fracture network. The scavenging of oxygen is exemplified for these cases as well as for more comprehensive applications, including glaciation considerations. Results show that chemical reaction kinetics control the scavenging of oxygen for only a relatively short time with respect to the lifetime of the repository; for longer times, scavenging is controlled by transport processes in the porous rock matrix. The penetration depth of oxygen along a flowpath depends largely on the hydraulic properties, which may vary significantly between locations and situations. The results indicate that oxygen, in the absence of easily degradable organic matter, may travel long distances along a flowpath during the lifetime of the repository (hundreds to thousands of metres in a million years, depending on, e.g., the hydraulic properties of the flowpath and the availability of reducing capacity). However, large uncertainties in key input parameters remain, so results from the model must be treated with caution pending more accurate and validated data. Ongoing and planned experiments are expected to reduce these uncertainties, a prerequisite for more reliable predictions in a safety assessment of a nuclear waste repository.
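
    The scale of the quoted penetration depths can be checked with a back-of-the-envelope stoichiometric front balance; the sketch below is not the authors' dual-porosity model, and every parameter value is an assumption chosen only for illustration.

    ```python
    # Back-of-the-envelope estimate of how far an oxygen front can migrate
    # along a flowpath before the rock's reducing capacity consumes it.
    # All parameter values are illustrative assumptions, not the paper's data.

    c_o2 = 0.3          # dissolved O2 in infiltrating water, mol/m^3 (~10 mg/L)
    q = 1e-3            # water flux through the fracture zone, m^3/(m^2 yr)
    red_capacity = 30.0 # accessible reducing capacity, mol O2 per m^3 of rock
    t = 1e6             # assessment time, years

    # Stoichiometric mass balance: the front advances until the O2 delivered
    # per unit cross-section equals the capacity of the rock it has swept.
    penetration = c_o2 * q * t / red_capacity   # metres
    print(f"Oxygen penetration after {t:.0e} years: {penetration:.0f} m")
    # -> 10 m with these numbers; hundreds to thousands of metres follow from
    #    higher fluxes or lower accessible reducing capacity, as the paper notes.
    ```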

  2. Childhood Vesicoureteral Reflux Studies: Registries and Repositories Sources and Nosology

    PubMed Central

    Chesney, Russell W.; Patters, Andrea B.

    2012-01-01

    Despite several recent studies, the advisability of antimicrobial prophylaxis and certain imaging studies for urinary tract infections (UTIs) remains controversial. The role of vesicoureteral reflux (VUR) on the severity and re-infection rates for UTIs is also difficult to assess. Registries and repositories of data and biomaterials from clinical studies in children with VUR are valuable. Disease registries are collections of secondary data related to patients with a specific diagnosis, condition or procedure. Registries differ from indices in that they contain more extensive data. A research repository is an entity that receives, stores, processes and/or disseminates specimens (or other materials) as needed. It encompasses the physical location as well as the full range of activities associated with its operation. It may also be referred to as a biorepository. This report provides information about some current registries and repositories that include data and samples from children with VUR. It also describes the heterogeneous nature of the subjects, as some registries and repositories include only data or samples from patients with primary reflux while others also include those from patients with syndromic or secondary reflux. PMID:23044377

  3. Potential benefits of waste transmutation to the U.S. high-level waste repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michaels, G.E.

    1995-10-01

    This paper reexamines the potential benefits of waste transmutation to the proposed U.S. geologic repository at the Yucca Mountain site, based on recent progress in the performance assessment for the Yucca Mountain base case of spent fuel emplacement. It is observed that actinides are assumed to have higher solubility than in previous studies and that Np and other actinides now dominate the projected aqueous releases from a Yucca Mountain repository. Actinides are also identified as the dominant source of decay heat in the repository, and the effect of decay heat in perturbing the hydrology, geochemistry, and thermal characteristics of Yucca Mountain is reviewed. It is concluded that the potential for thermally-driven, buoyant, gas-phase flow at Yucca Mountain introduces data and modeling requirements that will increase the costs of licensing the site and may make the site unattractive for geologic disposal of wastes. A transmutation-enabled cold repository is proposed that might allow licensing of a repository to be based upon currently observable characteristics of the Yucca Mountain site.

  4. Shared Medical Imaging Repositories.

    PubMed

    Lebre, Rui; Bastião, Luís; Costa, Carlos

    2018-01-01

    This article describes the implementation of a solution that integrates an ownership concept and access control over medical imaging resources, making it possible to centralize multiple repository instances. The proposed architecture allows permissions to be associated with repository resources and rights to be delegated to third entities. It includes a programmatic interface for management of the proposed services, made available through web services, with the ability to create, read, update, and remove all components of the architecture. The resulting work is a role-based access control mechanism integrated with the Dicoogle Open-Source Project. The solution has several application scenarios, for instance collaborative research platforms and tele-radiology services deployed in the cloud.
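
    A role-based access control check of the kind the article describes can be sketched in a few lines; the roles, actions, and ACL layout below are hypothetical and do not reflect Dicoogle's actual plugin API.

    ```python
    # Minimal role-based access control sketch (hypothetical role and
    # permission names; not the Dicoogle API described in the article).
    ROLE_PERMISSIONS = {
        "researcher":  {"read"},
        "radiologist": {"read", "update"},
        "repo_admin":  {"create", "read", "update", "remove"},
    }

    def is_allowed(role: str, action: str, resource_acl: set[str]) -> bool:
        """Grant access only if the role holds the action AND the role is
        listed on the resource's access-control list (ownership concept)."""
        return action in ROLE_PERMISSIONS.get(role, set()) and role in resource_acl

    # A study shared with researchers and its owning admin:
    study_acl = {"researcher", "repo_admin"}
    assert is_allowed("researcher", "read", study_acl)
    assert not is_allowed("researcher", "remove", study_acl)
    assert not is_allowed("radiologist", "read", study_acl)  # not on the ACL
    ```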

  5. The Center for HIV/AIDS Vaccine Immunology (CHAVI) Multi-site Quality Assurance Program for Cryopreserved Human Peripheral Blood Mononuclear Cells

    PubMed Central

    Sarzotti-Kelsoe, Marcella; Needham, Leila K.; Rountree, Wes; Bainbridge, John; Gray, Clive M.; Fiscus, Susan A.; Ferrari, Guido; Stevens, Wendy S.; Stager, Susan L.; Binz, Whitney; Louzao, Raul; Long, Kristy O.; Mokgotho, Pauline; Moodley, Niranjini; Mackay, Melanie; Kerkau, Melissa; McMillion, Takesha; Kirchherr, Jennifer; Soderberg, Kelly A.; Haynes, Barton F.; Denny, Thomas N.

    2014-01-01

    The Center for HIV/AIDS Vaccine Immunology (CHAVI) consortium was established to determine the host and virus factors associated with HIV transmission, infection and containment of virus replication, with the goal of advancing the development of an HIV protective vaccine. Studies to meet this goal required the use of cryopreserved Peripheral Blood Mononuclear Cell (PBMC) specimens, and it was therefore imperative that a quality assurance (QA) oversight program be developed to monitor PBMC samples obtained from study participants at multiple international sites. Nine site-affiliated laboratories in Africa and the USA collected and processed PBMCs, and cryopreserved PBMCs were shipped to CHAVI repositories in Africa and the USA for long-term storage. A three-stage program, based on Good Clinical Laboratory Practices (GCLP), was designed to monitor PBMC integrity at each step of this process. The first stage evaluated the integrity of fresh PBMCs for initial viability, overall yield, and processing time at the site-affiliated laboratories (Stage 1); in the second stage, the repositories determined post-thaw viability and cell recovery of cryopreserved PBMCs received from the site-affiliated laboratories (Stage 2); the third stage assessed long-term specimen storage at each repository (Stage 3). Overall, the results of the CHAVI PBMC QA oversight program highlight the relative importance of each of these stages to the ultimate goal of preserving specimen integrity from peripheral blood collection to long-term repository storage. PMID:24910414

  6. Birds of a Feather - Developments towards shared, regional geological disposal in the EU?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Codee, H.D.K.; Verhoef, E.V.; McCombie, Ch.

    2008-07-01

    Geological disposal is an essential component of the long-term management of spent fuel, high-level, and other long-lived radioactive waste. In the EU, all 25 member states generate radioactive waste. There are, of course, large differences in type and quantity between the member states, but all of them need a long-term solution. Even a country whose only radioactive items are radium lightning rods will need a long-term disposal solution: the 1600-year half-life of radium does not fit a solution with a span of control of just a few hundred years. Implementation of a suitable deep repository may, however, be difficult or impossible for countries with small volumes of waste because of the high costs involved. Will economy of scale force these birds of a feather to flock together and share a repository? Implementing a small repository and operating it for a very long time is very costly. There are past and current examples of countries being prepared to accept radioactive waste from others if a better environmental solution is thus achieved and if the arrangements are fair for all parties involved. The need for supranational surveillance also points to shared solutions. Although the European Parliament and the Commission have both supported the concept of shared regional repositories in Europe, (national) political and societal constraints have so far hampered the realization of such facilities. The first step in this staged process was the EC-funded project SAPIERR I. The project (2003 to 2005) studied the feasibility of shared regional storage facilities and geological repositories for use by European countries. It showed that, if shared regional repositories are to be implemented even some decades ahead, efforts must be increased now. The next step is to develop a practical implementation strategy and organizational structures for shared EU radioactive waste storage and disposal activities. This is addressed in the EC-funded project SAPIERR II (2006-2008). The paper gives an update of the SAPIERR II project and describes the progress achieved. (authors)
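
    The arithmetic behind the radium remark is easily made explicit; the 300-year control period below is an assumed figure chosen only for illustration.

    ```python
    # Why a few hundred years of institutional control cannot cover radium:
    # fraction of Ra-226 remaining after an assumed 300-year control period.
    half_life = 1600.0   # years (Ra-226)
    t = 300.0            # years of institutional control (assumed)
    remaining = 0.5 ** (t / half_life)
    print(f"{remaining:.0%} of the radium is still present")  # -> 88%
    ```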

  7. SPECIATE Version 4.4 Database Development Documentation

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency’s (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles of air pollution sources. Some of the many uses of these source profiles include: (1) creating speciated emissions inventories for regi...

  8. SPECIATE 4.2: speciation Database Development Documentation

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency's (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles of air pollution sources. Among the many uses of speciation data, these source profiles are used to: (1) create speciated emissions inve...

  9. New Rapid Evaluation for Long-Term Behavior in Deep Geological Repository by Geotechnical Centrifuge—Part 2: Numerical Simulation of Model Tests in Isothermal Condition

    NASA Astrophysics Data System (ADS)

    Sawada, Masataka; Nishimoto, Soshi; Okada, Tetsuji

    2017-01-01

    In high-level radioactive waste disposal repositories, long-term coupled thermal, hydraulic, and mechanical (T-H-M) phenomena arise involving the generation of heat from the waste, the infiltration of ground water, and swelling of the bentonite buffer. The ability to model such coupled phenomena is of particular importance to repository design and assessments of its safety. We have developed a T-H-M-coupled analysis program that evaluates the long-term behavior around the repository (the "near field"). We have also conducted centrifugal model tests that reproduce the long-term T-H-M-coupled behavior in the near field. In this study, we conduct H-M-coupled numerical simulations of the centrifugal near-field model tests. We compare numerical results with each other and with results obtained from the centrifugal model tests. From the comparison, we deduce that: (1) in the numerical simulation, water infiltration in the rock mass was in agreement with the experimental observation; (2) the constant-stress boundary condition in the centrifugal model tests may cause a larger expansion of the rock mass than under in situ conditions, but the mechanical boundary condition did not affect the buffer behavior in the deposition hole; (3) the numerical simulation broadly reproduced the measured bentonite pressure and the overpack displacement, but did not reproduce the decreasing trend of the bentonite pressure after 100 equivalent years, indicating an effect of the time-dependent characteristics of the surrounding rock mass. Further investigations are needed to determine the effect of initial heterogeneity in the deposition hole and the time-dependent behavior of the surrounding rock mass.
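
    The reason a centrifuge permits "rapid evaluation" is the standard geotechnical scaling law for diffusion-controlled processes: at N times Earth gravity, model time maps to prototype time multiplied by N². The numbers in the sketch below are illustrative assumptions, not the test conditions of this study.

    ```python
    # Standard geotechnical centrifuge scaling for diffusion-controlled
    # processes: at N g, model time t_m maps to prototype time N**2 * t_m.
    # Values below are illustrative assumptions.
    N = 30.0            # centrifuge acceleration, in g
    t_model_days = 100  # duration of the centrifuge test, days
    t_prototype_years = N**2 * t_model_days / 365.25
    print(f"{t_model_days} model days ~ {t_prototype_years:.0f} prototype years")
    # -> ~246 years of near-field evolution observed in a 100-day test.
    ```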

  10. Long-Term Information Management (LTIM) of Safeguards Data at Repositories: Phase II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haddal, Risa N.

    One of the challenges of implementing safeguards for geological repositories will be the long-term preservation of safeguards-related data for 100 years or more. While most countries considering the construction and operation of such facilities agree that safeguards information should be preserved, there are gaps with respect to standardized requirements, guidelines, timescales, and approaches. This study analyzes those gaps and explores research to clarify stakeholder needs; identify current policies, approaches, best practices, and international standards; and assess existing safeguards information management infrastructure. The study also attempts to clarify what a safeguards data classification system might look like, how long data should be retained, and how information should be exchanged between stakeholders at different phases of a repository's life cycle. The analysis produced a variety of recommendations on what information to preserve, how to preserve it, where to store it, retention options, and how to exchange information in the long term. Key findings include the use of the globally recognized international records management standard ISO 15489 for guidance on the development of information management systems, and the development of a Key Information File (KIF). The KIF could be used to capture only the most relevant, high-level safeguards information and the history of decision making about the repository. The study also suggests implementing on-site and off-site records storage in digital and physical form; developing a safeguards data classification system; long-term records retention with periodic reviews every 5 to 10 years during each phase of the repository life cycle; and establishing transition procedures well in advance so that data shepherds and records officers can transfer information to incoming facility managers effectively and efficiently. These and other recommendations are further analyzed in this study.
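
    A Key Information File entry with the recommended periodic review could be represented along the following lines; the field names and layout are invented for illustration and are not a standardized schema.

    ```python
    # Hypothetical sketch of a Key Information File (KIF) entry with the
    # 5-10 year periodic review the study recommends. Field names are
    # illustrative assumptions, not a standardized schema.
    from dataclasses import dataclass, field
    from datetime import date, timedelta

    @dataclass
    class KifEntry:
        record_id: str
        phase: str              # e.g. "construction", "operation", "post-closure"
        summary: str            # high-level safeguards decision or finding
        created: date = field(default_factory=date.today)
        review_interval_years: int = 5

        def next_review(self) -> date:
            return self.created + timedelta(days=365 * self.review_interval_years)

    entry = KifEntry("KIF-0001", "operation",
                     "Decision record: dual on-site/off-site storage adopted")
    print(entry.next_review())
    ```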

  11. Repository Profiles for Atmospheric and Climate Sciences: Capabilities and Trends in Data Services

    NASA Astrophysics Data System (ADS)

    Hou, C. Y.; Thompson, C. A.; Palmer, C. L.

    2014-12-01

    As digital research data proliferate and expectations for open access escalate, the landscape of data repositories is becoming more complex. For example, DataBib currently identifies 980 data repositories across the disciplines, with 117 categorized under Geosciences. In the atmospheric and climate sciences, there are great expectations for the integration and reuse of data to advance science. To realize this potential, resources are needed that explicate the range of repository options available for locating and depositing open data, their conditions of access and use, and the services and tools they provide. This study profiled 38 open digital repositories in the atmospheric and climate sciences, analyzing each on 55 criteria through content analysis of their websites. The results provide a systematic way to assess and compare capabilities, services, and institutional characteristics and to identify trends across repositories. Selected results from the more detailed outcomes to be presented: most repositories offer guidance on data format(s) for submission and dissemination; 42% offer authorization-free access; more than half use some type of data identification system such as DOIs; nearly half offer some data processing, with a similar number providing software or tools; 78.9% request that users cite or acknowledge datasets used and the data center; only 21.1% recommend specific metadata standards, such as ISO 19115 or Dublin Core, with more than half using a customized metadata scheme. Information on repository certification and accreditation was rarely provided, and coverage of transfer of rights and data security was uneven. Few repositories provided policy information on preservation, migration, reappraisal, disposal, or long-term sustainability. As repository use increases, it will be important for institutions to make their procedures and policies explicit, both to build trust with user communities and to improve efficiencies in data sharing. Resources such as repository profiles will be essential for scientists to weigh options and understand trends in data services across the evolving network of repositories.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.C. Ryman

    This calculation is a revision of a previous calculation (Ref. 7.5) that bears the same title and has the document identifier BBAC00000-01717-0210-00006 REV 01. The purpose of this revision is to remove TBV (to-be-verified) item TBV-4110 associated with the output files of the previous version (Ref. 7.30). The purpose of this and the previous calculation is to generate source terms for a representative boiling water reactor (BWR) spent nuclear fuel (SNF) assembly for the first one million years after the SNF is discharged from the reactors. This calculation includes an examination of several ways to represent BWR assemblies and operating conditions in SAS2H in order to quantify the effects these representations may have on source terms. These source terms provide information characterizing the neutron and gamma spectra in particles per second, the decay heat in watts, and radionuclide inventories in curies. Source terms are generated for a range of burnups and enrichments (see Table 2) that are representative of the waste stream and stainless steel (SS) clad assemblies. During this revision, it was determined that the burnups used for the computer runs of the previous revision were actually about 1.7% less than the stated, or nominal, burnups. See Section 6.6 for a discussion of how to account for this effect before using any source terms from this calculation. The source term due to the activation of corrosion products deposited on the surfaces of the assembly from the coolant is also calculated. The results of this calculation support many areas of the Monitored Geologic Repository (MGR), including thermal evaluation, radiation dose determination, radiological safety analyses, surface and subsurface facility designs, and total system performance assessment. This includes MGR items classified as Quality Level 1, for example, the Uncanistered Spent Nuclear Fuel Disposal Container (Ref. 7.27, page 7). Therefore, this calculation is subject to the requirements of the Quality Assurance Requirements and Description (Ref. 7.28). The performance of the calculation and development of this document are carried out in accordance with AP-3.12Q, "Design Calculation and Analyses" (Ref. 7.29).

  13. SPECIATE 4.4: The Bridge Between Emissions Characterization and Modeling

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency’s (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles of air pollution sources. Some of the many uses of these source profiles include: (1) creating speciated emissions inventories for...

  14. The Development and Uses of EPA's SPECIATE Database

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency's (EPA) repository of volatile organic compounds (VOC) and particulate matter (PM) speciation profiles of air pollution sources. These source profiles can be used to (l) provide input to chemical mass balance (CMB) receptor mod...

  15. Construction of a nasopharyngeal carcinoma 2D/MS repository with Open Source XML database--Xindice.

    PubMed

    Li, Feng; Li, Maoyu; Xiao, Zhiqiang; Zhang, Pengfei; Li, Jianling; Chen, Zhuchu

    2006-01-11

    Many proteomics initiatives require integration of all information, with uniform criteria, from collection of samples and data display to publication of experimental results. Integrating and exchanging these data of different formats and structures poses a great challenge. XML technology holds promise for handling this task owing to its simplicity and flexibility. Nasopharyngeal carcinoma (NPC) is one of the most common cancers in southern China and Southeast Asia, with marked geographic and racial differences in incidence. Although some cancer proteome databases now exist, there is still no NPC proteome database. The raw NPC proteome experiment data were captured into one XML document with the Human Proteome Markup Language (HUP-ML) editor and imported into the native XML database Xindice. The 2D/MS repository of the NPC proteome was constructed with Apache, PHP, and Xindice to provide access to the database via the Internet. On our website, two query methods, keyword query and click query, are provided for accessing the entries of the NPC proteome database. Our 2D/MS repository can be used to share the raw NPC proteomics data generated from gel-based proteomics experiments. The database, as well as the PHP source code for constructing users' own proteome repositories, can be accessed at http://www.xyproteomics.org/.
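
    The kind of lookup a native XML database such as Xindice serves can be illustrated with the Python standard library over a toy HUP-ML-like fragment; the element names below are invented for the example, and the code does not use Xindice's own API.

    ```python
    # Illustration of the kind of query a native XML database like Xindice
    # answers, run here with Python's standard library over a toy fragment
    # (element names are invented for the example).
    import xml.etree.ElementTree as ET

    doc = ET.fromstring("""
    <proteome sample="NPC">
      <spot id="2D-0417"><protein>Keratin 8</protein><mw>53700</mw></spot>
      <spot id="2D-0533"><protein>Annexin A2</protein><mw>38600</mw></spot>
    </proteome>
    """)

    # Keyword-style query: find the spot record for a given protein name.
    for spot in doc.findall("spot"):
        if spot.findtext("protein") == "Annexin A2":
            print(spot.get("id"), spot.findtext("mw"))   # -> 2D-0533 38600
    ```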

  16. A Preliminary Performance Assessment for Salt Disposal of High-Level Nuclear Waste - 12173

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Clayton, Daniel; Jove-Colon, Carlos

    2012-07-01

    A salt repository is one of the four geologic media currently under study by the U.S. DOE Office of Nuclear Energy to support the development of a long-term strategy for geologic disposal of commercial used nuclear fuel (UNF) and high-level radioactive waste (HLW). The immediate goal of the generic salt repository study is to develop the modeling tools needed to evaluate and improve understanding of the repository system response and of the processes relevant to long-term disposal of UNF and HLW in a salt formation. The current phase of this study considers representative geologic settings and features adopted from previous studies of salt repository sites. For the reference scenario, the brine flow rates in the repository and underlying interbeds are very low, and transport of radionuclides along the transport pathways is dominated by diffusion and greatly retarded by sorption on the interbed filling materials. I-129 is the dominant annual dose contributor at the hypothetical accessible environment, but the calculated mean annual dose is negligibly small. For the human intrusion (or disturbed) scenario, the mean mass release rate and mean annual dose histories are very different from those for the reference scenario. Actinides including Pu-239, Pu-242, and Np-237 are major annual dose contributors, and the calculated peak mean annual dose is acceptably low. A performance assessment model for a generic salt repository has been developed incorporating, where applicable, representative geologic settings and features adopted from literature data for salt repository sites. The conceptual model and scenarios for radionuclide release and transport from a salt repository were developed using literature data. The salt GDS model was developed in a probabilistic analysis framework. The preliminary performance analysis for demonstration of model capability assumes an isothermal condition at ambient temperature in the near field. The capability demonstration emphasizes key attributes of a salt repository that are potentially important to the long-term safe disposal of UNF and HLW. The analysis presents and discusses results showing repository responses to different radionuclide release scenarios (undisturbed and human intrusion). For the reference (or nominal, undisturbed) scenario, the brine flow rates in the repository and underlying interbeds are very low, and transport of radionuclides along the transport pathways is dominated by diffusion and greatly retarded by sorption on the interbed filling materials. I-129 (non-sorbing, of unlimited solubility, and with a very long half-life) is the dominant annual dose contributor at the hypothetical accessible environment, but the calculated mean annual dose is so small that there is no meaningful consequence for repository performance. For the human intrusion (or disturbed) scenario analysis, the mean mass release rate and mean annual dose histories are very different from those of the reference scenario analysis. Compared to the reference scenario, the relative annual dose contributions of soluble, non-sorbing fission products, particularly I-129, are much lower than those of actinides including Pu-239, Pu-242, and Np-237. The lower relative mean annual dose contributions of the fission-product radionuclides are due to their lower total inventory available for release (i.e., up to five affected waste packages), and the higher mean annual doses from the actinides are the outcome of the direct release of these radionuclides into the overlying aquifer, which has high water flow rates, resulting in an early arrival of higher radionuclide concentrations at the biosphere drinking-water well before significant decay. The salt GDS model analysis has also identified the following recommendations and knowledge gaps to improve and enhance confidence in future repository performance analyses: repository thermal loading by UNF and HLW, and its effect on engineered barrier and near-field performance; closure and consolidation of salt rocks by creep deformation under thermal perturbation, and the effect on engineered barrier and near-field performance; brine migration and radionuclide transport under thermal perturbation in a generic salt repository environment, and the effect on engineered barrier, near-field, and far-field performance; near-field geochemistry and radionuclide mobility in a generic salt repository environment (high-ionic-strength brines, elevated temperatures, and chemically reducing conditions); degradation of engineered barrier components (waste package, waste canister, waste forms, etc.) in a generic salt repository environment; and waste stream types and inventory estimates, particularly for reprocessing high-level waste. (authors)
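
    The sorption retardation invoked here is conventionally expressed through the linear-sorption retardation factor; the relation below is the textbook form, quoted as background rather than taken from the paper, and it also shows why non-sorbing I-129 travels unretarded.

    ```latex
    % Standard linear-sorption retardation factor (textbook relation; the
    % paper does not report its specific parameter values).
    \[
      R \;=\; 1 \;+\; \frac{\rho_b\,K_d}{\theta},
      \qquad
      v_{\mathrm{retarded}} \;=\; \frac{v}{R},
    \]
    % where $\rho_b$ is the interbed bulk density, $\theta$ the porosity,
    % $K_d$ the sorption distribution coefficient, and $v$ the pore-water
    % velocity. Non-sorbing I-129 has $K_d \approx 0$, hence $R \approx 1$.
    ```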

  17. Trustworthy Digital Repositories: Building Trust the Old Fashion Way, EARNING IT.

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Chandler, C. L.; Shepherd, A.; Rauch, S.; Groman, R. C.; Wiebe, P. H.; Glover, D. M.; Allison, M. D.; Copley, N. J.; Ake, H.; York, A.

    2016-12-01

    There are several drivers increasing the importance of high quality data management and curation in today's research process (e.g., OSTP PARR memo, journal publishers, funders, academic and private institutions), and proper management is necessary throughout the data lifecycle to enable reuse and reproducibility of results. Many digital data repositories are capable of satisfying the basic management needs of an investigator looking to share their data (i.e., publish data in the public domain), but repository services vary greatly and not all provide mature services that facilitate discovery, access, and reuse of research data. Domain-specific repositories play a vital role in the data curation process by working closely with investigators to create robust metadata, perform first order QC, and assemble and publish research data. In addition, they may employ technologies and services that enable increased discovery, access, and long-term archive. However, smaller domain facilities operate in varying states of capacity and curation ability. Within this repository environment, individual investigators (driven by publishers, funders, or institutions) need to find trustworthy repositories for their data; and funders need to direct investigators to quality repositories to ensure return on their investment. So, how can one determine the best home for valuable research data? Metrics can be applied to varying aspects of data curation, and many credentialing organizations offer services that assess and certify the trustworthiness of a given data management facility. Unfortunately, many of these certifications can be inaccessible to a small repository in cost, time, or scope. Are there alternatives? This presentation will discuss methods and approaches used by the Biological and Chemical Oceanography Data Management Office (BCO-DMO; a domain-specific, intermediate digital data repository) to demonstrate trustworthiness in the face of a daunting accreditation landscape.

  18. Assessment of effectiveness of geologic isolation systems. CIRMIS data system. Volume 3. Generator routines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.; Argo, R.S.

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System, a storage and retrieval system for model input and output data, including graphical interpretation and display, is described. This is the third of four volumes of the description of the CIRMIS Data System.
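
    The dilution step that converts a groundwater concentration into a dose-model source term can be sketched as a simple mixing-cell balance; all values below are illustrative assumptions, not AEGIS data.

    ```python
    # Simple mixing-cell dilution of the kind described: a groundwater
    # plume discharging into a river becomes the dose-model source term.
    # All values are illustrative assumptions, not AEGIS data.
    c_gw = 5.0e2      # radionuclide concentration in groundwater, Bq/m^3
    q_gw = 1.0e4      # groundwater discharge into the river, m^3/yr
    q_river = 1.0e8   # river flow past the discharge zone, m^3/yr

    c_river = c_gw * q_gw / (q_gw + q_river)   # fully mixed concentration
    print(f"Diluted concentration: {c_river:.3g} Bq/m^3")  # -> ~0.05 Bq/m^3
    ```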

  19. Assessment of effectiveness of geologic isolation systems. CIRMIS data system. Volume 1. Initialization, operation, and documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.

    1980-01-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System, a storage and retrieval system for model input and output data, including graphical interpretation and display, is described. This is the first of four volumes of the description of the CIRMIS Data System.

  20. Geologic, geochemical rock mechanics and hydrologic characteristics of candidate repository horizons

    NASA Astrophysics Data System (ADS)

    Long, P. E.; Apted, M. J.; Spane, F. A., Jr.; Kim, K.

    1982-09-01

    The feasibility of constructing a nuclear waste repository in basalt (NWRB) on the Hanford Site is assessed. Studies conducted to date indicate that feasibility and performance requirements can be met with a significant safety margin. The two most promising candidate repository horizons for an NWRB are the middle Sentinel Bluffs and the Umtanum flows. Both of these flows are laterally continuous and have thicknesses of competent rock adequate to accommodate a repository. Significant geologic differences between the two flows are their depth, total thickness, and variability of flow-top thickness. These differences are considered in the selection of one of the two flows for breakout from an exploratory shaft. The geochemical characteristics of both the middle Sentinel Bluffs flow and the Umtanum flow favor long-term isolation of radionuclides by providing an environment in which canister corrosion rates and the solubilities of many radionuclide-bearing solids are relatively low.

  1. Proposed BioRepository platform solution for the ALS research community.

    PubMed

    Sherman, Alex; Bowser, Robert; Grasso, Daniela; Power, Breen; Milligan, Carol; Jaffa, Matthew; Cudkowicz, Merit

    2011-01-01

    ALS is a rare disorder whose cause and pathogenesis are largely unknown (1). There is a recognized need to develop biomarkers for ALS to better understand the disease, expedite diagnosis, and facilitate therapy development. Collaboration is essential to obtain a sufficient number of samples to allow statistically meaningful studies. The availability of high-quality biological specimens for research purposes requires the development of standardized methods for collection, long-term storage, retrieval, and distribution of specimens. The value of biological samples to scientists and clinicians correlates with the completeness and relevance of the phenotypical and clinical information associated with the samples (2, 3). While developing a secure Web-based system to manage an inventory of multi-site BioRepositories, algorithms were implemented to facilitate ad hoc parametric searches across heterogeneous data sources containing data from clinical trials and research studies. A flexible schema for a barcode label was introduced to associate samples with these data. The ALSBank™ BioRepository platform solution for managing biological samples and associated data is currently deployed by the Northeast ALS Consortium (NEALS). The NEALS Consortium and the Massachusetts General Hospital (MGH) Neurology Clinical Trials Unit (NCTU) support a network of multiple BioBanks, allowing researchers to take advantage of a larger specimen collection than they might have at an individual institution. Standard operating procedures are utilized at all collection sites to promote common practices for biological sample integrity, quality control, and associated clinical data. Utilizing this platform, we have created one of the largest virtual collections of ALS-related specimens available to investigators studying ALS.
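
    A flexible barcode label schema of the kind described might associate a specimen with its site, study, and subject as follows; the field layout and encoding are assumptions for illustration, not the ALSBank format.

    ```python
    # Hypothetical sketch of a flexible barcode label schema linking a
    # specimen to its study and clinical-data record (field layout is an
    # assumption for illustration, not the ALSBank format).
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class BarcodeLabel:
        site: str        # collection site code
        study: str       # study or trial identifier
        subject: str     # de-identified participant ID
        aliquot: int     # aliquot number within the draw

        def encode(self) -> str:
            return f"{self.site}-{self.study}-{self.subject}-{self.aliquot:03d}"

    label = BarcodeLabel("MGH", "NEALS017", "S0042", 3)
    print(label.encode())   # -> MGH-NEALS017-S0042-003
    ```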

  2. Closing gaps between open software and public data in a hackathon setting: User-centered software prototyping.

    PubMed

    Busby, Ben; Lesko, Matthew; Federer, Lisa

    2016-01-01

    In genomics, bioinformatics and other areas of data science, gaps exist between extant public datasets and the open-source software tools built by the community to analyze similar data types.  The purpose of biological data science hackathons is to assemble groups of genomics or bioinformatics professionals and software developers to rapidly prototype software to address these gaps.  The only two rules for the NCBI-assisted hackathons run so far are that 1) data either must be housed in public data repositories or be deposited to such repositories shortly after the hackathon's conclusion, and 2) all software comprising the final pipeline must be open-source or open-use.  Proposed topics, as well as suggested tools and approaches, are distributed to participants at the beginning of each hackathon and refined during the event.  Software, scripts, and pipelines are developed and published on GitHub, a web service providing publicly available, free-usage tiers for collaborative software development. The code resulting from each hackathon is published at https://github.com/NCBI-Hackathons/ with separate directories or repositories for each team.

  3. Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources

    PubMed Central

    Wu, Tai-luan; Tseng, Ling-li

    2017-01-01

    This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001–2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system. PMID:29267327
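
    The comparison and elimination steps can be pictured as set operations over normalized match keys; the sketch below uses toy records and a deliberately simplified key, not the study's actual matching program.

    ```python
    # Sketch of the overlap analysis: normalize bibliographic records to a
    # comparison key, then intersect per-system sets. Records are toy data.
    def key(record: dict) -> tuple:
        """Normalized match key (a simplification of real deduplication)."""
        return (record["title"].lower().strip(), record["year"])

    google_scholar = {key(r) for r in [{"title": "On Quantum Optics", "year": 2005}]}
    arxiv = {key(r) for r in [{"title": "on quantum optics ", "year": 2005},
                              {"title": "Graphene Transport", "year": 2010}]}

    overlap = google_scholar & arxiv
    coverage = len(arxiv) / len(google_scholar | arxiv)
    print(f"overlap={len(overlap)}, arXiv coverage={coverage:.0%}")
    ```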

  4. Audit and Certification Process for Science Data Digital Repositories

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.

    2011-12-01

    Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies, and procedures required to accomplish effective data preservation. Subsequently, a repository should be evaluated on whether or not it is effective in its data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified, and the relationship of these standards to the Open Archival Information System (OAIS) reference model will be shown. Six test audits have been conducted, with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.

  5. Thermal Analysis of a Nuclear Waste Repository in Argillite Host Rock

    NASA Astrophysics Data System (ADS)

    Hadgu, T.; Gomez, S. P.; Matteo, E. N.

    2017-12-01

    Disposal of high-level nuclear waste in a geological repository requires analysis of the heat distribution produced by decay heat. Such an analysis supports the design of the repository layout, defining the repository footprint, and provides information of importance to the overall design. The analysis is also used in studying potential migration of radionuclides to the accessible environment. In this study, a thermal analysis for high-level waste and spent nuclear fuel in a generic repository in argillite host rock is presented. The analysis used both semi-analytical and numerical modeling of the near field of a repository. The semi-analytical method treats heat transport by conduction in the repository and its surroundings; its results are temperature histories at selected radial distances from the waste package. A 3-D thermal-hydrologic numerical model was also used to study fluid and heat distribution in the near field. The thermal analysis assumed a generic geological repository at 500 m depth. For the semi-analytical method, a backfilled closed repository was assumed with basic design and material properties. For the thermal-hydrologic numerical method, a repository layout with disposal in horizontal boreholes was assumed. The 3-D modeling domain covers a limited portion of the repository footprint to enable a detailed thermal analysis. A highly refined unstructured mesh was used, with increased discretization near heat sources and at intersections of different materials. All simulations considered different parameter values for the components of the engineered barrier system (i.e., buffer, disturbed rock zone, and host rock) and different surface storage times. Results of the different modeling cases are presented, including temperature and fluid flow profiles in the near field at different simulation times. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND2017-8295 A.
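
    A common semi-analytical screen of this type is the textbook continuous line-source conduction solution; the sketch below evaluates it with assumed, illustrative parameters that are not the study's repository design values.

    ```python
    # Textbook continuous line-source conduction solution of the kind used
    # in semi-analytical near-field screens (parameter values are assumed
    # for illustration, not this study's repository design).
    import numpy as np
    from scipy.special import exp1  # exponential integral E1

    k = 2.0        # host-rock thermal conductivity, W/(m K)
    rho_c = 2.2e6  # volumetric heat capacity, J/(m^3 K)
    alpha = k / rho_c          # thermal diffusivity, m^2/s
    q_line = 150.0             # heat output per metre of emplacement, W/m

    years = np.array([1.0, 10.0, 100.0]) * 3.1557e7   # seconds
    for r in (1.0, 5.0):       # radial distances from the source, m
        dT = q_line / (4 * np.pi * k) * exp1(r**2 / (4 * alpha * years))
        print(f"r = {r} m:", np.round(dT, 1), "K rise at 1/10/100 yr")
    ```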

  6. Evaluation of Used Fuel Disposition in Clay-Bearing Rock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jové Colón, Carlos F.; Weck, Philippe F.; Sassani, David H.

    2014-08-01

    Radioactive waste disposal in shale/argillite rock formations has been widely considered given the rock's desirable isolation properties (low permeability), geochemically reduced conditions, anomalous groundwater pressures, and widespread geologic occurrence. Clay/shale rock formations are characterized by their high content of clay minerals such as smectites and illites, in which diffusive transport and chemisorption phenomena predominate. These attributes, in addition to low permeability, are key to shale's ability to impede radionuclide mobility. Shale host media have been comprehensively studied in international nuclear waste repository programs as part of underground research laboratory (URL) programs in Switzerland, France, Belgium, and Japan. These investigations, in some cases a decade or more long, have produced a large and fundamental body of information spanning from site characterization data (geological, hydrogeological, geochemical, geomechanical) to controlled experiments on the engineered barrier system (EBS) (barrier clay and seal materials). Evaluation of nuclear waste disposal in shale formations in the USA was conducted in the late 1970s and mid-1980s. Most of these studies evaluated the potential for shale to host a nuclear waste repository, but not at the programmatic level of the URLs in international repository programs. This report covers various R&D work and capabilities relevant to disposal of heat-generating nuclear waste in shale/argillite media. Integration and cross-fertilization of these capabilities will be utilized in the development and implementation of the shale/argillite reference case planned for FY15. Disposal R&D activities under the UFDC in the past few years have produced state-of-the-art modeling capabilities for coupled thermal-hydrological-mechanical-chemical (THMC) processes, used fuel degradation (source term), and thermodynamic modeling and database development to evaluate generic disposal concepts. The THMC models have been developed for a shale repository, leveraging in large part the information garnered in URLs and laboratory data, to test and demonstrate model prediction capability and to accurately represent the behavior of the EBS and the natural (barrier) system (NS). In addition, experimental work to improve our understanding of clay barrier interactions and TM couplings at high temperatures is key to evaluating thermal effects resulting from relatively high heat loads from waste and the extent of sacrificial zones in the EBS. To assess the latter, experiments and modeling approaches have provided important information on the stability and fate of barrier materials under high heat loads. This information is central to the assessment of thermal limits and to the implementation of the reference case when constraining EBS properties and the repository layout (e.g., waste package and drift spacing). This report comprises several parts, each describing R&D activities applicable to shale/argillite media, for example progress on modeling and experimental approaches to analyze physical and chemical interactions affecting clay in the EBS, the NS, and used nuclear fuel (source term) in support of R&D objectives, as well as the development of a reference case for shale/argillite media. The accomplishments of these activities are summarized as follows: development of a reference case for shale/argillite; investigation of reactive transport and coupled THM processes in the EBS for FY14; an update on experimental activities on buffer/backfill interactions at elevated pressure and temperature; thermodynamic database development (evaluation strategy, modeling tools, first-principles modeling of clay, and sorption database assessment); and the ANL mixed potential model for used fuel degradation, applied to argillite and crystalline rock environments.

  7. Using FLOSS Project Metadata in the Undergraduate Classroom

    NASA Astrophysics Data System (ADS)

    Squire, Megan; Duvall, Shannon

    This paper describes our efforts to use the large amounts of data available from public repositories of free, libre, and open source software (FLOSS) in our undergraduate classrooms to teach concepts that would have previously been taught using other types of data from other sources.

  8. CIRMIS Data system. Volume 2. Program listings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.

    1980-01-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the second of four volumes of the description of the CIRMIS Data System.

  9. Assessment of effectiveness of geologic isolation systems. CIRMIS data system. Volume 4. Driller's logs, stratigraphic cross section and utility routines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.

    1980-01-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the fourth of four volumes of the description of the CIRMIS Data System.

  10. Inter-rater reliability and review of the VA unresolved narratives.

    PubMed Central

    Eagon, J. C.; Hurdle, J. F.; Lincoln, M. J.

    1996-01-01

    To better understand how VA clinicians use medical vocabulary in everyday practice, we set out to characterize terms generated in the Problem List module of the VA's DHCP system that were not mapped to terms in the controlled-vocabulary lexicon of DHCP. When entered terms fail to match those in the lexicon, a note is sent to a central repository. When our study started, the volume in that repository had reached 16,783 terms. We wished to characterize the potential reasons why these terms failed to match terms in the lexicon. After examining two small samples of randomly selected terms, we used group consensus to develop a set of rating criteria and a rating form. To be sure that the results of multiple reviewers could be confidently compared, we analyzed the inter-rater agreement of our rating process. Two raters used this form to rate the same 400 terms. We found that modifiers and numeric data were common and consistent reasons for failure to match, while others, such as use of synonyms and absence of the concept from the lexicon, were common but less consistently selected. PMID:8947642

  11. Inter-rater reliability and review of the VA unresolved narratives.

    PubMed

    Eagon, J C; Hurdle, J F; Lincoln, M J

    1996-01-01

    To better understand how VA clinicians use medical vocabulary in everyday practice, we set out to characterize terms generated in the Problem List module of the VA's DHCP system that were not mapped to terms in the controlled-vocabulary lexicon of DHCP. When entered terms fail to match those in the lexicon, a note is sent to a central repository. When our study started, the volume in that repository had reached 16,783 terms. We wished to characterize the potential reasons why these terms failed to match terms in the lexicon. After examining two small samples of randomly selected terms, we used group consensus to develop a set of rating criteria and a rating form. To be sure that the results of multiple reviewers could be confidently compared, we analyzed the inter-rater agreement of our rating process. Two raters used this form to rate the same 400 terms. We found that modifiers and numeric data were common and consistent reasons for failure to match, while others such as use of synonyms and absence of the concept from the lexicon were common but less consistently selected.
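
    Inter-rater agreement of the kind analyzed in this study is commonly summarized with Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch; the ratings below are hypothetical, using the study's failure categories as labels.

        from collections import Counter

        def cohens_kappa(labels_a, labels_b):
            """Cohen's kappa for two raters assigning one category per item."""
            assert len(labels_a) == len(labels_b)
            n = len(labels_a)
            observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
            freq_a, freq_b = Counter(labels_a), Counter(labels_b)
            cats = set(freq_a) | set(freq_b)
            expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in cats)
            return (observed - expected) / (1 - expected)

        # Hypothetical ratings for a handful of unmatched terms:
        rater1 = ["modifier", "numeric", "synonym", "modifier", "absent"]
        rater2 = ["modifier", "numeric", "absent",  "modifier", "absent"]
        print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")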

  12. Simulation of fluid flow and energy transport processes associated with high-level radioactive waste disposal in unsaturated alluvium

    USGS Publications Warehouse

    Pollock, David W.

    1986-01-01

    Many parts of the Great Basin have thick zones of unsaturated alluvium which might be suitable for disposing of high-level radioactive wastes. A mathematical model accounting for the coupled transport of energy, water (vapor and liquid), and dry air was used to analyze one-dimensional, vertical transport above and below an areally extensive repository. Numerical simulations were conducted for a hypothetical repository containing spent nuclear fuel and located 100 m below land surface. Initial steady-state downward water fluxes of zero (hydrostatic) and 0.0003 m yr⁻¹ were considered in an attempt to bracket the likely range in natural water flux. Predicted temperatures within the repository peaked after approximately 50 years and declined slowly thereafter in response to the decreasing intensity of the radioactive heat source. The alluvium near the repository experienced a cycle of drying and rewetting in both cases. The extent of the dry zone was strongly controlled by the mobility of liquid water near the repository under natural conditions. In the case of initial hydrostatic conditions, the dry zone extended approximately 10 m above and 15 m below the repository. For the case of a natural flux of 0.0003 m yr⁻¹, the relative permeability of water near the repository was initially more than 30 times the value under hydrostatic conditions; consequently, the dry zone extended only about 2 m above and 5 m below the repository. In both cases a significant perturbation in liquid saturation levels persisted for several hundred years. This analysis illustrates the extreme sensitivity of model predictions to initial conditions and parameters, such as relative permeability and moisture characteristic curves, that are often poorly known.

  13. Evolution in Metadata Quality: Common Metadata Repository's Role in NASA Curation Efforts

    NASA Technical Reports Server (NTRS)

    Gilman, Jason; Shum, Dana; Baynes, Katie

    2016-01-01

    Metadata Quality is one of the chief drivers of discovery and use of NASA EOSDIS (Earth Observing System Data and Information System) data. Issues with metadata such as lack of completeness, inconsistency, and use of legacy terms directly hinder data use. As the central metadata repository for NASA Earth Science data, the Common Metadata Repository (CMR) has a responsibility to its users to ensure the quality of CMR search results. This poster covers how we use humanizers, a technique for dealing with the symptoms of metadata issues, as well as our plans for future metadata validation enhancements. The CMR currently indexes 35K collections and 300M granules.
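
    A humanizer, as used here, amounts to a rule-based substitution applied at index time so that legacy or inconsistently written terms resolve to preferred forms in search. A minimal sketch of the idea; the rules and field names below are hypothetical illustrations, not actual CMR configuration.

        # Minimal sketch of a "humanizer" pass: rule-based substitutions applied
        # to collection metadata at index time so legacy terms resolve to
        # preferred forms in search results. Rules below are hypothetical.

        HUMANIZERS = [
            {"field": "platform",   "match": "AQUA",       "replace": "Aqua"},
            {"field": "instrument", "match": "MODIS/Aqua", "replace": "MODIS"},
        ]

        def humanize(metadata: dict) -> dict:
            """Return a copy of a collection record with humanizer rules applied."""
            fixed = dict(metadata)
            for rule in HUMANIZERS:
                if fixed.get(rule["field"]) == rule["match"]:
                    fixed[rule["field"]] = rule["replace"]
            return fixed

        record = {"platform": "AQUA", "instrument": "MODIS/Aqua"}
        print(humanize(record))   # {'platform': 'Aqua', 'instrument': 'MODIS'}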

  14. Childhood vesicoureteral reflux studies: registries and repositories sources and nosology.

    PubMed

    Chesney, Russell W; Patters, Andrea B

    2013-12-01

    Despite several recent studies, the advisability of antimicrobial prophylaxis and certain imaging studies for urinary tract infections (UTIs) remains controversial. The role of vesicoureteral reflux (VUR) on the severity and re-infection rates for UTIs is also difficult to assess. Registries and repositories of data and biomaterials from clinical studies in children with VUR are valuable. Disease registries are collections of secondary data related to patients with a specific diagnosis, condition or procedure. Registries differ from indices in that they contain more extensive data. A research repository is an entity that receives, stores, processes and/or disseminates specimens (or other materials) as needed. It encompasses the physical location as well as the full range of activities associated with its operation. It may also be referred to as a biorepository. This report provides information about some current registries and repositories that include data and samples from children with VUR. It also describes the heterogeneous nature of the subjects, as some registries and repositories include only data or samples from patients with primary reflux while others also include those from patients with syndromic or secondary reflux. Copyright © 2012 Journal of Pediatric Urology Company. All rights reserved.

  15. Industrial Program of Waste Management - Cigeo Project - 13033

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butez, Marc; Bartagnon, Olivier; Gagner, Laurent

    2013-07-01

    The French Planning Act of 28 June 2006 prescribed that a reversible repository in a deep geological formation be chosen as the reference solution for the long-term management of high-level and intermediate-level long-lived radioactive waste. It also entrusted the responsibility for further studies and design of the repository (named Cigeo) to the French Radioactive Waste Management Agency (Andra), in order for the review of the creation-license application to start in 2015 and, subject to its approval, the commissioning of the repository to take place in 2025. Andra is responsible for siting, designing, implementing, and operating the future geological repository, including operational and long-term safety and waste acceptance. The nuclear operators (Electricite de France (EDF), AREVA NC, and the French Commission in charge of Atomic Energy and Alternative Energies (CEA)) are technically and financially responsible for the waste they generate, with no limit in time. They provide Andra, on the one hand, with waste-package-related input data and, on the other hand, with their long-term industrial experience of high- and intermediate-level long-lived radwaste management and nuclear operation. Andra, EDF, AREVA and CEA established a cooperation agreement for strengthening their collaborations in these fields. Within this agreement Andra and the nuclear operators have defined an industrial program for waste management. This program includes the waste inventory to be taken into account for the design of the Cigeo project and the structural hypothesis underlying its phased development. It schedules the delivery of the different categories of waste and defines associated flows. (authors)

  16. A scoping review of online repositories of quality improvement projects, interventions and initiatives in healthcare.

    PubMed

    Bytautas, Jessica P; Gheihman, Galina; Dobrow, Mark J

    2017-04-01

    Quality improvement (QI) is becoming an important focal point for health systems. There is increasing interest among health system stakeholders to learn from and share experiences on the use of QI methods and approaches in their work. Yet there are few easily accessible, online repositories dedicated to documenting QI activity. We conducted a scoping review of publicly available, web-based QI repositories to (i) identify current approaches to sharing information on QI practices; (ii) categorise these approaches based on hosting, scope and size, content acquisition and eligibility, content format and search, and evaluation and engagement characteristics; and (iii) review evaluations of the design, usefulness and impact of their online QI practice repositories. The search strategy consisted of traditional database and grey literature searches, as well as expert consultation, with the ultimate aim of identifying and describing QI repositories of practices undertaken in a healthcare context. We identified 13 QI repositories and found substantial variation across the five categories. The QI repositories used different terminology (eg, practices vs case studies) and approaches to content acquisition, and varied in terms of primary areas of focus. All provided some means for organising content according to categories or themes and most provided at least rudimentary keyword search functionality. Notably, none of the QI repositories included evaluations of their impact. With growing interest in sharing and spreading best practices and increasing reliance on QI as a key contributor to health system performance, the role of QI repositories is likely to expand. Designing future QI repositories based on knowledge of the range and type of features available is an important starting point for improving their usefulness and impact. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  17. BioPortal: An Open-Source Community-Based Ontology Repository

    NASA Astrophysics Data System (ADS)

    Noy, N.; NCBO Team

    2011-12-01

    Advances in computing power and new computational techniques have changed the way researchers approach science. In many fields, one of the most fruitful approaches has been to use semantically aware software to break down the barriers among disparate domains, systems, data sources, and technologies. Such software facilitates data aggregation, improves search, and ultimately allows the detection of new associations that were previously not detectable. Achieving these analyses requires software systems that take advantage of the semantics and that can intelligently negotiate domains and knowledge sources, identifying commonality across systems that use different and conflicting vocabularies, while understanding apparent differences that may be concealed by the use of superficially similar terms. An ontology, a semantically rich vocabulary for a domain of interest, is the cornerstone of software for bridging systems, domains, and resources. However, as ontologies become the foundation of all semantic technologies in e-science, we must develop an infrastructure for sharing ontologies, finding and evaluating them, integrating and mapping among them, and using ontologies in applications that help scientists process their data. BioPortal [1] is an open-source, online, community-based ontology repository that has been used as a critical component of semantic infrastructure in several domains, including biomedicine and bio-geochemical data. BioPortal uses social approaches in the Web 2.0 style to bring structure and order to the collection of biomedical ontologies. It enables users to provide and discuss a wide array of knowledge components, from submitting the ontologies themselves, to commenting on and discussing classes in the ontologies, to reviewing ontologies in the context of their own ontology-based projects, to creating mappings between overlapping ontologies and discussing and critiquing the mappings. Critically, it provides web-service access to all its content, enabling its integration in semantically enriched applications. [1] Noy, N.F., Shah, N.H., et al., BioPortal: ontologies and integrated data resources at the click of a mouse. Nucleic Acids Res, 2009. 37(Web Server issue): p. W170-3.
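
    The web-service access mentioned above can be exercised in a few lines. A hedged sketch against BioPortal's public REST API at data.bioontology.org; the endpoint, query parameter, and response fields follow the public documentation at the time of writing and should be treated as assumptions, and you must supply your own API key.

        # Hedged sketch of programmatic access to BioPortal content via its REST
        # API (data.bioontology.org). Endpoint and response-field names are
        # assumptions based on the public documentation; API key is a placeholder.
        import json
        import urllib.request

        API_KEY = "YOUR_BIOPORTAL_API_KEY"   # placeholder
        url = ("https://data.bioontology.org/search?q=melanoma"
               f"&apikey={API_KEY}")

        with urllib.request.urlopen(url) as resp:
            results = json.load(resp)

        for cls in results.get("collection", [])[:5]:
            print(cls.get("prefLabel"), "-", cls.get("@id"))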

  18. Trends in the Evolution of the Public Web, 1998-2002; The Fedora Project: An Open-source Digital Object Repository Management System; State of the Dublin Core Metadata Initiative, April 2003; Preservation Metadata; How Many People Search the ERIC Database Each Day?

    ERIC Educational Resources Information Center

    O'Neill, Edward T.; Lavoie, Brian F.; Bennett, Rick; Staples, Thornton; Wayland, Ross; Payette, Sandra; Dekkers, Makx; Weibel, Stuart; Searle, Sam; Thompson, Dave; Rudner, Lawrence M.

    2003-01-01

    Includes five articles that examine key trends in the development of the public Web: size and growth, internationalization, and metadata usage; Flexible Extensible Digital Object and Repository Architecture (Fedora) for use in digital libraries; developments in the Dublin Core Metadata Initiative (DCMI); the National Library of New Zealand Te Puna…

  19. DSA-WDS Common Requirements: Developing a New Core Data Repository Certification

    NASA Astrophysics Data System (ADS)

    Minster, J. B. H.; Edmunds, R.; L'Hours, H.; Mokrane, M.; Rickards, L.

    2016-12-01

    The Data Seal of Approval (DSA) and the International Council for Science - World Data System (ICSU-WDS) have both developed minimally intensive core certification standards whereby digital repositories supply evidence that they are trustworthy and have a long-term outlook. Both DSA and WDS applicants have found core certification to be beneficial: building stakeholder confidence, enhancing the repository's reputation, and demonstrating that it is following good practices; as well as stimulating the repository to focus on processes and procedures, thereby achieving ever higher levels of professionalism over time. The DSA and WDS core certifications evolved independently, initially serving different communities, but both initiatives are multidisciplinary with catalogues of criteria and review procedures based on the same principles. Hence, to realize efficiencies, simplify assessment options, stimulate more certifications, and increase impact on the community, the Repository Audit and Certification DSA-WDS Partnership Working Group (WG) was established under the umbrella of the Research Data Alliance (RDA). The WG conducted a side-by-side analysis of both frameworks to unify the wording and criteria, ultimately leading to a harmonized Catalogue of Common Requirements for core certification of repositories, as well as a set of Common Procedures for their assessment. This presentation will focus on the collaborative effort by DSA and WDS to establish (1) a testbed comprising DSA and WDS certified data repositories to validate both the new Catalogue and Procedures, and (2) a joint Certification Board towards their practical implementation. We will describe: • The purpose and methodology of the testbed, including selection of repositories to be assessed against the common standard. • The results of the testbed, with an in-depth look at some of the comments received and issues highlighted. • General insights gained from evaluating the testbed results, the subsequent changes to the Common Requirements and Procedures, and an assessment of the success of these enhancements. • Steps by the two organizations to integrate the Common Certification into their tools and systems, in particular the creation of Terms of Reference for the nascent DSA-WDS Certification Board.

  20. A Comprehensive Repository of Normal and Tumor Human Breast Tissues and Cells

    DTIC Science & Technology

    1999-07-01

    …mother was reported to have had cancer of the uterine cervix at the age of 22. Both maternal grandparents had died of colon cancer in their sixties…1 mutation). The repository also includes breast epithelial and stromal cell strains derived from non-cancerous breast tissue as well as peripheral…tissue banks.

  1. The magnitude and relevance of the February 2014 radiation release from the Waste Isolation Pilot Plant repository in New Mexico, USA.

    PubMed

    Thakur, P; Lemons, B G; White, C R

    2016-09-15

    After almost fifteen years of successful waste disposal operations, the first unambiguous airborne radiation release from the Waste Isolation Pilot Plant (WIPP) was detected beyond the site boundary on February 14, 2014. It was the first accident of its kind in the 15-year operating history of the WIPP. The accident released moderate levels of radioactivity into the underground air. A small but measurable amount of radioactivity also escaped to the surface through the ventilation system and was detected above ground. The dominant radionuclides released were americium and plutonium, in a ratio consistent with the known content of a breached drum. The radiation release was caused by a runaway chemical reaction inside a transuranic (TRU) waste drum which experienced a seal and lid failure, spewing radioactive materials into the repository. According to source-term estimation, approximately 2 to 10 Ci of radioactivity was released from the breached drum into the underground, and an undetermined fraction of that source term became airborne, setting off an alarm and triggering the closure of seals designed to force exhaust air through a system of filters including high-efficiency particulate air (HEPA) filters. Air monitoring across the WIPP site intensified following the first reports of radiation detection underground to determine the extent of impact to WIPP personnel, the public, and the environment, if any. This article attempts to compile and interpret analytical data collected by an independent monitoring program conducted by the Carlsbad Environmental Monitoring & Research Center (CEMRC) and by a compliance-monitoring program conducted by the WIPP's management and operating contractor, the Nuclear Waste Partnership (NWP), LLC, in response to the accident. Both the independent and the WIPP monitoring efforts concluded that the levels detected were very low and localized, and no radiation-related health effects among local workers or the public would be expected. Published by Elsevier B.V.
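
    Source-term estimates of this kind are conventionally expressed with the five-factor formula of DOE-HDBK-3010, ST = MAR × DR × ARF × RF × LPF. The sketch below illustrates the arithmetic only; every parameter value is a hypothetical placeholder, not an actual WIPP accident figure.

        # Illustrative five-factor source-term estimate in the style of
        # DOE-HDBK-3010 (ST = MAR * DR * ARF * RF * LPF). All numbers below are
        # hypothetical placeholders, not the actual WIPP accident values.

        MAR_CI = 10.0    # material at risk in the breached drum (Ci), assumed
        DR     = 1.0     # damage ratio: fraction of MAR affected by the event
        ARF    = 1e-3    # airborne release fraction, assumed
        RF     = 0.1     # respirable fraction of airborne material, assumed
        LPF    = 1e-4    # leak path factor through HEPA filtration, assumed

        released_airborne_ci = MAR_CI * DR * ARF * RF * LPF
        print(f"Respirable activity escaping to the surface: {released_airborne_ci:.2e} Ci")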

  2. SPECIATE 4.3: Addendum to SPECIATE 4.2--Speciation database development documentation

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency's (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles of air pollution sources. Among the many uses of speciation data, these source profiles are used to: (1) create speciated emissions inve...

  3. ACToR Chemical Structure processing using Open Source ChemInformatics Libraries (FutureToxII)

    EPA Science Inventory

    ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from ove...

  4. Electrical Resistance Tomography to Monitor Mitigation of Metal-Toxic Acid-Leachates Ruby Gulch Waste Rock Repository Gilt Edge Mine Superfund Site, South Dakota USA

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Heath, G.; Richardson, A.; Paul, D.; Wangerud, K.

    2003-12-01

    At a cyanide heap-leach open-pit mine, 15 million cubic yards of acid-generating sulfides were dumped at the head of a steep-walled mountain valley, where 30 inches/year of precipitation generates 60 gallons/minute of ARD leachate. Remediation has reshaped the dump to a 70-acre, 3.5:1-sloped geometry, installed drainage benches and runoff diversions, and capped the repository and lined diversions with a polyethylene geomembrane and cover system. Monitoring was needed to evaluate (a) long-term geomembrane integrity, (b) diversion liner integrity and long-term effectiveness, (c) ARD geochemistry, kinetics and pore-gas dynamics within the repository mass, and (d) groundwater interactions. Observation wells were paired with a 600-electrode resistivity survey system. Using near-surface and down-hole electrodes and automated data collection and post-processing, periodic two- and three-dimensional resistivity images are developed to reflect current and changed conditions in moisture, temperature, geochemical components, and flow-direction analysis. Examination of total resistivity values and time variances between images allows direct observation of liner and cap integrity with precise identification and location of leaks; likewise, if runoff migrates from degraded diversion ditches into the repository zone, there is an accompanying and noticeable change in resistivity values. Used in combination with monitoring wells containing borehole resistivity electrodes (calibrated with direct sampling of dump water/moisture, temperature and pore-gas composition), the resistivity arrays allow at-depth imaging of geochemical conditions within the repository mass. The information provides early indications of progress or deficiencies in de-watering and ARD mitigation, which is the intent of the remedy. If emerging technologies present opportunities for secondary treatment, deep resistivity images may assist in developing application methods and evaluating the effectiveness of any reagents introduced into the repository mass to further effect changes in oxidation/reduction reactions.
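
    The leak-detection logic described, examining time variances between periodic resistivity images, reduces to differencing inverted resistivity grids and flagging strong decreases. A minimal sketch; the grids and threshold below are hypothetical, not survey data from the site.

        import numpy as np

        # Minimal sketch of time-lapse change detection between two inverted
        # resistivity images (baseline vs. repeat survey). Grids and threshold
        # are hypothetical.

        rho_baseline = np.array([[120., 118., 121.],
                                 [119., 117., 120.],
                                 [121., 119., 118.]])   # ohm-m
        rho_repeat   = np.array([[119., 117., 120.],
                                 [ 80., 116., 119.],    # localized drop: wetting/leak?
                                 [120., 118., 117.]])

        pct_change = 100.0 * (rho_repeat - rho_baseline) / rho_baseline

        # A strong resistivity decrease suggests moisture ingress beneath the cap.
        THRESHOLD = -10.0   # percent, assumed
        leak_cells = np.argwhere(pct_change < THRESHOLD)
        print("cells flagged for possible leakage:", leak_cells.tolist())
        print(np.round(pct_change, 1))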

  5. Mont Terri Underground Rock Laboratory, Switzerland-Research Program And Key Results

    NASA Astrophysics Data System (ADS)

    Nussbaum, C. O.; Bossart, P. J.

    2012-12-01

    Argillaceous formations generally act as aquitards because of their low hydraulic conductivities. This property, together with the large retention capacity of clays for cationic contaminants and the potential for self-sealing, has brought clay formations into focus as potential host rocks for the geological disposal of radioactive waste. Excavated in the Opalinus Clay formation, the Mont Terri underground rock laboratory in the Jura Mountains of NW Switzerland is an important international test site for researching clay formations. Research is carried out in the underground facility, which is located adjacent to the security gallery of the Mont Terri motorway tunnel. Fifteen partners from European countries, USA, Canada and Japan participate in the project. The objectives of the research program are to analyze the hydrogeological, geochemical and rock mechanical properties of the Opalinus Clay, to determine the changes induced by the excavation of galleries and by heating of the rock formation, to test sealing and container emplacement techniques and to evaluate and improve suitable investigation techniques. For the safety of deep geological disposal, it is of key importance to understand the processes occurring in the undisturbed argillaceous environment, as well as the processes in a disturbed system, during the operation of the repository. The objectives are related to: 1. Understanding processes and mechanisms in undisturbed clays and 2. Experiments related to repository-induced perturbations. Experiments of the first group are dedicated to: i) Improvement of drilling and excavation technologies and sampling methods; ii) Estimation of hydrogeological, rock mechanical and geochemical parameters of the undisturbed Opalinus Clay. Upscaling of parameters from laboratory to in situ scale; iii) Geochemistry of porewater and natural gases; evolution of porewater over time scales; iv) Assessment of long-term hydraulic transients associated with erosion and thermal scenarios and v) Evaluation of diffusion and retention parameters for long-lived radionuclides. Experiments related to repository-induced perturbations are focused on: i) Influence of rock liner on the disposal system and the buffering potential of the host rock; ii) Self-sealing processes in the excavation damaged zone; iii) Hydro-mechanical coupled processes (e.g. stress redistributions and pore pressure evolution during excavation); iv) Thermo-hydro-mechanical-chemical coupled processes (e.g. heating of bentonite and host rock) and v) Gas-induced transport of radionuclides in porewater and along interfaces in the engineered barrier system. A third research direction is to demonstrate the feasibility of repository construction and long-term safety after repository closure. Demonstration experiments can contribute to improving the reliability of the scientific basis for the safety assessment of future geological repositories, particularly if they are performed on a large scale and with a long duration. These experiments include the construction and installation of engineered barriers on a 1:1 scale: i) Horizontal emplacement of canisters; ii) Evaluation of the corrosion of container materials; repository re-saturation; iii) Sealing of boreholes and repository access tunnels and iv) Long-term monitoring of the repository. References Bossart, P. & Thury, M. (2008): Mont Terri Rock Laboratory. Project, Programme 1996 to 2007 and Results. - Rep. Swiss Geol. Surv. 3.

  6. EPA’s SPECIATE 4.4 Database: Bridging Data Sources and Data Users

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency's (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles for air pollution sources. EPA released SPECIATE 4.4 in early 2014 and, in total, the SPECIATE 4.4 database includes 5,728 PM, VOC, total...

  7. An algorithm to detect and communicate the differences in computational models describing biological systems.

    PubMed

    Scharm, Martin; Wolkenhauer, Olaf; Waltemath, Dagmar

    2016-02-15

    Repositories support the reuse of models and ensure transparency about results in publications linked to those models. With thousands of models available in repositories, such as the BioModels database or the Physiome Model Repository, a framework to track the differences between models and their versions is essential to compare and combine models. Difference detection not only allows users to study the history of models but also helps in the detection of errors and inconsistencies. Existing repositories lack algorithms to track a model's development over time. Focusing on SBML and CellML, we present an algorithm to accurately detect and describe differences between coexisting versions of a model with respect to (i) the models' encoding, (ii) the structure of biological networks and (iii) mathematical expressions. This algorithm is implemented in a comprehensive and open source library called BiVeS. BiVeS helps to identify and characterize changes in computational models and thereby contributes to the documentation of a model's history. Our work facilitates the reuse and extension of existing models and supports collaborative modelling. Finally, it contributes to better reproducibility of modelling results and helps address the challenge of model provenance. The workflow described in this article is implemented in BiVeS. BiVeS is freely available as source code and binary from sems.uni-rostock.de. The web interface BudHat demonstrates the capabilities of BiVeS at budhat.sems.uni-rostock.de. © The Author 2015. Published by Oxford University Press.
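
    The difference-detection idea can be illustrated with a toy version comparison. The actual BiVeS algorithm operates on SBML/CellML XML trees; here two versions of a model are reduced to parameter dictionaries with hypothetical values, and the diff is classified into insertions, deletions, and updates.

        # Toy illustration of version difference detection in the spirit of BiVeS
        # (the real tool compares SBML/CellML XML trees). Hypothetical data.

        v1 = {"k1": 0.1, "k2": 2.5, "km": 0.04}
        v2 = {"k1": 0.1, "k2": 3.0, "kcat": 1.7}   # k2 changed, km removed, kcat added

        def diff_versions(old: dict, new: dict):
            inserted = {k: new[k] for k in new.keys() - old.keys()}
            deleted  = {k: old[k] for k in old.keys() - new.keys()}
            updated  = {k: (old[k], new[k]) for k in old.keys() & new.keys()
                        if old[k] != new[k]}
            return inserted, deleted, updated

        ins, dele, upd = diff_versions(v1, v2)
        print("inserted:", ins)     # {'kcat': 1.7}
        print("deleted: ", dele)    # {'km': 0.04}
        print("updated: ", upd)     # {'k2': (2.5, 3.0)}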

  8. Lauriston S. Taylor Lecture: Yucca mountain radiation standards, dose/risk assessments, thinking outside the box, evaluations, and recommendations.

    PubMed

    Moeller, Dade W

    2009-11-01

    The Yucca Mountain high-level radioactive waste repository is designed to contain spent nuclear fuel and vitrified fission products. Because it will be the first such facility constructed anywhere in the world, it has proved to be one in which multiple organizations, most prominently the U.S. Congress, exercise a role. In addition to selecting a site for the facility, Congress specified that the U.S. Environmental Protection Agency (U.S. EPA) promulgate the associated Standards, the U.S. Nuclear Regulatory Commission establish applicable Regulations to implement the Standards, and the U.S. Department of Energy (U.S. DOE) design, construct, and operate the repository. Congress also specified that U.S. EPA request that the National Academy of Sciences (NAS) provide them guidance on the form and nature of the Standards. In so doing, Congress also stipulated that the Standards be expressed in terms of an "equivalent dose rate." As will be noted, this subsequently introduced serious complications. Because of the inputs of so many groups, and the fact that the NAS recommendations conflicted with the Congressional stipulation that the limits be expressed in terms of a dose rate, the outcome is a set of Standards that not only does not comply with the NAS recommendations, but is also neither integrated nor consistent. The initial goals of this paper are to provide an independent risk/dose analysis for each of the eight radionuclides that are to be regulated, and to evaluate them in terms of the Standards. These efforts reveal that the Standards are neither workable nor capable of being implemented. The concluding portions of the paper provide guidance that, if successfully implemented, would enable U.S. DOE to complete the construction of the repository and operate it in accordance with the recommendations of NAS while, at the same time, provide a better, more accurate, understanding of its potential risks to the public. This facility is too important to the U.S. nuclear energy program to be impeded by inappropriate Standards and unnecessary regulatory restrictions. As will be noted, the sources of essentially all of the recommendations suggested in this paper were derived through applications of the principles of good science, and the benefits of "thinking outside the box."

  9. Characterize Framework for Igneous Activity at Yucca Mountain, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    F. Perry; B. Youngs

    2000-11-06

    The purpose of this Analysis/Model Report (AMR) is twofold. (1) The first is to present a conceptual framework of igneous activity in the Yucca Mountain region (YMR) consistent with the volcanic and tectonic history of this region and the assessment of this history by experts who participated in the Probabilistic Volcanic Hazard Analysis (PVHA) (CRWMS M&O 1996). Conceptual models presented in the PVHA are summarized and extended in areas in which new information has been presented. Alternative conceptual models are discussed as well as their impact on probability models. The relationship between volcanic source zones defined in the PVHA and structural features of the YMR is described based on discussions in the PVHA and studies presented since the PVHA. (2) The second purpose of the AMR is to present probability calculations based on PVHA outputs. Probability distributions are presented for the length and orientation of volcanic dikes within the repository footprint and for the number of eruptive centers located within the repository footprint (conditional on the dike intersecting the repository). The probability of intersection of a basaltic dike within the repository footprint was calculated in the AMR ''Characterize Framework for Igneous Activity at Yucca Mountain, Nevada'' (CRWMS M&O 2000g) based on the repository footprint known as the Enhanced Design Alternative [EDA II, Design B (CRWMS M&O 1999a; Wilkins and Heath 1999)]. Then, the ''Site Recommendation Design Baseline'' (CRWMS M&O 2000a) initiated a change in the repository design, which is described in the ''Site Recommendation Subsurface Layout'' (CRWMS M&O 2000b). Consequently, the probability of intersection of a basaltic dike within the repository footprint has also been calculated for the current repository footprint, which is called the 70,000 Metric Tons of Uranium (MTU) No-Backfill Layout (CRWMS M&O 2000b). The calculations for both footprints are presented in this AMR. In addition, the probability of an eruptive center(s) forming within the repository footprint is calculated and presented in this AMR for both repository footprint designs. This latter type of calculation was not included in the PVHA.
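
    A probability-of-intersection figure of this kind is commonly reduced to a Poisson calculation: given an annual frequency of dike intersection λ, the probability of at least one intersection over a period T is 1 − exp(−λT). A sketch with placeholder values, not the PVHA results:

        import math

        # Hedged sketch of turning an annual intersection frequency (a PVHA-style
        # output) into a probability over a compliance period, assuming Poisson
        # arrivals. The rate and period below are placeholders.

        LAMBDA_PER_YR = 1.0e-8    # annual frequency of dike intersection, assumed
        T_YEARS = 1.0e4           # compliance period, assumed

        p_at_least_one = 1.0 - math.exp(-LAMBDA_PER_YR * T_YEARS)
        print(f"P(>=1 intersection in {T_YEARS:.0e} yr) = {p_at_least_one:.3e}")
        # For rates this small, p ~= lambda * T = 1e-4.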

  10. Evaluation Methodologies for Information Management Systems; Building Digital Tobacco Industry Document Libraries at the University of California, San Francisco Library/Center for Knowledge Management; Experiments with the IFLA Functional Requirements for Bibliographic Records (FRBR); Coming to Term: Designing the Texas Email Repository Model.

    ERIC Educational Resources Information Center

    Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia

    2002-01-01

    Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…

  11. Proteomics data repositories

    PubMed Central

    Riffle, Michael; Eng, Jimmy K.

    2010-01-01

    The field of proteomics, particularly the application of mass spectrometry analysis to protein samples, is well-established and growing rapidly. Proteomics studies generate large volumes of raw experimental data and inferred biological results. To facilitate the dissemination of these data, centralized data repositories have been developed that make the data and results accessible to proteomics researchers and biologists alike. This review of proteomics data repositories focuses exclusively on freely available, centralized data resources that disseminate or store experimental mass spectrometry data and results. The resources chosen reflect a current “snapshot” of the state of resources available with an emphasis placed on resources that may be of particular interest to yeast researchers. Resources are described in terms of their intended purpose and the features and functionality provided to users. PMID:19795424

  12. Inequalities in Open Source Software Development: Analysis of Contributor's Commits in Apache Software Foundation Projects.

    PubMed

    Chełkowski, Tadeusz; Gloor, Peter; Jemielniak, Dariusz

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there are still few studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information that has been gathered in the publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution.

  13. Inequalities in Open Source Software Development: Analysis of Contributor’s Commits in Apache Software Foundation Projects

    PubMed Central

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there are still few studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information that has been gathered in the publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution. PMID:27096157
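
    Concentration of contributions of the kind reported here is commonly summarized with a Gini coefficient (0 for perfectly equal contributions, approaching 1 when one committer dominates). A minimal sketch over hypothetical commit counts:

        # Sketch of quantifying contribution inequality with a Gini coefficient,
        # in the spirit of the commit analysis above. Commit counts are hypothetical.

        def gini(values):
            """Gini coefficient of a list of non-negative counts (0 = equal)."""
            xs = sorted(values)
            n = len(xs)
            total = sum(xs)
            # Standard formula: G = (2 * sum_i i*x_i) / (n * sum x) - (n + 1) / n
            weighted = sum(i * x for i, x in enumerate(xs, start=1))
            return 2.0 * weighted / (n * total) - (n + 1.0) / n

        commits = [1, 1, 2, 3, 5, 8, 40, 400]   # a few casual, one dominant committer
        print(f"Gini = {gini(commits):.2f}")    # close to 1 => highly concentrated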

  14. COMODI: an ontology to characterise differences in versions of computational models in biology.

    PubMed

    Scharm, Martin; Waltemath, Dagmar; Mendes, Pedro; Wolkenhauer, Olaf

    2016-07-11

    Open model repositories provide ready-to-reuse computational models of biological systems. Models within those repositories evolve over time, leading to different model versions. Taken together, the underlying changes reflect a model's provenance and thus can give valuable insights into the studied biology. Currently, however, changes cannot be semantically interpreted. To improve this situation, we developed an ontology of terms describing changes in models. The ontology can be used by scientists and within software to characterise model updates at the level of single changes. When studying or reusing a model, these annotations help with determining the relevance of a change in a given context. We manually studied changes in selected models from BioModels and the Physiome Model Repository. Using the BiVeS tool for difference detection, we then performed an automatic analysis of changes in all models published in these repositories. The resulting set of concepts led us to define candidate terms for the ontology. In a final step, we aggregated and classified these terms and built the first version of the ontology. We present COMODI, an ontology needed because COmputational MOdels DIffer. It empowers users and software to describe changes in a model on the semantic level. COMODI also enables software to implement user-specific filter options for the display of model changes. Finally, COMODI is a step towards predicting how a change in a model influences the simulation results. COMODI, coupled with our algorithm for difference detection, ensures the transparency of a model's evolution, and it enhances the traceability of updates and error corrections. COMODI is encoded in OWL. It is openly available at http://comodi.sems.uni-rostock.de/ .

  15. Preliminary risk benefit assessment for nuclear waste disposal in space

    NASA Technical Reports Server (NTRS)

    Rice, E. E.; Denning, R. S.; Friedlander, A. L.; Priest, C. C.

    1982-01-01

    This paper describes the recent work of the authors on the evaluation of health risk benefits of space disposal of nuclear waste. The paper describes a risk model approach that has been developed to estimate the non-recoverable, cumulative, expected radionuclide release to the earth's biosphere for different options of nuclear waste disposal in space. Risk estimates for the disposal of nuclear waste in a mined geologic repository and the short- and long-term risk estimates for space disposal were developed. The results showed that the preliminary estimates of space disposal risks are low, even with the estimated uncertainty bounds. If calculated release risks for mined geologic repositories remain as low as given by the U.S. DOE, and U.S. EPA requirements continue to be met, then no additional space disposal study effort in the U.S. is warranted at this time. If risks perceived by the public are significant in the acceptance of mined geologic repositories, then consideration of space disposal as a complement to the mined geologic repository is warranted.

  16. LittleQuickWarp: an ultrafast image warping tool.

    PubMed

    Qu, Lei; Peng, Hanchuan

    2015-02-01

    Warping images into a standard coordinate space is critical for many image computing related tasks. However, for multi-dimensional and high-resolution images, an accurate warping operation itself is often very expensive in terms of computer memory and computational time. For high-throughput image analysis studies such as brain mapping projects, it is desirable to have high performance image warping tools that are compatible with common image analysis pipelines. In this article, we present LittleQuickWarp, a swift and memory efficient tool that boosts 3D image warping performance dramatically and at the same time has high warping quality similar to the widely used thin plate spline (TPS) warping. Compared to the TPS, LittleQuickWarp can improve the warping speed 2-5 times and reduce the memory consumption 6-20 times. We have implemented LittleQuickWarp as an Open Source plug-in program on top of the Vaa3D system (http://vaa3d.org). The source code and a brief tutorial can be found in the Vaa3D plugin source code repository. Copyright © 2014 Elsevier Inc. All rights reserved.
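
    As a point of reference for the thin-plate-spline warping that LittleQuickWarp is reported to approximate, a TPS landmark warp can be prototyped with SciPy's RBFInterpolator (available in SciPy 1.7+, assumed here). The 2-D landmarks below are hypothetical; this is a baseline sketch, not the LittleQuickWarp algorithm.

        import numpy as np
        from scipy.interpolate import RBFInterpolator  # SciPy >= 1.7 assumed

        # Baseline thin-plate-spline (TPS) landmark warp: interpolate the
        # displacement field defined at control points. Toy 2-D landmarks.

        src = np.array([[0., 0.], [0., 10.], [10., 0.], [10., 10.], [5., 5.]])
        dst = np.array([[0., 0.], [0., 10.], [10., 0.], [10., 10.], [6., 4.]])

        tps = RBFInterpolator(src, dst - src, kernel="thin_plate_spline")

        pts = np.array([[5., 5.], [2.5, 2.5], [7.5, 7.5]])
        warped = pts + tps(pts)    # apply interpolated displacements
        print(np.round(warped, 2))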

  17. Closing gaps between open software and public data in a hackathon setting: User-centered software prototyping

    PubMed Central

    Busby, Ben; Lesko, Matthew; Federer, Lisa

    2016-01-01

    In genomics, bioinformatics and other areas of data science, gaps exist between extant public datasets and the open-source software tools built by the community to analyze similar data types.  The purpose of biological data science hackathons is to assemble groups of genomics or bioinformatics professionals and software developers to rapidly prototype software to address these gaps.  The only two rules for the NCBI-assisted hackathons run so far are that 1) data either must be housed in public data repositories or be deposited to such repositories shortly after the hackathon’s conclusion, and 2) all software comprising the final pipeline must be open-source or open-use.  Proposed topics, as well as suggested tools and approaches, are distributed to participants at the beginning of each hackathon and refined during the event.  Software, scripts, and pipelines are developed and published on GitHub, a web service providing publicly available, free-usage tiers for collaborative software development. The code resulting from each hackathon is published at https://github.com/NCBI-Hackathons/ with separate directories or repositories for each team. PMID:27134733

  18. XNAT Central: Open sourcing imaging research data.

    PubMed

    Herrick, Rick; Horton, William; Olsen, Timothy; McKay, Michael; Archie, Kevin A; Marcus, Daniel S

    2016-01-01

    XNAT Central is a publicly accessible medical imaging data repository based on the XNAT open-source imaging informatics platform. It hosts a wide variety of research imaging data sets. The primary motivation for creating XNAT Central was to provide a central repository to host and provide access to a wide variety of neuroimaging data. In this capacity, XNAT Central hosts a number of data sets from research labs and investigative efforts from around the world, including the OASIS Brains imaging studies, the NUSDAST study of schizophrenia, and more. Over time, XNAT Central has expanded to include imaging data from many different fields of research, including oncology, orthopedics, cardiology, and animal studies, but continues to emphasize neuroimaging data. Through the use of XNAT's DICOM metadata extraction capabilities, XNAT Central provides a searchable repository of imaging data that can be referenced by groups, labs, or individuals working in many different areas of research. The future development of XNAT Central will be geared towards greater ease of use as a reference library of heterogeneous neuroimaging data and associated synthetic data. It will also become a tool for making data available supporting published research and academic articles. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Approaching data publication as part of the scholarly communication enterprise: some obstacles, some solutions (Invited)

    NASA Astrophysics Data System (ADS)

    Vision, T. J.

    2010-12-01

    Many datasets collected by academic research teams, despite being difficult or impossible to effectively reproduce, are never shared with the wider community even after the findings based upon them appear in print. This limits the extent to which published scientific findings can be verified and cuts short the opportunity for data to realize their full potential value through reuse and repurposing. While many scientists perceive data to be public goods that should be made available upon publication, they also perceive limited incentive for doing so themselves. This, combined with the lack of mandates for data archiving and the absence of a trusted public repository that can host any kind of data, means that the practice of data archiving is rare. When data are shared post-publication, it is often through ad hoc mechanisms and under terms that present obstacles to reuse. When data are archived, it is generally achieved through routes that do not ensure preservation or discoverability. To address this mix of technical and sociocultural obstacles to data reuse, a consortium of journals in ecology and evolutionary biology recently launched a digital data repository (Dryad) and developed a joint policy mandating data archiving at the time of publication. Dryad has a number of features specifically designed to make it possible for universal data archiving to be achieved with low burden and low cost at the time of publication. These include a streamlined submission process that exchanges metadata with the manuscript processing system, handshaking with more specialized data repositories, and metadata curation that is assisted by automated generation of cataloging terms. To directly benefit data depositors, data are treated as a citable scholarly product through the assignment of trackable data DOIs. The data are permanently linked from the original article and are made freely available with an explicit waiver of restrictions to reuse. The Dryad Consortium, which includes both society-owned and publisher-owned journals, is responsible for governing and sustaining the repository. For scientists, Dryad provides a rich source of data for confirmation of findings, tests of new methodology, and synthetic studies. It also provides the means for depositors to tangibly increase the scientific impact of their work. For journals, Dryad archives data in a more permanent, feature-rich, and cost-effective way than through use of supplementary online materials. Despite its biological origins, Dryad provides a discipline-neutral model for including data fully within the fold of scholarly communication.

  20. Seven [Data] Habits of Highly Successful Researchers

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Shepherd, A.; Saito, M. A.; Wiebe, P. H.; Ake, H.; Biddle, M.; Copley, N. J.; Rauch, S.; Switzer, M. E.; York, A.

    2017-12-01

    Navigating the landscape of open science and data sharing can be daunting for the long-tail scientist. From satisfying funder requirements, and ensuring proper attribution for their work, to determining the best repository for data management and archive, there are several facets to be considered. Yet, there is no single source of guidance for investigators who may be using multiple research funding models. What role can existing repositories play to help facilitate a more effective data sharing workflow? The Biological and Chemical Oceanographic Data Management Office (BCO-DMO) is a domain-specific repository occupying the niche between funder and investigator. The office works closely with its stakeholders to develop and provide guidance, services, and tools that assist researchers in meeting their data sharing needs, from determining whether BCO-DMO is the appropriate repository to manage an investigator's project data to ensuring that the investigator is able to fulfill funder requirements. The goal is to relieve the investigator of the more difficult aspects of data management and data sharing, while simultaneously educating them in better data management practices that will streamline the process of conducting open research in the future. This presentation will provide an overview of the BCO-DMO repository, highlighting some of the services and guidance the office provides to its community.

  1. Status of a standard for neutron skyshine calculation and measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westfall, R.M.; Wright, R.Q.; Greenborg, J.

    1990-01-01

    An effort has been under way for several years to prepare a draft standard, ANS-6.6.2, Calculation and Measurement of Direct and Scattered Neutron Radiation from Contained Sources Due to Nuclear Power Operations. At the outset, the work group adopted a three-phase study involving one-dimensional analyses, a measurements program, and multi-dimensional analyses. Of particular interest are the neutron radiation levels associated with dry-fuel storage at reactor sites. The need for dry storage has been investigated for various scenarios of repository and monitored retrievable storage (MRS) facilities availability with the waste stream analysis model. The concern is with long-term integrated, low-level doses at long distances from a multiplicity of sources. To evaluate the conservatism associated with one-dimensional analyses, the work group has specified a series of simple problems. Sources as a function of fuel exposure were determined for a Westinghouse 17 x 17 pressurized water reactor assembly with the ORIGEN-S module of the SCALE system. The energy degradation of the 35 GWd/ton U sources was determined for two generic designs of dry-fuel storage casks.

  2. Colloid formation during waste form reaction: Implications for nuclear waste disposal

    USGS Publications Warehouse

    Bates, J. K.; Bradley, J.; Teetsov, A.; Bradley, C. R.; Buchholtz ten Brink, Marilyn R.

    1992-01-01

    Insoluble plutonium- and americium-bearing colloidal particles formed during simulated weathering of a high-level nuclear waste glass. Nearly 100 percent of the total plutonium and americium in test ground water was concentrated in these submicrometer particles. These results indicate that models of actinide mobility and repository integrity, which assume complete solubility of actinides in ground water, underestimate the potential for radionuclide release into the environment. A colloid-trapping mechanism may be necessary for a waste repository to meet long-term performance specifications.

  3. Evaluation of Groundwater Pathways and Travel Times From the Nevada Test Site to the Potential Yucca Mountain Repository

    NASA Astrophysics Data System (ADS)

    Pohlmann, K. F.; Zhu, J.; Ye, M.; Carroll, R. W.; Chapman, J. B.; Russell, C. E.; Shafer, D. S.

    2006-12-01

    Yucca Mountain (YM), Nevada has been recommended as a deep geological repository for the disposal of spent fuel and high-level radioactive waste. If YM is licensed as a repository by the Nuclear Regulatory Commission, it will be important to identify the potential for radionuclides to migrate from underground nuclear testing areas located on the Nevada Test Site (NTS) to the hydraulically downgradient repository area to ensure that monitoring does not incorrectly attribute repository failure to radionuclides originating from other sources. In this study, we use the Death Valley Regional Flow System (DVRFS) model developed by the U.S. Geological Survey to investigate potential groundwater migration pathways and associated travel times from the NTS to the proposed YM repository area. Using results from the calibrated DVRFS model and the particle tracking post-processing package MODPATH we modeled three-dimensional groundwater advective pathways in the NTS and YM region. Our study focuses on evaluating the potential for groundwater pathways between the NTS and YM withdrawal area and whether travel times for advective flow along these pathways coincide with the prospective monitoring time frame at the proposed repository. We include uncertainty in effective porosity as this is a critical variable in the determination of time for radionuclides to travel from the NTS region to the YM withdrawal area. Uncertainty in porosity is quantified through evaluation of existing site data and expert judgment and is incorporated in the model through Monte Carlo simulation. Since porosity information is limited for this region, the uncertainty is quite large and this is reflected in the results as a large range in simulated groundwater travel times.
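
    The role of effective porosity is easy to see in a back-of-the-envelope version of the Monte Carlo: advective travel time along a flow path scales linearly with porosity, t = L·n_e/q for path length L and Darcy flux q. A sketch with placeholder values, not the DVRFS/MODPATH results:

        import random

        # Sketch of a porosity Monte Carlo for advective travel time,
        # t = L * n_e / q. All numbers are hypothetical placeholders.

        PATH_LENGTH_M = 4.0e4        # NTS-to-YM path length, assumed
        DARCY_FLUX_M_YR = 0.5        # Darcy flux along the path, assumed
        N_REALIZATIONS = 10_000

        random.seed(42)
        times = []
        for _ in range(N_REALIZATIONS):
            # Effective porosity sampled log-uniformly over a wide uncertainty range.
            n_e = 10 ** random.uniform(-3, -0.5)     # 0.001 .. ~0.32
            times.append(PATH_LENGTH_M * n_e / DARCY_FLUX_M_YR)

        times.sort()
        print(f"median travel time: {times[len(times)//2]:,.0f} yr")
        print(f"5th-95th percentile: {times[int(0.05*len(times))]:,.0f} - "
              f"{times[int(0.95*len(times))]:,.0f} yr")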

  4. Characterize Framework for Igneous Activity at Yucca Mountain, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    F. Perry; R. Youngs

    The purpose of this scientific analysis report is threefold: (1) Present a conceptual framework of igneous activity in the Yucca Mountain region (YMR) consistent with the volcanic and tectonic history of this region and the assessment of this history by experts who participated in the probabilistic volcanic hazard analysis (PVHA) (CRWMS M&O 1996 [DIRS 100116]). Conceptual models presented in the PVHA are summarized and applied in areas in which new information has been presented. Alternative conceptual models are discussed, as well as their impact on probability models. The relationship between volcanic source zones defined in the PVHA and structural features of the YMR is described based on discussions in the PVHA and studies presented since the PVHA. (2) Present revised probability calculations based on PVHA outputs for a repository footprint proposed in 2003 (BSC 2003 [DIRS 162289]), rather than the footprint used at the time of the PVHA. This analysis report also calculates the probability of an eruptive center(s) forming within the repository footprint using information developed in the PVHA. Probability distributions are presented for the length and orientation of volcanic dikes located within the repository footprint and for the number of eruptive centers (conditional on a dike intersecting the repository) located within the repository footprint. (3) Document sensitivity studies that analyze how the presence of potentially buried basaltic volcanoes may affect the computed frequency of intersection of the repository footprint by a basaltic dike. These sensitivity studies are prompted by aeromagnetic data collected in 1999, indicating the possible presence of previously unrecognized buried volcanoes in the YMR (Blakely et al. 2000 [DIRS 151881]; O'Leary et al. 2002 [DIRS 158468]). The results of the sensitivity studies are for informational purposes only and are not to be used for purposes of assessing repository performance.

  5. Coupled Multi-physical Simulations for the Assessment of Nuclear Waste Repository Concepts: Modeling, Software Development and Simulation

    NASA Astrophysics Data System (ADS)

    Massmann, J.; Nagel, T.; Bilke, L.; Böttcher, N.; Heusermann, S.; Fischer, T.; Kumar, V.; Schäfers, A.; Shao, H.; Vogel, P.; Wang, W.; Watanabe, N.; Ziefle, G.; Kolditz, O.

    2016-12-01

    As part of the German site selection process for a high-level nuclear waste repository, different repository concepts in the geological candidate formations rock salt, clay stone and crystalline rock are being discussed. An open assessment of these concepts using numerical simulations requires physical models capturing the individual particularities of each rock type and associated geotechnical barrier concept to a comparable level of sophistication. In a joint work group of the Helmholtz Centre for Environmental Research (UFZ) and the German Federal Institute for Geosciences and Natural Resources (BGR), scientists of the UFZ are developing and implementing multiphysical process models while BGR scientists apply them to large scale analyses. The advances in simulation methods for waste repositories are incorporated into the open-source code OpenGeoSys. Here, recent application-driven progress in this context is highlighted. A robust implementation of visco-plasticity with temperature-dependent properties into a framework for the thermo-mechanical analysis of rock salt will be shown. The model enables the simulation of heat transport along with its consequences on the elastic response as well as on primary and secondary creep or the occurrence of dilatancy in the repository near field. Transverse isotropy, non-isothermal hydraulic processes and their coupling to mechanical stresses are taken into account for the analysis of repositories in clay stone. These processes are also considered in the near field analyses of engineered barrier systems, including the swelling/shrinkage of the bentonite material. The temperature-dependent saturation evolution around the heat-emitting waste container is described by different multiphase flow formulations. For all mentioned applications, we illustrate the workflow from model development and implementation, over verification and validation, to repository-scale application simulations using methods of high performance computing.
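
    The temperature-dependent creep behaviour mentioned for rock salt is commonly represented by a Norton/Arrhenius power law, with strain rate ε̇ = A·exp(−Q/RT)·(σ/σ_ref)^n. A sketch of evaluating such a law; the parameter values are generic placeholders, not calibrated OpenGeoSys inputs.

        import math

        # Sketch of a temperature-dependent secondary (steady-state) creep rate
        # for rock salt of the Norton/Arrhenius type:
        #     eps_dot = A * exp(-Q / (R * T)) * (sigma / sigma_ref) ** n
        # Parameter values are placeholders.

        A = 0.18 / (24 * 3600)   # pre-factor [1/s], assumed
        Q = 54_000.0             # activation energy [J/mol], assumed
        R = 8.314                # gas constant [J/(mol K)]
        N_EXP = 5.0              # stress exponent, assumed
        SIGMA_REF = 1.0e6        # reference stress [Pa]

        def creep_rate(sigma_pa: float, temp_k: float) -> float:
            """Steady-state creep strain rate [1/s]."""
            return A * math.exp(-Q / (R * temp_k)) * (sigma_pa / SIGMA_REF) ** N_EXP

        for T in (300.0, 350.0):  # ambient vs. near-canister temperature
            print(f"T={T:.0f} K: eps_dot = {creep_rate(10e6, T):.3e} 1/s")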

  6. Utilizing the Antarctic Master Directory to find orphan datasets

    NASA Astrophysics Data System (ADS)

    Bonczkowski, J.; Carbotte, S. M.; Arko, R. A.; Grebas, S. K.

    2011-12-01

    While most Antarctic data are housed at an established disciplinary-specific data repository, there are data types for which no suitable repository exists. In some cases, these "orphan" data, without an appropriate national archive, are served from local servers by the principal investigators who produced the data. There are many pitfalls with data served privately, including the frequent lack of adequate documentation to ensure the data can be understood by others for re-use and the impermanence of personal web sites. For example, if an investigator leaves an institution and the data move, the published link is no longer accessible. To ensure continued availability of data, submission to long-term national data repositories is needed. As stated in the National Science Foundation Office of Polar Programs (NSF/OPP) Guidelines and Award Conditions for Scientific Data, investigators are obligated to submit their data for curation and long-term preservation; this includes the registration of a dataset description into the Antarctic Master Directory (AMD), http://gcmd.nasa.gov/Data/portals/amd/. The AMD is a Web-based, searchable directory of thousands of dataset descriptions, known as DIF records, submitted by scientists from over 20 countries. It serves as a node of the International Directory Network/Global Change Master Directory (IDN/GCMD). The US Antarctic Program Data Coordination Center (USAP-DCC), http://www.usap-data.org/, funded through NSF/OPP, was established in 2007 to help streamline the process of data submission and DIF record creation. When data do not quite fit within any existing disciplinary repository, they can be registered with the USAP-DCC as the fallback data repository. Within the scope of the USAP-DCC, we undertook the challenge of discovering and "rescuing" orphan datasets currently registered within the AMD. In order to find which DIF records led to data served privately, all records relating to US data within the AMD were parsed. After identifying the records containing a URL leading to a national data center or other disciplinary data repository, the remaining records were individually inspected for data type, format, and quality of metadata, and then assessed to determine how best to preserve them. Of the records reviewed, those for which appropriate repositories could be identified were submitted. An additional 35 were deemed acceptable in quality of metadata to register in the USAP-DCC. The contents of these datasets were varied in nature, ranging from penguin counts to paleo-geologic maps to results of meteorological models, all of which are discoverable through our search interface, http://www.usap-data.org/search.php. The remaining 40 records either linked to no data or had inadequate documentation for preservation, highlighting the danger of serving datasets on local servers, where minimal metadata standards cannot be enforced and long-term access cannot be ensured.
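
    A hedged sketch of the triage step described above: separating DIF records whose data URL points at a recognized long-term repository from records that appear to be served privately. The domains and records below are invented for illustration, not taken from the AMD.

        from urllib.parse import urlparse

        # Hypothetical whitelist of long-term repository hosts
        KNOWN_REPOSITORIES = {"www.usap-data.org", "gcmd.nasa.gov", "www.ncdc.noaa.gov"}

        dif_records = [  # hypothetical (Entry_ID, data URL) pairs
            ("PENGUIN_COUNTS_2004", "http://www.usap-data.org/dataset/123"),
            ("PALEO_GEO_MAPS", "http://faculty.example.edu/~smith/maps.html"),
        ]

        for entry_id, url in dif_records:
            host = urlparse(url).netloc
            status = "archived" if host in KNOWN_REPOSITORIES else "possible orphan"
            print(f"{entry_id}: {status} ({host})")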

  7. DataUp 2.0: Improving On a Tool For Helping Researchers Archive, Manage, and Share Their Tabular Data

    NASA Astrophysics Data System (ADS)

    Strasser, C.; Borda, S.; Cruse, P.; Kunze, J.

    2013-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. Last year we developed an open source web application, DataUp, to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. With funding from the NSF via a supplemental grant to the DataONE project, we are working to improve upon DataUp. Our main goal for DataUp 2.0 is to ensure organizations and repositories are able to adopt and adapt DataUp to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between the California Digital Library, DataONE, the San Diego Supercomputing Center, and Microsoft Research Connections.
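
    A minimal sketch of step (1), assuming nothing about DataUp's internals: a file counts as CSV-compatible here if every row parses and all rows have the same field count. The sample data are invented.

        import csv
        import io

        def is_csv_compatible(fh):
            """True if every row parses and all rows have the same field count."""
            try:
                widths = {len(row) for row in csv.reader(fh) if row}
            except csv.Error:
                return False
            return len(widths) == 1

        # Hypothetical tabular data standing in for a researcher's file
        sample = io.StringIO("site,depth_m,temp_C\nA,10,4.2\nB,25,3.9\n")
        print(is_csv_compatible(sample))  # -> True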

  8. Network of anatomical texts (NAnaTex), an open-source project for visualizing the interaction between anatomical terms.

    PubMed

    Momota, Ryusuke; Ohtsuka, Aiji

    2018-01-01

    Anatomy is the science and art of understanding the structure of the body and its components in relation to the functions of the whole-body system. Medicine is based on a deep understanding of anatomy, but quite a few introductory-level learners are overwhelmed by the sheer amount of anatomical terminology that must be understood, so they regard anatomy as a dull and dense subject. To help them learn anatomical terms in a more contextual way, we started a new open-source project, the Network of Anatomical Texts (NAnaTex), which visualizes relationships of body components by integrating text-based anatomical information using Cytoscape, a network visualization software platform. Here, we present a network of bones and muscles produced from literature descriptions. As this network is primarily text-based and does not require any programming knowledge, it is easy to implement new functions or provide extra information by making changes to the original text files. To facilitate collaborations, we deposited the source code files for the network into the GitHub repository (https://github.com/ryusukemomota/nanatex) so that anybody can participate in the evolution of the network and use it for their own non-profit purposes. This project should help not only introductory-level learners but also professional medical practitioners, who could use it as a quick reference.
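
    A hedged sketch of the text-first approach: plain-text muscle-bone relations written out as a SIF file, one of the simple edge-list formats Cytoscape can load. The anatomical triples below are illustrative, not taken from NAnaTex.

        # Hypothetical muscle-bone relations (source, interaction, target)
        edges = [
            ("biceps_brachii", "originates_from", "scapula"),
            ("biceps_brachii", "inserts_on", "radius"),
            ("brachialis", "inserts_on", "ulna"),
        ]

        with open("muscles_bones.sif", "w") as fh:
            for source, relation, target in edges:
                # SIF format: <source node> <interaction type> <target node>
                fh.write(f"{source} {relation} {target}\n")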

  9. Effects of microbial processes on gas generation under expected WIPP repository conditions: Annual report through 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Francis, A.J.; Gillow, J.B.

    1993-09-01

    Microbial processes involved in gas generation from degradation of the organic constituents of transuranic waste under conditions expected at the Waste Isolation Pilot Plant (WIPP) repository are being investigated at Brookhaven National Laboratory. These laboratory studies are part of the Sandia National Laboratories -- WIPP Gas Generation Program. Gas generation due to microbial degradation of representative cellulosic waste was investigated in short-term (< 6 months) and long-term (> 6 months) experiments by incubating representative paper (filter paper, paper towels, and tissue) in WIPP brine under initially aerobic (air) and anaerobic (nitrogen) conditions. Samples from the WIPP surficial environment and underground workings harbor gas-producing halophilic microorganisms, the activities of which were studied in short-term experiments. The microorganisms metabolized a variety of organic compounds including cellulose under aerobic, anaerobic, and denitrifying conditions. In long-term experiments, the effects of added nutrients (trace amounts of ammonium nitrate, phosphate, and yeast extract), no nutrients, and nutrients plus excess nitrate on gas production from cellulose degradation were investigated.

  10. The benefits of a fast reactor closed fuel cycle in the UK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregg, R.; Hesketh, K.

    2013-07-01

    The work has shown that starting a fast reactor closed fuel cycle in the UK requires virtually all of Britain's existing and future PWR spent fuel to be reprocessed, in order to obtain the plutonium needed. The existing UK Pu stockpile is sufficient to initially support only a modest SFR 'closed' fleet assuming spent fuel can be reprocessed shortly after discharge (i.e. after two years cooling). For a substantial fast reactor fleet, most Pu will have to originate from reprocessing future spent PWR fuel. Therefore, the maximum fast reactor fleet size will be limited by the preceding PWR fleet size, so scenarios involving fast reactors still require significant quantities of uranium ore indirectly. However, once a fast reactor fuel cycle has been established, the very substantial quantities of uranium tails in the UK would ensure there is sufficient material for several centuries. Both the short- and long-term impacts on a repository have been considered in this work. Over the short term, the decay heat emanating from the HLW and spent fuel will limit the density of waste within a repository. For scenarios involving fast reactors, the only significant heat-bearing actinide content will be present in the final cores, resulting in a 50% overall reduction in decay energy deposited within the repository when compared with an equivalent open fuel cycle. Over the longer term, radiological dose becomes more important. Total radiotoxicity (normalised by electricity generated) is lower for scenarios with Pu recycle after 2000 years. Scenarios involving fast reactors have the lowest radiotoxicity since the quantities of certain actinides (Np, Pu and Am) eventually stabilise. However, total radiotoxicity as a measure of radiological risk does not account for differences in radionuclide mobility once in the repository. Radiological dose is dominated by a small number of fission products and is therefore not affected significantly by reactor type or recycling strategy (since the fission product inventory will primarily be a function of the nuclear energy generated). However, by reprocessing spent fuel, it is possible to immobilise the fission products in a more suitable waste form that has far superior in-repository performance. (authors)

  11. Doing Your Science While You're in Orbit

    NASA Astrophysics Data System (ADS)

    Green, Mark L.; Miller, Stephen D.; Vazhkudai, Sudharshan S.; Trater, James R.

    2010-11-01

    Large-scale neutron facilities such as the Spallation Neutron Source (SNS) located at Oak Ridge National Laboratory need easy-to-use access to Department of Energy Leadership Computing Facilities and experiment repository data. The Orbiter thick- and thin-client and its supporting Service Oriented Architecture (SOA) based services (available at https://orbiter.sns.gov) consist of standards-based components that are reusable and extensible for accessing high performance computing, data and computational grid infrastructure, and cluster-based resources easily from a user-configurable interface. The primary Orbiter system goals consist of (1) developing infrastructure for the creation and automation of virtual instrumentation experiment optimization, (2) developing user interfaces for thin- and thick-client access, (3) providing a prototype incorporating major instrument simulation packages, and (4) facilitating neutron science community access and collaboration. The secure Orbiter SOA authentication and authorization is achieved through the developed Virtual File System (VFS) services, which use Role-Based Access Control (RBAC) for data repository file access, thin- and thick-client functionality and application access, and computational job workflow management. The VFS Relational Database Management System (RDMS) consists of approximately 45 database tables describing 498 user accounts with 495 groups over 432,000 directories with 904,077 repository files. Over 59 million NeXus file metadata records are associated with the 12,800 unique NeXus file field/class names generated from the 52,824 repository NeXus files. Services that enable (a) summary dashboards of data repository status with Quality of Service (QoS) metrics, (b) data repository NeXus file field/class name full-text search capabilities within a Google-like interface, (c) a fully functional RBAC browser for the read-only data repository and shared areas, (d) user/group defined and shared metadata for data repository files, and (e) user, group, repository, and Web 2.0 based global positioning with additional service capabilities are currently available. The SNS-based Orbiter SOA integration progress with the Distributed Data Analysis for Neutron Scattering Experiments (DANSE) software development project is summarized with an emphasis on DANSE Central Services and the Virtual Neutron Facility (VNF). Additionally, the DANSE utilization of the Orbiter SOA authentication, authorization, and data transfer services best-practice implementations are presented.
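
    As a hypothetical illustration of service (b) above, a toy full-text lookup over NeXus-style metadata records; the real Orbiter services are backed by the VFS/RDMS and RBAC layers described in the abstract, not by an in-memory dictionary.

        # Invented NeXus-style records: file name -> flattened metadata text
        records = {
            "SNS_0001.nxs": "entry/instrument/detector distance 2.5 m",
            "SNS_0002.nxs": "entry/sample temperature 120 K",
        }

        def search(term):
            """Return the names of records whose metadata mentions the term."""
            term = term.lower()
            return [name for name, text in records.items() if term in text.lower()]

        print(search("temperature"))  # -> ['SNS_0002.nxs']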

  12. YouTube as a source of information on rhinosinusitis: the good, the bad and the ugly.

    PubMed

    Biggs, T C; Bird, J H; Harries, P G; Salib, R J

    2013-08-01

    YouTube is an internet-based repository of user-generated content. This study aimed to determine whether YouTube represented a valid and reliable patient information resource for the lay person on the topic of rhinosinusitis. The study included the first 100 YouTube videos found using the search term 'sinusitis'. Videos were graded on their ability to inform the lay person on the subject of rhinosinusitis. Forty-five per cent of the videos were deemed to provide some useful information. Fifty-five per cent of the videos contained few or no useful facts, 27 per cent of which contained potentially misleading or even dangerous information. Videos uploaded by medical professionals or those from health information websites contained more useful information than those uploaded by independent users. YouTube appears to be an unreliable resource for accurate and up-to-date medical information relating to rhinosinusitis. However, it could provide some useful information if mechanisms existed to direct lay people to verifiable and credible sources.

  13. Knowledge repositories for multiple uses

    NASA Technical Reports Server (NTRS)

    Williamson, Keith; Riddle, Patricia

    1991-01-01

    In the life cycle of a complex physical device or part, for example, the docking bay door of the Space Station, there are many uses for knowledge about the device or part. The same piece of knowledge might serve several uses. Given the quantity and complexity of the knowledge that must be stored, it is critical to maintain the knowledge in one repository, in one form. At the same time, because of the quantity and complexity of knowledge that must be used in life cycle applications such as cost estimation, re-design, and diagnosis, it is critical to automate such knowledge uses. For each specific use, a knowledge base must be available and must be in a form that promotes the efficient performance of that knowledge base. However, without a single-source knowledge repository, the cost of maintaining consistent knowledge between multiple knowledge bases increases dramatically; as facts and descriptions change, they must be updated in each individual knowledge base. A use-neutral representation of a hydraulic system for the F-111 aircraft was developed. The ability to derive portions of four different knowledge bases from this use-neutral representation is demonstrated: one knowledge base is for re-design of the device using a model-based reasoning problem solver; two knowledge bases, at different levels of abstraction, are for diagnosis using a model-based reasoning problem solver; and one knowledge base is for diagnosis using an associational reasoning problem solver. It is shown how updates issued against the single-source use-neutral knowledge repository can be propagated to the underlying knowledge bases.
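
    A minimal sketch of the single-source idea, with invented fields: one use-neutral component description from which use-specific knowledge-base fragments are derived, so that an update to the repository propagates to every derived view.

        # Hypothetical use-neutral description of one component
        component = {
            "name": "hydraulic_pump",
            "connects_to": ["reservoir", "actuator"],
            "failure_modes": ["seal_leak", "cavitation"],
            "design_params": {"rated_pressure_MPa": 21.0},
        }

        def diagnosis_view(c):
            """Knowledge-base fragment for an associational diagnostic reasoner."""
            return {(c["name"], mode) for mode in c["failure_modes"]}

        def redesign_view(c):
            """Knowledge-base fragment for a model-based re-design solver."""
            return {c["name"]: c["design_params"]}

        # Both views are regenerated from the same repository entry
        print(diagnosis_view(component))
        print(redesign_view(component))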

  14. Establishment and operation of a biorepository for molecular epidemiologic studies in Costa Rica.

    PubMed

    Cortés, Bernal; Schiffman, Mark; Herrero, Rolando; Hildesheim, Allan; Jiménez, Silvia; Shea, Katheryn; González, Paula; Porras, Carolina; Fallas, Greivin; Rodríguez, Ana Cecilia

    2010-04-01

    The Proyecto Epidemiológico Guanacaste (PEG) has conducted several large studies related to human papillomavirus (HPV) and cervical cancer in Guanacaste, Costa Rica in a long-standing collaboration with the U.S. National Cancer Institute. To improve molecular epidemiology efforts and save costs, we have gradually transferred technology to Costa Rica, culminating in state-of-the-art laboratories and a biorepository to support a phase III clinical trial investigating the efficacy of HPV 16/18 vaccine. Here, we describe the rationale and lessons learned in transferring molecular epidemiologic and biorepository technology to a developing country. At the outset of the PEG in the early 1990s, we shipped all specimens to repositories and laboratories in the United States, which created multiple problems. Since then, by intensive personal interactions between experts from the United States and Costa Rica, we have successfully transferred liquid-based cytology, HPV DNA testing and serology, chlamydia and gonorrhea testing, PCR-safe tissue processing, and viable cryopreservation. To accommodate the vaccine trial, a state-of-the-art repository opened in mid-2004. Approximately 15,000 to 50,000 samples are housed in the repository on any given day, and >500,000 specimens have been shipped, many using a custom-made dry shipper that permits exporting >20,000 specimens at a time. Quality control of shipments received by the NCI biorepository has revealed an error rate of <0.2%. Recently, the PEG repository has incorporated other activities; for example, large-scale aliquotting and long-term, cost-efficient storage of frozen specimens returned from the United States. Using Internet-based specimen tracking software has proven to be efficient even across borders. For long-standing collaborations, it makes sense to transfer the molecular epidemiology expertise toward the source of specimens. The successes of the PEG molecular epidemiology laboratories and biorepository prove that the physical and informatics infrastructures of a modern biorepository can be transferred to a resource-limited and weather-challenged region. Technology transfer is an important and feasible goal of international collaborations.

  15. Addressing challenges in cross-site synthesis of long-term ecological data

    USDA-ARS?s Scientific Manuscript database

    Long-term ecological datasets are becoming increasingly abundant on the internet, and are available both on websites hosted by the originating sites and/or in online repositories. While sites and networks are increasingly conforming to adopted metadata standards (which themselves continue to evolve ...

  16. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

    This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper discusses how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative of implementing an institutional repository using the DSpace open source software.

  17. Ancient Glass: A Literature Search and its Role in Waste Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strachan, Denis M.; Pierce, Eric M.

    2010-07-01

    When developing a performance assessment model for the long-term disposal of immobilized low-activity waste (ILAW) glass, it is desirable to determine the durability of glass forms over very long periods of time. However, testing is limited to short time spans, so experiments are performed under conditions that accelerate the key geochemical processes that control weathering. Verification that models currently being used can reliably calculate the long-term behavior of ILAW glass is a key component of the overall PA strategy. Therefore, Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to evaluate alternative strategies that can be used for PA source term model validation. One viable alternative strategy is the use of independent experimental data from archaeological studies of ancient or natural glass contained in the literature. These results represent potential independent experiments that date back approximately 3600 years (ca. 1600 before the current era) in the case of ancient glass and 10^6 years or more in the case of natural glass. The results of this literature review suggest that additional experimental data may be needed before the results from archaeological studies can be used as a tool for model validation of glass weathering and, more specifically, disposal facility performance. This is largely because none of the existing data sets contains all of the information required to conduct PA source term calculations. For example, in many cases the sediments surrounding the glass were not collected and analyzed; the data required to compare against computer simulations of concentration flux are therefore unavailable. This type of information is important to understanding the element release profile from the glass to the surrounding environment and provides a metric that can be used to calibrate source term models. Although useful, the available literature sources do not contain the information needed to simulate the long-term performance of nuclear waste glasses in near-surface or deep geologic repositories. The information that will be required includes 1) experimental measurements to quantify the model parameters, 2) detailed analyses of altered glass samples, and 3) detailed analyses of the sediment surrounding the ancient glass samples.
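
    The PA source-term calculations mentioned here typically rest on a kinetic rate law for glass dissolution. The report excerpt does not reproduce the equation, but a commonly used transition-state-theory form from the glass-durability literature (shown for orientation only; the symbols and the form are not taken from this document) is:

        r = k_0 \, 10^{\eta \cdot \mathrm{pH}} \, \exp\!\left(-\frac{E_a}{RT}\right) \left[ 1 - \left(\frac{Q}{K_g}\right)^{1/\sigma} \right]

    where r is the dissolution rate, k_0 the intrinsic rate constant, \eta the pH power-law coefficient, E_a the activation energy, Q/K_g the ion-activity product relative to the equilibrium constant (the chemical-affinity term), and \sigma the Temkin coefficient. Parameters such as k_0, \eta, and E_a are exactly the "experimental measurements to quantify the model parameters" that the review finds missing from the archaeological data.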

  18. Evaluation of Five Sedimentary Rocks Other Than Salt for Geologic Repository Siting Purposes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croff, A.G.; Lomenick, T.F.; Lowrie, R.S.

    The US Department of Energy (DOE), in order to increase the diversity of rock types under consideration by the geologic disposal program, initiated the Sedimentary Rock Program (SERP), whose immediate objective is to evaluate five types of sedimentary rock - sandstone, chalk, carbonate rocks (limestone and dolostone), anhydrock, and shale - to determine the potential for siting a geologic repository. The evaluation of these five rock types, together with the ongoing salt studies, effectively results in the consideration of all types of relatively impermeable sedimentary rock for repository purposes. The results of this evaluation are expressed in terms of a ranking of the five rock types with respect to their potential to serve as a geologic repository host rock. This comparative evaluation was conducted on a non-site-specific basis, by use of generic information together with rock evaluation criteria (RECs) derived from the DOE siting guidelines for geologic repositories (CFR 1984). An information base relevant to rock evaluation using these RECs was developed in hydrology, geochemistry, rock characteristics (rock occurrences, thermal response, rock mechanics), natural resources, and rock dissolution. Evaluation against postclosure and preclosure RECs yielded a ranking of the five subject rocks with respect to their potential as repository host rocks. Shale was determined to be the most preferred of the five rock types, with sandstone a distant second, the carbonate rocks and anhydrock a more distant third, and chalk a relatively close fourth.

  19. Method and system of integrating information from multiple sources

    DOEpatents

    Alford, Francine A [Livermore, CA; Brinkerhoff, David L [Antioch, CA

    2006-08-15

    A system and method of integrating information from multiple sources in a document centric application system. A plurality of application systems are connected through an object request broker to a central repository. The information may then be posted on a webpage. An example of an implementation of the method and system is an online procurement system.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choung, Sungwook; Um, Wooyong; Pacific Northwest National Laboratory

    Permanent disposal of low- and intermediate-level radioactive wastes in the subterranean environment has been the preferred method of many countries, including Korea. A safety issue after the closure of a geological repository is that biodegradation of organic materials due to microbial activities generates gases that lead to overpressure of the waste containers in the repository and their disintegration with the release of radionuclides. As part of an ongoing large-scale in situ experiment using organic wastes and groundwater to simulate geological radioactive waste repository conditions, we investigated the geochemical alteration and microbial activities at an early stage (~63 days) intended to be representative of the initial period after repository closure. The increased numbers of both aerobes and facultative anaerobes in waste effluents indicate that oxygen content could be the most significant parameter controlling biogeochemical conditions in the very early period of reaction (<35 days). Accordingly, the values of dissolved oxygen and redox potential decreased. The activation of anaerobes after 35 days was supported by the increase in ethanol concentration to ~50 mg L-1. These results suggest that the biogeochemical conditions were rapidly altered to more reducing and anaerobic conditions within the initial 2 months after repository closure. Although no gases were detected during the study, activated anaerobic microbes will play a more important role in gas generation over the long term.

  1. Bio-repository of post-clinical test samples at the national cancer center hospital (NCCH) in Tokyo.

    PubMed

    Furuta, Koh; Yokozawa, Karin; Takada, Takako; Kato, Hoichi

    2009-08-01

    We established the Bio-repository at the National Cancer Center Hospital in October 2002. The main purpose of this article is to show the importance and usefulness of a bio-repository of post-clinical test samples not only for translational cancer research but also for routine clinical oncology by introducing the experience of setting up such a facility. Our basic concept of a post-clinical test sample is not that of left-over waste, but rather of frozen evidence of a patient's pathological condition at a particular point. We can decode most, if not all, of the laboratory data from a post-clinical test sample. As a result, the bio-repository is able to provide not only the samples, but potentially all related laboratory data upon request. The areas of sample coverage are the following: sera after routine blood tests; sera after cross-match tests for transfusion; serum or plasma submitted by the physician at a clinically important time period for the patient; and samples collected by the individual investigator. The formats of stored samples are plasma or serum, dried blood spot (DBS) and buffy coat. So far, 150 218 plasmas or sera, 35 253 DBS and 536 buffy coats have been registered in our bio-repository system. We arranged to provide samples to various concerned parties under strict legal and ethical agreements. Although the number of utilized samples was initially limited, inquiries for sample utilization are now increasing steadily from both research and clinical sources. Further efforts to increase the benefits of the repository are intended.

  2. Spatial Autocorrelation, Source Water and the Distribution of Total and Viable Microbial Abundances within a Crystalline Formation to a Depth of 800 m

    PubMed Central

    Beaton, E. D.; Stuart, Marilyne; Stroes-Gascoyne, Sim; King-Sharp, Karen J.; Gurban, Ioana; Festarini, Amy; Chen, Hui Q.

    2017-01-01

    Proposed radioactive waste repositories require long residence times within deep geological settings for which we have little knowledge of local or regional subsurface dynamics that could affect the transport of hazardous species over the period of radioactive decay. Given the role of microbial processes on element speciation and transport, knowledge and understanding of local microbial ecology within geological formations being considered as host formations can aid predictions for long term safety. In this relatively unexplored environment, sampling opportunities are few and opportunistic. We combined the data collected for geochemistry and microbial abundances from multiple sampling opportunities from within a proposed host formation and performed multivariate mixing and mass balance (M3) modeling, spatial analysis and generalized linear modeling to address whether recharge can explain how subsurface communities assemble within fracture water obtained from multiple saturated fractures accessed by boreholes drilled into the crystalline formation underlying the Chalk River Laboratories site (Deep River, ON, Canada). We found that three possible source waters, each of meteoric origin, explained 97% of the samples, these are: modern recharge, recharge from the period of the Laurentide ice sheet retreat (ca. ∼12000 years before present) and a putative saline source assigned as Champlain Sea (also ca. 12000 years before present). The distributed microbial abundances and geochemistry provide a conceptual model of two distinct regions within the subsurface associated with bicarbonate – used as a proxy for modern recharge – and manganese; these regions occur at depths relevant to a proposed repository within the formation. At the scale of sampling, the associated spatial autocorrelation means that abundances linked with geochemistry were not unambiguously discerned, although fine scale Moran’s eigenvector map (MEM) coefficients were correlated with the abundance data and suggest the action of localized processes possibly associated with the manganese and sulfate content of the fracture water. PMID:28974945
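
    A hedged sketch of the mixing-model (M3) computation behind the source-water attribution: with two conservative tracers and the constraint that fractions sum to one, the contributions of three end-member source waters can be solved directly. The end-member compositions and the sample values below are invented for illustration, not taken from the study.

        import numpy as np

        # rows: tracers (e.g., Cl [mg/L], delta-18O [permil]); columns: end-members
        # (modern recharge, glacial meltwater, Champlain Sea-type saline water)
        E = np.array([[  5.0,   2.0, 4000.0],
                      [-11.0, -17.0,   -9.0]])
        sample = np.array([150.0, -12.0])   # invented fracture-water sample

        # augment with the mass-balance constraint sum(f) = 1 and solve
        A = np.vstack([E, np.ones(3)])
        b = np.append(sample, 1.0)
        fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
        print("mixing fractions:", np.round(fractions, 3))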

  3. Medium- and long-term storage of the Pycnanthemum (Mountain mint) germplasm collection

    USDA-ARS?s Scientific Manuscript database

    The US collection of mountain mint (Pycnanthemum Michx.) is held at the USDA-ARS National Clonal Germplasm Repository (NCGR) in Corvallis, OR as seed, potted plants and tissue cultures and a long-term storage collection is preserved at the USDA-ARS National Center for Genetic Resources Preservation ...

  4. Insights: Future of the national laboratories. National Renewable Energy Laboratory. [The future of the National Renewable Energy (Sources) Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sunderman, D.

    Psychologists tell us that people are born with certain personality traits, such as shyness or boldness, which their environment can encourage, subdue, or even alter. National labs have somewhat similar characteristics. They were created for particular missions and staffed by people who built organizations in which those missions could be fulfilled. As a result, the Department of Energy's (DOE) national labs are among the world's finest repositories of technology and scientific talent, especially in the fields of defense, nuclear weapons, nuclear power, and basic energy. Sunderman, director of the National Renewable Energy Laboratory, discusses the history of the laboratory and its place in the future, both in terms of technologies and nurturing.

  5. Introduction to geospatial semantics and technology workshop handbook

    USGS Publications Warehouse

    Varanka, Dalia E.

    2012-01-01

    The workshop is a tutorial on introductory geospatial semantics with hands-on exercises using standard Web browsers. The workshop is divided into two sections, general semantics on the Web and specific examples of geospatial semantics using data from The National Map of the U.S. Geological Survey and the Open Ontology Repository. The general semantics section includes information and access to publicly available semantic archives. The specific session includes information on geospatial semantics with access to semantically enhanced data for hydrography, transportation, boundaries, and names. The Open Ontology Repository offers open-source ontologies for public use.

  6. Coupling the Mixed Potential and Radiolysis Models for Used Fuel Degradation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buck, Edgar C.; Jerden, James L.; Ebert, William L.

    The primary purpose of this report is to describe the strategy for coupling three process-level models to produce an integrated Used Fuel Degradation Model (FDM). The FDM, which is based on fundamental chemical and physical principles, provides direct calculation of radionuclide source terms for use in repository performance assessments. The G-value for H2O2 production (Gcond) to be used in the Mixed Potential Model (MPM) (H2O2 is the only radiolytic product presently included, but others will be added as appropriate) needs to account for intermediate spur reactions. The effects of these intermediate reactions on [H2O2] are accounted for in the Radiolysis Model (RM). This report details methods for applying RM calculations that encompass the effects of these fast interactions on [H2O2] as the solution composition evolves during successive MPM iterations, and then represent the steady-state [H2O2] in terms of an "effective instantaneous or conditional" generation value (Gcond). It is anticipated that the value of Gcond will change slowly as the reaction progresses through several iterations of the MPM, as changes in the nature of the fuel surface occur. The Gcond values will be calculated with the RM either after several iterations or when concentrations of key reactants reach threshold values determined from previous sensitivity runs. Sensitivity runs with the RM indicate that significant changes in the G-value can occur over narrow composition ranges. The objective of the mixed potential model (MPM) is to calculate used fuel degradation rates for a wide range of disposal environments to provide the source term radionuclide release rates for generic repository concepts. The fuel degradation rate is calculated for chemical and oxidative dissolution mechanisms using mixed potential theory to account for all relevant redox reactions at the fuel surface, including those involving oxidants produced by solution radiolysis and provided by the radiolysis model (RM). The RM calculates the concentration of species generated at any specific time and location from the surface of the fuel. Several options being considered for coupling the RM and MPM are described in the report. Different options have advantages and disadvantages based on the extent of coding that would be required and the ease of use of the final product.
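
    A schematic of the coupling option in which the RM is re-called only when a key reactant concentration crosses a threshold value, as described above. Both model functions are invented stand-ins with made-up dependences, not the actual MPM or RM codes.

        def rm_effective_g(solution):
            """Stand-in RM: effective G-value for H2O2 (invented dependence)."""
            return 1.0 / (1.0 + 10.0 * solution["Fe2+"])

        def mpm_step(solution, g_cond):
            """Stand-in MPM iteration: evolve chemistry at fixed Gcond (invented)."""
            solution["H2O2"] += 1e-9 * g_cond
            solution["Fe2+"] += 1e-4
            return solution

        solution = {"H2O2": 0.0, "Fe2+": 0.0}
        g_cond, threshold, last_fe = rm_effective_g(solution), 5e-4, 0.0

        for step in range(10):
            solution = mpm_step(solution, g_cond)
            if solution["Fe2+"] - last_fe > threshold:   # re-evaluate Gcond
                g_cond, last_fe = rm_effective_g(solution), solution["Fe2+"]

        print(f"final [H2O2] = {solution['H2O2']:.2e} M, Gcond = {g_cond:.3f}")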

  7. Molecular hydrogen: An abundant energy source for bacterial activity in nuclear waste repositories

    NASA Astrophysics Data System (ADS)

    Libert, M.; Bildstein, O.; Esnault, L.; Jullien, M.; Sellier, R.

    A thorough understanding of the energy sources used by microbial systems in the deep terrestrial subsurface is essential since the extreme conditions for life in deep biospheres may serve as a model for possible life in a nuclear waste repository. In this respect, H2 is known as one of the most energetic substrates for deep terrestrial subsurface environments. This hydrogen is produced from abiotic and biotic processes, but its concentration in natural systems is usually maintained at very low levels due to hydrogen-consuming bacteria. A significant amount of H2 gas will be produced within deep nuclear waste repositories, essentially from the corrosion of metallic components. This will consequently improve the conditions for microbial activity in this specific environment. This paper discusses different case studies with experimental results to illustrate the fact that microorganisms are able to use hydrogen for redox processes (reduction of O2, NO3-, Fe(III)) in several waste disposal conditions. Consequences of microbial activity include: alteration of groundwater chemistry and shift in geochemical equilibria, gas production or consumption, biocorrosion, and potential modifications of confinement properties. In order to quantify the impact of hydrogen bacteria, the next step will be to determine the kinetic rate of the reactions in realistic conditions.

  8. Seismic stability of the survey areas of potential sites for the deep geological repository of the spent nuclear fuel

    NASA Astrophysics Data System (ADS)

    Kaláb, Zdeněk; Šílený, Jan; Lednická, Markéta

    2017-07-01

    This paper deals with the seismic stability of the survey areas of potential sites for the deep geological repository of spent nuclear fuel in the Czech Republic. The basic source of data for historical earthquakes up to 1990 was the seismic website [1-]. The most intense earthquake described in the historical period occurred on September 15, 1590 in the Niederroesterreich region (Austria); its reported intensity is Io = 8-9. The source of contemporary seismic data for the period from 1991 to the end of 2014 was the website [11]. Based on the databases and literature review, it may be stated that since 1900 no earthquake exceeding magnitude 5.1 has originated in the territory of the Czech Republic. In order to evaluate seismicity and to assess the impact of seismic effects at the depths of a hypothetical deep geological repository over the next time period, the neo-deterministic method was selected as an extension of the probabilistic method. Each of the seven survey areas was assessed by neo-deterministic evaluation of the seismic wave-field excited by selected individual events and by determining the maximum loading. Results of the seismological database studies and the neo-deterministic analysis of the Čihadlo locality are presented.

  9. Opportunities for the Multi Recycling of Used MOX Fuel in the US - 12122

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, P.; Bailly, F.; Bouvier, E.

    Over the last 50 years the US has accumulated an inventory of used nuclear fuel (UNF) of around 64,000 metric tons (as of 2010), and adds an additional 2,200 metric tons each year from the current fleet of 104 Light Water Reactors. This paper considers a fuel cycle option that would be available for a future pilot U.S. recycling plant that could take advantage of the unique opportunities offered by the age and size of the large U.S. UNF inventory. For the purpose of this scenario, recycling of UNF must use the available reactor infrastructure, currently LWRs, and the main product of recycling is considered to be plutonium (Pu), recycled into MOX fuel for use in these reactors. Use of MOX fuels must provide the service (burn-up) expected by the reactor operator, with the required level of safety. To do so, the fissile material concentration (Pu-239, Pu-241) in the MOX must be high enough to maintain criticality, while, in current recycle facilities, the Pu-238 content has to be kept low enough to prevent excessive heat load, neutron emission, and neutron capture during recycle operations. In most countries, used MOX fuel (MOX UNF) is typically stored after one irradiation in an LWR, pending the development of the GEN IV reactors, since it is considered difficult to directly reuse the recycled MOX fuel in LWRs due to the degraded Pu fissile isotopic composition. In the US, it is possible to blend MOX UNF with LEUOx UNF from the large inventory, using the oldest UNF first. Blending at a ratio of about one MOX UNF assembly with 15 LEUOx UNF assemblies would achieve a fissile plutonium concentration sufficient for reirradiation in new MOX fuel. The Pu-238 yield in the new fuel will be sufficiently low to meet current fuel fabrication standards. Therefore, it should be possible, in the context of the US, for discharged MOX fuel to be recycled back into LWRs using only technologies already industrially deployed worldwide. Building on that possibility, two scenarios are assessed in which the current US inventory is treated and Pu is recycled in LWR MOX fuels: the used MOX fuels are themselves treated in a continuous partitioning-transmutation mode (case 2a), or treatment continues until the whole current UNF inventory (64,000 MT in 2010) has been processed, followed by disposal of the MOX UNF to a geologic repository (case 2b). The benefits achieved are compared with the once-through scenario (case 1), in which UNF in the current US inventory is disposed directly to a geologic repository. For each scenario, the heat load and radioactivity of the high-activity wastes disposed to a geologic repository are calculated, and the savings in natural resources are quantified and compared with the once-through fuel cycle. Assuming an initial pilot recycling facility with a capacity of 800 metric tons a year of heavy metal begins operation in 2030, ∼8 metric tons per year of Pu is recovered from the LEUOx UNF inventory and is used to produce fresh MOX fuels. At a later time, additional treatment and recycling capacities are assumed to begin operation, to accommodate blending and recycling of used MOX Pu, up to a 2,400 MT/yr treatment capacity, enabling UNF to be processed slightly faster than the rate of generation. Results of this scenario analysis study show the flexibility of the recycling scenarios, in which Pu is managed in a way that avoids accumulating used MOX fuels.
    If, at some future date, the decision is made to dispose of the MOX UNF to a geologic repository (case 2b), the scenario is neutral with respect to final repository heat load in comparison to the direct disposal of all UNF (case 1), while reducing natural uranium use, enrichment requirements, UNF accumulation, and the volume of HLW. Further recycling of Pu at the end of the scenario (case 2a) would exhibit further benefits. As expected, Pu-241 and Am-241 are the source of long-term HLW heat load, and Am-241 and Np-237 are the source of long-term radiotoxicity. When advanced technology is available, introduction of minor actinide recycling, in addition to Pu recycling, by the end of this scenario, or sooner, would have a major impact on final repository heat load and on the long-term radiotoxicity of the HLW. This scenario is also compatible with a gradual introduction of a small number of FRs for Pu management. (authors)
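
    A back-of-the-envelope check of the 1:15 blending ratio discussed above. All plutonium contents and fissile fractions are rough placeholders of the kind found in the open literature, not values from the paper.

        mox_assemblies, leu_assemblies = 1, 15
        hm_per_assembly_kg = 450.0            # heavy metal per PWR assembly (placeholder)

        pu_frac = {"mox": 0.07, "leu": 0.011}       # Pu per kg HM in UNF (placeholder)
        fissile_frac = {"mox": 0.55, "leu": 0.65}   # Pu-239+Pu-241 share (placeholder)

        hm = hm_per_assembly_kg * (mox_assemblies + leu_assemblies)
        pu = hm_per_assembly_kg * (mox_assemblies * pu_frac["mox"]
                                   + leu_assemblies * pu_frac["leu"])
        pu_fissile = hm_per_assembly_kg * (
            mox_assemblies * pu_frac["mox"] * fissile_frac["mox"]
            + leu_assemblies * pu_frac["leu"] * fissile_frac["leu"])

        print(f"blend Pu content: {pu / hm:.2%} of HM; "
              f"fissile share of Pu: {pu_fissile / pu:.1%}")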

  10. The Biological Reference Repository (BioR): a rapid and flexible system for genomics annotation.

    PubMed

    Kocher, Jean-Pierre A; Quest, Daniel J; Duffy, Patrick; Meiners, Michael A; Moore, Raymond M; Rider, David; Hossain, Asif; Hart, Steven N; Dinu, Valentin

    2014-07-01

    The Biological Reference Repository (BioR) is a toolkit for annotating variants. BioR stores public and user-specific annotation sources in indexed JSON-encoded flat files (catalogs). The BioR toolkit provides the functionality to combine and retrieve annotation from these catalogs via the command-line interface. Several catalogs from commonly used annotation sources and instructions for creating user-specific catalogs are provided. Commands from the toolkit can be combined with other UNIX commands for advanced annotation processing. We also provide instructions for the development of custom annotation pipelines. The package is implemented in Java and makes use of external tools written in Java and Perl. The toolkit can be executed on Mac OS X 10.5 and above or any Linux distribution. The BioR application, quickstart, and user guide documents and many biological examples are available at http://bioinformaticstools.mayo.edu. © The Author 2014. Published by Oxford University Press.

  11. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  12. 49 CFR Appendix B to Part 564 - Information To Be Submitted for Long Life Replaceable Light Sources of Limited Definition

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... light sources used in motor vehicle headlighting systems. This part also serves as a repository for... standardized sealed beam units used in motor vehicle headlighting systems. § 564.2 Purposes. The purposes of... manufacturing specifications of standardized sealed beam headlamp units used on motor vehicles so that all...

  13. Glass corrosion in natural environments

    NASA Technical Reports Server (NTRS)

    Thorpe, Arthur N.; Barkatt, Aaron

    1992-01-01

    Experiments carried out during the progress period are summarized. Experiments involving glass samples exposed to solutions of Tris have shown the appearance of 'spikes' upon monitoring glass dissolution as a function of time. The periodic 'spikes' observed in Tris-based media were interpreted in terms of cracking due to excessive stress in the surface region of the glass. Studies of the interactions of silicate glasses with metal ions in buffered media were extended to systems containing Al. A CAPS buffer was used to establish the pH. The procedures used are described and the results are given. Preliminary studies were initiated as to the feasibility of adding a slowly dissolving solid compound of the additive to the glass-water system to maintain a supply of dissolved additive. It appears that several magnesium compounds have a suitable combination of solubility and affinity towards silicate glass surfaces to have a pronounced retarding effect on the extraction of uranium from the glass. These preliminary findings raise the possibility that introducing a magnesium source into geologic repositories for nuclear waste glass, in the form of a sparingly soluble Mg-based backfill material, may cause a substantial reduction in the extent of long-term glass corrosion. The studies described also provide mechanistic understanding of the roles of various metal solutes in the leachant. Such understanding forms the basis for developing long-term predictions of nuclear waste glass durability under repository conditions. From what is known about natural highly reduced glasses such as tektites, it is clear that iron is dissolved as ferrous iron with little or no ferric iron. The reducing conditions were strong enough to cause metallic iron to exsolve out of the glass in the form of submicroscopic spherules. As nuclear waste glass is much less reduced, a study was initiated on other natural glasses in addition to the nuclear waste glass. Extensive measurements were carried out on these glasses in order to characterize their magnetic properties. Results of these studies are described.

  14. The Astrobiology Habitable Environments Database (AHED)

    NASA Astrophysics Data System (ADS)

    Lafuente, B.; Stone, N.; Downs, R. T.; Blake, D. F.; Bristow, T.; Fonda, M.; Pires, A.

    2015-12-01

    The Astrobiology Habitable Environments Database (AHED) is a central, high-quality, long-term searchable repository for archiving and collaborative sharing of astrobiologically relevant data, including morphological, textural and contextual images, and chemical, biochemical, isotopic, sequencing, and mineralogical information. The aim of AHED is to foster long-term innovative research by supporting integration and analysis of diverse datasets in order to: 1) help understand and interpret planetary geology; 2) identify and characterize habitable environments and pre-biotic/biotic processes; 3) interpret returned data from present and past missions; 4) provide a citable database of NASA-funded published and unpublished data (after an agreed-upon embargo period). AHED uses the online open-source software "The Open Data Repository's Data Publisher" (ODR - http://www.opendatarepository.org) [1], which provides a user-friendly interface that research teams or individual scientists can use to design, populate and manage their own database according to the characteristics of their data and the need to share data with collaborators or the broader scientific community. This platform can also be used as a laboratory notebook. The database will have the capability to import and export in a variety of standard formats. Advanced graphics will be implemented, including 3D graphing, multi-axis graphs, error bars, and similar scientific data functions, together with advanced online tools for data analysis (e.g., the statistical package R). A permissions system will be put in place so that as data are being actively collected and interpreted, they will remain proprietary. A citation system will allow research data to be used and appropriately referenced by other researchers after the data are made public. This project is supported by the Science-Enabling Research Activity (SERA) and NASA NNX11AP82A, Mars Science Laboratory Investigations. [1] Nate et al. (2015) AGU, submitted.

  15. Preliminary evaluation of solution-mining intrusion into a salt-dome repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-06-01

    This report is the product of the work of an ONWI task force to evaluate inadvertent human intrusion into a salt dome repository by solution mining. It summarizes the work in the following areas: a general review of the levels of defense that could reduce both the likelihood and potential consequences of human intrusion into a salt dome repository; evaluation of a hypothetical intrusion scenario and its consequences; and recommendations for further studies. The conclusions of this task force report can be summarized as follows: (1) it is not possible at present to establish with certainty that solution mining is credible as a human-intrusion event; the likelihood of such an intrusion will depend on the effectiveness of the preventive measures; (2) an example analysis based on the realistic approach is presented in this report; it concluded that the radiological consequences are strongly dependent upon the mode of radionuclide release from the waste form, time after emplacement, package design, impurities in the host salt, the amount of the repository intercepted, the solution mining cavity form, the length of time over which solution mining occurs, the proportion of contaminated salt used for human consumption compared to other sources, and the method of salt purification for culinary purposes; (3) worst-case scenarios from other studies suggest considerable potential for exposures to man, while preliminary evaluations of more realistic cases suggest significantly reduced potential consequences. Mathematical model applications to process systems, guided by more advanced assumptions about human intrusion into geomedia, will shed more light on the potential for concerns and the degree to which mitigative measures will be required.

  16. Using FEP's List and a PA Methodology for Evaluating Suitable Areas for the LLW Repository in Italy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Risoluti, P.; Ciabatti, P.; Mingrone, G.

    2002-02-26

    In Italy, following a referendum held in 1987, nuclear energy has been phased out. Since 1998, a general site selection process covering the whole Italian territory has been under way. A GIS (Geographic Information System) methodology was implemented in three steps using the ESRI Arc/Info and Arc/View platforms. The screening identified approximately 0.8% of the Italian territory as suitable for locating the LLW Repository. 200 areas have been identified as suitable for the location of the LLW Repository, using a multiple exclusion criteria procedure at national (1:500,000), regional (1:100,000) and local (1:25,000-1:10,000) scales. A methodology for evaluating these areas has been developed allowing, along with the evaluation of the long-term efficiency of the engineered barrier system (EBS), the characterization of the selected areas in terms of physical and safety factors and planning factors. The first step was to identify, on a referenced FEPs list, a group of geomorphological, geological, hydrogeological, climatic and human-induced processes and/or events which were considered of importance for the site evaluation, taking into account the Italian situation. A site evaluation system was established ascribing weighted scores to each of these processes and events, which were identified as parameters of the new evaluation system. The score of each parameter ranges from 1 (low suitability) to 3 (high suitability). The corresponding weight is calculated considering the effect of the parameter in terms of total dose to the critical group, using an upgraded AMBER model for PA calculation. At the end of the process, an index obtained by a score-weighted sum gives the degree of suitability of the selected areas for the LLW Repository location. The application of the methodology to two selected sites is given in the paper.
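
    A minimal sketch of the score-weighted suitability index, under the stated convention that each parameter scores from 1 (low suitability) to 3 (high suitability) and weights derive from the parameter's effect on total dose to the critical group. The parameters, scores, and weights below are invented for illustration.

        # Hypothetical FEP-derived parameters with suitability scores (1-3)
        scores  = {"flooding": 3, "erosion": 2, "seismicity": 3, "human_intrusion": 1}
        # Hypothetical dose-derived weights, assumed normalized to sum to 1.0
        weights = {"flooding": 0.35, "erosion": 0.15, "seismicity": 0.30,
                   "human_intrusion": 0.20}

        index = sum(scores[p] * weights[p] for p in scores)
        print(f"suitability index = {index:.2f} (range 1.0 to 3.0)")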

  17. Warehousing re-annotated cancer genes for biomarker meta-analysis.

    PubMed

    Orsini, M; Travaglione, A; Capobianco, E

    2013-07-01

    Translational research in cancer genomics assigns a fundamental role to bioinformatics in support of candidate gene prioritization with regard to both biomarker discovery and target identification for drug development. Efforts in both such directions rely on the existence and constant update of large repositories of gene expression data and omics records obtained from a variety of experiments. Users who interactively interrogate such repositories may have problems in retrieving sample fields that present limited associated information, due for instance to incomplete entries or sometimes unusable files. Cancer-specific data sources present similar problems. Given that source integration usually improves data quality, one of the objectives is keeping the computational complexity sufficiently low to allow an optimal assimilation and mining of all the information. In particular, the scope of integrating intraomics data can be to improve the exploration of gene co-expression landscapes, while the scope of integrating interomics sources can be that of establishing genotype-phenotype associations. Both integrations are relevant to cancer biomarker meta-analysis, as the proposed study demonstrates. Our approach is based on re-annotating cancer-specific data available at the EBI's ArrayExpress repository and building a data warehouse aimed at biomarker discovery and validation studies. Cancer genes are organized by tissue, with biomedical and clinical evidence combined to increase reproducibility and consistency of results. For better comparative evaluation, multiple queries have been designed to efficiently address all types of experiments and platforms, and to allow for retrieval of sample-related information, such as cell line, disease state and clinical aspects. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. Projected environmental impacts of radioactive material transportation to the first US repository site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neuhauser, K.S.; Cashwell, J.W.; Reardon, P.C.

    1986-12-31

    This paper discusses the relative national environmental impacts of transporting nuclear wastes to each of the nine candidate repository sites in the United States. Several of the potential sites are closely clustered and, for the purpose of distance and routing calculations, are treated as a single location. These are: Cypress Creek Dome and Richton Dome in Mississippi (Gulf Interior Region), the Deaf Smith County and Swisher County sites in Texas (Permian Basin), and the Davis Canyon and Lavender Canyon sites in Utah (Paradox Basin). The remaining sites are: Vacherie Dome, Louisiana; Yucca Mountain, Nevada; and the Hanford Reservation, Washington. For compatibility both with the repository system authorized by the NWPA and with the MRS option, two separate scenarios were analyzed. In brief, they are (1) shipment of spent fuel and high-level wastes (HLW) directly from waste generators to a repository (Reference Case) and (2) shipment of spent fuel to a Monitored Retrievable Storage (MRS) facility and then to a repository. Between 17 and 38 truck accident fatalities, between 1.4 and 7.7 rail accident fatalities, and between 0.22 and 12 radiological health effects can be expected to occur as a result of radioactive material transportation during the 26-year operating period of the first repository. During the same period in the United States, about 65,000 total deaths from truck accidents and about 32,000 total deaths from rail accidents would occur; also, an estimated 58,300 cancer fatalities are predicted to occur in the United States during a 26-year period from exposure to background radiation alone (not including medical and other manmade sources). The risks reported here are upper limits and are small by comparison with the "natural background" of risks of the same type. 3 refs., 6 tabs.

  19. Optimizing Data Center Services to Foster Stewardship and Use of Geospatial Data by Heterogeneous Populations of Users

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.; de Sherbinin, A. M.

    2017-12-01

    Growing recognition of the importance of sharing scientific data more widely and openly has refocused attention on the state of data repositories, including both discipline- or topic-oriented data centers and institutional repositories. Data creators often have several alternatives for depositing and disseminating their natural, social, health, or engineering science data. In selecting a repository for their data, data creators and other stakeholders such as their funding agencies may wish to consider the user community or communities served, the type and quality of data products already offered, and the degree of data stewardship and associated services provided. Some data repositories serve general communities, e.g., those in their host institution or region, whereas others tailor their services to particular scientific disciplines or topical areas. Some repositories are selective when acquiring data and conduct extensive curation and reviews to ensure that data products meet quality standards. Many repositories have secured credentials and established a track record for providing trustworthy, high quality data and services. The NASA Socioeconomic Data and Applications Center (SEDAC) serves users interested in human-environment interactions, including researchers, students, and applied users from diverse sectors. SEDAC is selective when choosing data for dissemination, conducting several reviews of data products and services prior to release. SEDAC works with data producers to continually improve the quality of its open data products and services. As a Distributed Active Archive Center (DAAC) of the NASA Earth Observing System Data and Information System, SEDAC is committed to improving the accessibility, interoperability, and usability of its data in conjunction with data available from other DAACs, as well as other relevant data sources. SEDAC is certified as a Regular Member of the International Council for Science World Data System (ICSU-WDS).

  20. SchizConnect: Mediating neuroimaging databases on schizophrenia and related disorders for large-scale integration.

    PubMed

    Wang, Lei; Alpert, Kathryn I; Calhoun, Vince D; Cobia, Derin J; Keator, David B; King, Margaret D; Kogan, Alexandr; Landis, Drew; Tallis, Marcelo; Turner, Matthew D; Potkin, Steven G; Turner, Jessica A; Ambite, Jose Luis

    2016-01-01

    SchizConnect (www.schizconnect.org) is built to address the issues of multiple data repositories in schizophrenia neuroimaging studies. It includes a level of mediation--translating across data sources--so that the user can place one query, e.g. for diffusion images from male individuals with schizophrenia, find out how many matching datasets exist across the participating data sources, and download the imaging and related data. The current version handles the Data Usage Agreements across different studies, as well as interpreting database-specific terminologies into a common framework. New data repositories can also be mediated to bring immediate access to existing datasets. Compared with centralized, upload-based data sharing models, SchizConnect is a unique, virtual database with a focus on schizophrenia and related disorders that can mediate live data as information is updated at each data source. It is our hope that SchizConnect can facilitate testing new hypotheses through aggregated datasets, promoting discovery related to the mechanisms underlying schizophrenic dysfunction. Copyright © 2015 Elsevier Inc. All rights reserved.
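
    A minimal sketch of the mediation step described above, in which one common-vocabulary query is rewritten into each source's local terms before being dispatched; the source names and term mappings here are invented for illustration and are not SchizConnect's actual mediator.

        # Common-vocabulary query: diffusion images from male individuals
        # with schizophrenia (the example used in the abstract).
        query = {"modality": "DTI", "sex": "male", "diagnosis": "schizophrenia"}

        # Per-source terminology mappings (hypothetical).
        MAPPINGS = {
            "source_a": {"modality": {"DTI": "diffusion"},
                         "sex": {"male": "M"},
                         "diagnosis": {"schizophrenia": "SZ"}},
            "source_b": {"modality": {"DTI": "dwi"},
                         "sex": {"male": "1"},
                         "diagnosis": {"schizophrenia": "295.x"}},
        }

        def mediate(query, mappings):
            """Rewrite a common-vocabulary query into each source's local terms."""
            return {source: {field: mapping[field].get(value, value)
                             for field, value in query.items()}
                    for source, mapping in mappings.items()}

        for source, local_query in mediate(query, MAPPINGS).items():
            print(source, local_query)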

  1. Publishing descriptions of non-public clinical datasets: proposed guidance for researchers, repositories, editors and funding organisations.

    PubMed

    Hrynaszkiewicz, Iain; Khodiyar, Varsha; Hufton, Andrew L; Sansone, Susanna-Assunta

    2016-01-01

    Sharing of experimental clinical research data usually happens between individuals or research groups rather than via public repositories, in part due to the need to protect research participant privacy. This approach to data sharing makes it difficult to connect journal articles with their underlying datasets and is often insufficient for ensuring access to data in the long term. Voluntary data sharing services such as the Yale Open Data Access (YODA) and Clinical Study Data Request (CSDR) projects have increased accessibility of clinical datasets for secondary uses while protecting patient privacy and the legitimacy of secondary analyses, but these resources are generally disconnected from journal articles, where researchers typically search for reliable information to inform future research. New scholarly journal and article types dedicated to increasing accessibility of research data have emerged in recent years and, in general, journals are developing stronger links with data repositories. There is a need for increased collaboration between journals, data repositories, researchers, funders, and voluntary data sharing services to increase the visibility and reliability of clinical research. Using the journal Scientific Data as a case study, we propose and show examples of changes to the format and peer-review process for journal articles to more robustly link them to data that are only available on request. We also propose additional features for data repositories to better accommodate non-public clinical datasets, including Data Use Agreements (DUAs).

  2. Citing a Data Repository: A Case Study of the Protein Data Bank

    PubMed Central

    Huang, Yi-Hung; Rose, Peter W.; Hsu, Chun-Nan

    2015-01-01

    The Protein Data Bank (PDB) is the worldwide repository of 3D structures of proteins, nucleic acids and complex assemblies. The PDB's large corpus of data (> 100,000 structures) and related citations provide a well-organized and extensive test set for developing and understanding data citation and access metrics. In this paper, we present a systematic investigation of how authors cite PDB as a data repository. We describe a novel metric based on information cascades, constructed by exploring the citation network, to measure influence between competing works, and apply it to analyze different data citation practices for PDB. Based on this new metric, we found that the original publication of RCSB PDB in the year 2000 continues to attract the most citations even though many follow-up updates have been published. None of these follow-up publications by members of the wwPDB organization can compete with the original publication in terms of citations and influence. Meanwhile, authors increasingly choose to use URLs of PDB in the text instead of citing PDB papers, disrupting the growth of literature citations. A comparison of data usage statistics and paper citations shows that PDB Web access is highly correlated with URL mentions in the text. The results reveal the trend of how authors cite a biomedical data repository and may provide useful insight into how to measure the impact of a data repository. PMID:26317409

  3. Citing a Data Repository: A Case Study of the Protein Data Bank.

    PubMed

    Huang, Yi-Hung; Rose, Peter W; Hsu, Chun-Nan

    2015-01-01

    The Protein Data Bank (PDB) is the worldwide repository of 3D structures of proteins, nucleic acids and complex assemblies. The PDB's large corpus of data (> 100,000 structures) and related citations provide a well-organized and extensive test set for developing and understanding data citation and access metrics. In this paper, we present a systematic investigation of how authors cite PDB as a data repository. We describe a novel metric based on information cascades, constructed by exploring the citation network, to measure influence between competing works, and apply it to analyze different data citation practices for PDB. Based on this new metric, we found that the original publication of RCSB PDB in the year 2000 continues to attract the most citations even though many follow-up updates have been published. None of these follow-up publications by members of the wwPDB organization can compete with the original publication in terms of citations and influence. Meanwhile, authors increasingly choose to use URLs of PDB in the text instead of citing PDB papers, disrupting the growth of literature citations. A comparison of data usage statistics and paper citations shows that PDB Web access is highly correlated with URL mentions in the text. The results reveal the trend of how authors cite a biomedical data repository and may provide useful insight into how to measure the impact of a data repository.

  4. Scalable Metadata Management for a Large Multi-Source Seismic Data Repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaylord, J. M.; Dodge, D. A.; Magana-Zook, S. A.

    In this work, we implemented the key metadata management components of a scalable seismic data ingestion framework to address limitations in our existing system, and to position it for anticipated growth in volume and complexity.

  5. MEASURING THE ACUTE TOXICITY OF ESTUARINE SEDIMENTS

    EPA Science Inventory

    Estuarine sediments frequently are repositories and sources of anthropogenic contaminants. Toxicity is one method of assessing the environmental quality of sediments, yet because of the extreme range of salinities that characterize estuaries few infaunal organisms have both the p...

  6. Long-Term Modeling of Coupled Processes in a Generic Salt Repository for Heat-Generating Nuclear Waste: Analysis of the Impacts of Halite Solubility Constraints

    NASA Astrophysics Data System (ADS)

    Blanco Martin, L.; Rutqvist, J.; Battistelli, A.; Birkholzer, J. T.

    2015-12-01

    Rock salt is a potential medium for the underground disposal of nuclear waste because it has several assets, such as its ability to creep and heal fractures and its water and gas tightness in the undisturbed state. In this research, we focus on disposal of heat-generating nuclear waste and we consider a generic salt repository with in-drift emplacement of waste packages and crushed salt backfill. As the natural salt creeps, the crushed salt backfill gets progressively compacted and an engineered barrier system is subsequently created [1]. The safety requirements for such a repository impose that long time scales be considered, during which the integrity of the natural and engineered barriers has to be demonstrated. In order to evaluate this long-term integrity, we perform numerical modeling based on state-of-the-art knowledge. Here, we analyze the impacts of halite dissolution and precipitation within the backfill and the host rock. For this purpose, we use an enhanced equation-of-state module of TOUGH2 that properly includes temperature-dependent solubility constraints [2]. We perform coupled thermal-hydraulic-mechanical modeling and we investigate the influence of these dissolution and precipitation processes. The TOUGH-FLAC simulator, adapted for large strains and creep, is used [3]. In order to quantify the importance of salt dissolution and precipitation on the effective porosity, permeability, pore pressure, temperature and stress field, we compare numerical results that include or disregard fluids of variable salinity. The sensitivity of the results to some parameters, such as the initial saturation within the backfill, is also addressed. References: [1] Bechthold, W. et al. Backfilling and Sealing of Underground Repositories for Radioactive Waste in Salt (BAMBUS II Project). Report EUR20621 EN: European Atomic Energy Community, 2004. [2] Battistelli A. Improving the treatment of saline brines in EWASG for the simulation of hydrothermal systems. Proceedings, TOUGH Symposium 2012, Lawrence Berkeley National Laboratory, Berkeley, California, Sept. 17-19, 2012. [3] Blanco-Martín L, Rutqvist J, Birkholzer JT. Long-term modelling of the thermal-hydraulic-mechanical response of a generic salt repository for heat generating nuclear waste. Eng Geol 2015;193:198-211. doi:10.1016/j.enggeo.2015.04.014.
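
    The porosity-permeability coupling at stake in the comparison above can be sketched with a generic Kozeny-Carman-type relation, a common choice in such coupled models but not necessarily the one used by the authors; all parameter values below are illustrative.

        def kozeny_carman(k0, phi0, phi):
            """Generic Kozeny-Carman-type permeability update from a porosity change."""
            return k0 * (phi / phi0) ** 3 * ((1.0 - phi0) / (1.0 - phi)) ** 2

        # Illustrative numbers only: halite precipitation fills pore volume
        # in the crushed salt backfill, reducing effective porosity.
        phi0, k0 = 0.30, 1e-14        # initial porosity (-), permeability (m^2)
        precipitated_fraction = 0.05  # pore-volume fraction filled by halite
        phi = phi0 * (1.0 - precipitated_fraction)
        print(f"porosity {phi0:.3f} -> {phi:.3f}, permeability "
              f"{k0:.2e} -> {kozeny_carman(k0, phi0, phi):.2e} m^2")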

  7. Accessing and integrating data and knowledge for biomedical research.

    PubMed

    Burgun, A; Bodenreider, O

    2008-01-01

    To review the issues that have arisen with the advent of translational research in terms of integration of data and knowledge, and to survey current efforts to address these issues. Using examples from the biomedical literature, we identified new trends in biomedical research and their impact on bioinformatics. We analyzed the requirements for effective knowledge repositories and studied issues in the integration of biomedical knowledge. New diagnostic and therapeutic approaches based on gene expression patterns have brought about new issues in the statistical analysis of data, and new workflows are needed to support translational research. Interoperable data repositories based on standard annotations, infrastructures and services are needed to support the pooling and meta-analysis of data, as well as their comparison to earlier experiments. High-quality, integrated ontologies and knowledge bases serve as a source of prior knowledge used in combination with traditional data mining techniques and contribute to the development of more effective data analysis strategies. As biomedical research evolves from traditional clinical and biological investigations towards omics sciences and translational research, specific needs have emerged, including integrating data collected in research studies with patient clinical data, linking omics knowledge with medical knowledge, modeling the molecular basis of diseases, and developing tools that support in-depth analysis of research data. As such, translational research illustrates the need to bridge the gap between bioinformatics and medical informatics, and opens new avenues for biomedical informatics research.

  8. Far-Field Accumulation of Fissile Material From Waste Packages Containing Plutonium Disposition Waste Form

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.P. Nicot

    The objective of this calculation is to estimate the quantity of fissile material that could accumulate in fractures in the rock beneath plutonium-ceramic (Pu-ceramic) and Mixed-Oxide (MOX) waste packages (WPs) as they degrade in the potential monitored geologic repository at Yucca Mountain. This calculation is to feed another calculation (Ref. 31) computing the probability of criticality in the systems described in Section 6, and then ultimately a more general report on the impact of plutonium on the performance of the proposed repository (Ref. 32), both developed concurrently with this work. This calculation is done in accordance with the development plan TDP-DDC-MD-000001 (Ref. 9), item 5. The original document described in item 5 has been split into two documents: this calculation and Ref. 4. The scope of the calculation is limited to only very low flow rates because they lead to the most conservative cases for Pu accumulation and, more generally, are consistent with the way the effluent from the WP (called the source term in this calculation) was calculated (Ref. 4). Ref. 4 ("In-Drift Accumulation of Fissile Material from WPs Containing Plutonium Disposition Waste Forms") details the evolution through time (breach time is the initial time) of the chemical composition of the solution inside the WP as degradation of the fuel and other materials proceeds. That chemical solution is used as the source term in this calculation. Ref. 4 takes the same source term and reacts it with the invert; this calculation reacts it with the rock. In addition to reactions with the rock minerals (which release Si and Ca), the basic mechanisms for actinide precipitation are dilution and mixing with resident water, as explained in Section 2.1.4. No other potential mechanism, such as flow through a reducing zone, is investigated in this calculation. No attempt was made to use the effluent water from the bottom of the invert instead of the effluent water directly from the WP. This calculation supports disposal criticality analysis and has been prepared in accordance with AP-3.12Q, Calculations (Ref. 49). This calculation uses results from Ref. 4 on actinide accumulation in the invert and relies heavily on that calculation. In addition to the information provided here, the reader is referred to Ref. 4 for a more thorough treatment of items applying to both the invert and the fracture system, such as the choice of the thermodynamic database, the composition of J-13 well water, tuff composition, dissolution rate laws, Pu(OH)4 solubility, and the source term composition. The flow conditions (seepage rate, water velocity in fractures) in the drift and the fracture system beneath initially referred to the TSPA-VA because this work was prepared before the release of the work feeding the TSPA-SR. Some new information feeding the TSPA-SR has since been included. Similarly, the soon-to-be-qualified thermodynamic database data0.ymp has not yet been released.

  9. Bridging the Gap: Need for a Data Repository to Support Vaccine Prioritization Efforts*

    PubMed Central

    Madhavan, Guruprasad; Phelps, Charles; Sangha, Kinpritma; Levin, Scott; Rappuoli, Rino

    2015-01-01

    As the mechanisms for discovery, development, and delivery of new vaccines become increasingly complex, strategic planning and priority setting have become ever more crucial. Traditional single value metrics such as disease burden or cost-effectiveness no longer suffice to rank vaccine candidates for development. The Institute of Medicine—in collaboration with the National Academy of Engineering—has developed a novel software system to support vaccine prioritization efforts. The Strategic Multi-Attribute Ranking Tool for Vaccines—SMART Vaccines—allows decision makers to specify their own value structure, selecting from among 28 pre-defined and up to 7 user-defined attributes relevant to the ranking of vaccine candidates. Widespread use of SMART Vaccines will require compilation of a comprehensive data repository for numerous relevant populations—including their demographics, disease burdens and associated treatment costs, as well as characterizing performance features of potential or existing vaccines that might be created, improved, or deployed. While the software contains preloaded data for a modest number of populations, a large gap exists between the existing data and a comprehensive data repository necessary to make full use of SMART Vaccines. While some of these data exist in disparate sources and forms, constructing a data repository will require much new coordination and focus. Finding strategies to bridge the gap to a comprehensive data repository remains the most important task in bringing SMART Vaccines to full fruition, and to support strategic vaccine prioritization efforts in general. PMID:26022565

  10. A strategy to establish Food Safety Model Repositories.

    PubMed

    Plaza-Rodríguez, C; Thoens, C; Falenski, A; Weiser, A A; Appel, B; Kaesbohrer, A; Filter, M

    2015-07-02

    Transferring the knowledge of predictive microbiology into real world food manufacturing applications is still a major challenge for the whole food safety modelling community. To facilitate this process, a strategy for creating open, community driven and web-based predictive microbial model repositories is proposed. These collaborative model resources could significantly improve the transfer of knowledge from research into commercial and governmental applications and also increase efficiency, transparency and usability of predictive models. To demonstrate the feasibility, predictive models of Salmonella in beef previously published in the scientific literature were re-implemented using an open source software tool called PMM-Lab. The models were made publicly available in a Food Safety Model Repository within the OpenML for Predictive Modelling in Food community project. Three different approaches were used to create new models in the model repositories: (1) all information relevant for model re-implementation is available in a scientific publication, (2) model parameters can be imported from tabular parameter collections and (3) models have to be generated from experimental data or primary model parameters. All three approaches were demonstrated in the paper. The sample Food Safety Model Repository is available via: http://sourceforge.net/projects/microbialmodelingexchange/files/models and the PMM-Lab software can be downloaded from http://sourceforge.net/projects/pmmlab/. This work also illustrates that a standardized information exchange format for predictive microbial models, as the key component of this strategy, could be established by adoption of resources from the Systems Biology domain. Copyright © 2015. Published by Elsevier B.V.
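
    As a sketch of what re-implementing a published primary model involves (approach (1) above), the snippet below encodes a three-phase linear growth model of the kind used in predictive microbiology; the parameter values are illustrative, not the published Salmonella-in-beef fits, and PMM-Lab itself is not used here.

        def three_phase_growth(t, log_n0, lag_h, mu_max, log_nmax):
            """Three-phase linear primary growth model (log10 CFU/g versus hours)."""
            if t <= lag_h:
                return log_n0  # lag phase: no growth yet
            # exponential phase, capped at the stationary-phase ceiling
            return min(log_n0 + mu_max * (t - lag_h), log_nmax)

        # Illustrative parameters: initial load, lag time, max growth rate, ceiling.
        for t in range(0, 25, 4):
            print(t, "h:", round(three_phase_growth(t, 2.0, 3.0, 0.25, 9.0), 2),
                  "log10 CFU/g")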

  11. Automated Report Generation for Research Data Repositories: From i2b2 to PDF.

    PubMed

    Thiemann, Volker S; Xu, Tingyan; Röhrig, Rainer; Majeed, Raphael W

    2017-01-01

    We developed an automated toolchain to generate reports from i2b2 data. It is based on free open source software and runs on a Java application server. It is successfully used in an ED registry project. The solution is highly configurable and portable to other projects based on i2b2 or compatible factual data sources.

  12. Limitations on scientific prediction and how they could affect repository licensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Konynenburg, R.A.

    The best possibility for gaining an understanding of the likely future behavior of a high level nuclear waste disposal system is to use the scientific method. However, the scientific approach has inherent limitations when it comes to making long-term predictions with confidence. This paper examines some of these limiting factors as well as the criteria for admissibility of scientific evidence in the legal arena, and concludes that the prospects are doubtful for successful licensing of a potential repository under the regulations that are now being reconsidered. Suggestions are made for remedying this situation.

  13. A Semantically Enabled Metadata Repository for Solar Irradiance Data Products

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Cox, M.; Lindholm, D. M.; Nadiadi, I.; Traver, T.

    2014-12-01

    The Laboratory for Atmospheric and Space Physics, LASP, has been conducting research in Atmospheric and Space science for over 60 years, and providing the associated data products to the public. LASP has a long history, in particular, of making space-based measurements of the solar irradiance, which serves as crucial input to several areas of scientific research, including solar-terrestrial interactions, atmospheric science, and climate research. LISIRD, the LASP Interactive Solar Irradiance Data Center, serves these datasets to the public, including solar spectral irradiance (SSI) and total solar irradiance (TSI) data. The LASP extended metadata repository, LEMR, is a database of information about the datasets served by LASP, such as parameters, uncertainties, temporal and spectral ranges, current version, alerts, etc. It serves as the definitive, single source of truth for that information. The database is populated with information garnered via web forms and automated processes. Dataset owners keep the information current and verified for datasets under their purview. This information can be pulled dynamically for many purposes. Web sites such as LISIRD can include this information in web page content as it is rendered, ensuring users get current, accurate information. It can also be pulled to create metadata records in various metadata formats, such as SPASE (for heliophysics) and ISO 19115. Once these records are made available to the appropriate registries, our data will be discoverable by users coming in via those organizations. The database is implemented as an RDF triplestore, a collection of instances of subject-object-predicate data entities identifiable with a URI. This capability, coupled with SPARQL-over-HTTP read access, enables semantic queries over the repository contents. To create the repository we leveraged VIVO, an open source semantic web application, to manage and create new ontologies and populate repository content. A variety of ontologies were used in creating the triplestore, including ontologies that come with VIVO, such as FOAF. Also, the W3C DCAT ontology was integrated and extended to describe properties of our data products that we needed to capture, such as spectral range. The presentation will describe the architecture, ontology issues, and tools used to create LEMR, and plans for its evolution.
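
    The SPARQL-over-HTTP read access mentioned above can be exercised from a small client like the one below; the endpoint URL and the spectral-range property URIs are placeholders, since the abstract names the extended DCAT ontology but not its identifiers. This assumes the third-party SPARQLWrapper package.

        from SPARQLWrapper import SPARQLWrapper, JSON  # pip install sparqlwrapper

        # Placeholder endpoint and property URIs; the real LEMR identifiers
        # are not given in the abstract.
        sparql = SPARQLWrapper("https://example.org/lemr/sparql")
        sparql.setQuery("""
            PREFIX dcat: <http://www.w3.org/ns/dcat#>
            PREFIX lemr: <http://example.org/lemr#>
            SELECT ?dataset ?minWl ?maxWl WHERE {
                ?dataset a dcat:Dataset ;
                         lemr:spectralRangeMin ?minWl ;
                         lemr:spectralRangeMax ?maxWl .
            }""")
        sparql.setReturnFormat(JSON)
        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["dataset"]["value"],
                  row["minWl"]["value"], row["maxWl"]["value"])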

  14. Implementation of an OAIS Repository Using Free, Open Source Software

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Gessler, P. E.; Seamon, E.

    2015-12-01

    The Northwest Knowledge Network (NKN) is a regional data repository located at the University of Idaho that focuses on the collection, curation, and distribution of research data. To support our home institution and others in the region, we offer services to researchers at all stages of the data lifecycle—from grant application and data management planning to data distribution and archiving. In this role, we recognize the need to work closely with other data management efforts at partner institutions and agencies, as well as with larger aggregation efforts such as our state geospatial data clearinghouses, data.gov, DataONE, and others. In the past, one of our challenges with monolithic, prepackaged data management solutions has been that customization can be difficult to implement and maintain, especially as new versions of the software are released that are incompatible with our local codebase. Our solution is to break the monolith up into its constituent parts, which offers us several advantages. First, any customizations that we make are likely to fall into areas that can be accessed through Application Program Interfaces (API) that are likely to remain stable over time, so our code stays compatible. Second, as components become obsolete or insufficient to meet new demands that arise, we can replace the individual components with minimal effect on the rest of the infrastructure, causing less disruption to operations. Other advantages include increased system reliability, staggered rollout of new features, enhanced compatibility with legacy systems, reduced dependence on a single software company as a point of failure, and the separation of development into manageable tasks. In this presentation, we describe our application of the Service Oriented Architecture (SOA) design paradigm to assemble a data repository that conforms to the Open Archival Information System (OAIS) Reference Model primarily using a collection of free and open-source software. We detail the design of the repository, based upon open standards to support interoperability with other institutions' systems and with future versions of our own software components. We also describe the implementation process, including our use of GitHub as a collaboration tool and code repository.

  15. NA-42 TI Shared Software Component Library FY2011 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.

    The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts with the long-term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy-to-use web-based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed from two different geographic locations, and continues to be used. The knowledge gained from the collaboration and hosting of this repository, in conjunction with PNNL software development and systems engineering capabilities, was used in the selection of a package for the implementation of the software component library on behalf of NA-42 TI. The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI document their development activities. When sufficiently completed, the questionnaire illustrates that the recorded software development activities incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams, and revised versions are distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID/AMS-specific questionnaire are being used as the initial content of the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and install the software needed to host the component library. Additionally, it defines the process by which users request access to contribute and retrieve library content.

  16. Share Repository Framework: Component Specification and Otology

    DTIC Science & Technology

    2008-04-23

    Palantir Technologies has created one such software application to support the DoD intelligence community by providing robust capabilities for...managing data from various sources. The Palantir tool is based on user-defined ontologies and supports multiple representation and analysis tools

  17. EPAs SPECIATE 4.4 Database: Development and Uses

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency’s (EPA) repository of source category-specific particulate matter (PM), volatile organic gas, and other gas speciation profiles of air pollutant emissions. Abt Associates, Inc. developed SPECIATE 4.4 through a collaborat...

  18. SPECIATE - EPA'S DATABASE OF SPECIATED EMISSION PROFILES

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency's (EPA) repository of total organic compound (TOC) and particulate matter (PM) speciation profiles for emissions from air pollution sources. The database has recently been updated and an associated report has recently been re...

  19. Saccadic Eye Movements Impose a Natural Bottleneck on Visual Short-Term Memory

    ERIC Educational Resources Information Center

    Ohl, Sven; Rolfs, Martin

    2017-01-01

    Visual short-term memory (VSTM) is a crucial repository of information when events unfold rapidly before our eyes, yet it maintains only a fraction of the sensory information encoded by the visual system. Here, we tested the hypothesis that saccadic eye movements provide a natural bottleneck for the transition of fragile content in sensory memory…

  20. 36 CFR 79.9 - Standards to determine when a repository possesses the capability to provide adequate long-term...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the particular collection on a long-term basis using professional museum and archival practices; and... security such as locking the items in a safe, vault or museum specimen cabinet, as appropriate; (vi... staff and any consultants who are responsible for managing and preserving the collection to be qualified...

  1. 36 CFR 79.9 - Standards to determine when a repository possesses the capability to provide adequate long-term...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... the particular collection on a long-term basis using professional museum and archival practices; and... security such as locking the items in a safe, vault or museum specimen cabinet, as appropriate; (vi... staff and any consultants who are responsible for managing and preserving the collection to be qualified...

  2. 36 CFR 79.9 - Standards to determine when a repository possesses the capability to provide adequate long-term...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the particular collection on a long-term basis using professional museum and archival practices; and... security such as locking the items in a safe, vault or museum specimen cabinet, as appropriate; (vi... staff and any consultants who are responsible for managing and preserving the collection to be qualified...

  3. 36 CFR 79.9 - Standards to determine when a repository possesses the capability to provide adequate long-term...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the particular collection on a long-term basis using professional museum and archival practices; and... security such as locking the items in a safe, vault or museum specimen cabinet, as appropriate; (vi... staff and any consultants who are responsible for managing and preserving the collection to be qualified...

  4. The Royal Society of Chemistry and the delivery of chemistry data repositories for the community.

    PubMed

    Williams, Antony; Tkachenko, Valery

    2014-10-01

    Since 2009 the Royal Society of Chemistry (RSC) has been delivering access to chemistry data and cheminformatics tools via the ChemSpider database and has garnered a significant community following in terms of usage and contribution to the platform. ChemSpider has focused only on those chemical entities that can be represented as molecular connection tables or, to be more specific, that allow an InChI to be generated from the input structure. As a structure-centric hub, ChemSpider is built around the molecular structure, with other data and links being associated with this structure. As a result, the platform has been limited in the types of data it can manage and in the flexibility of its searches, and it is constrained by its data model. New technologies and approaches, specifically a shift from relational to NoSQL databases and the growing importance of the semantic web, have motivated RSC to rearchitect and create a more generic data repository utilizing these new technologies. This article will provide an overview of our activities in delivering data sharing platforms for the chemistry community, including the development of the new data repository expanding into more extensive domains of chemistry data.

  5. The Royal Society of Chemistry and the delivery of chemistry data repositories for the community

    NASA Astrophysics Data System (ADS)

    Williams, Antony; Tkachenko, Valery

    2014-10-01

    Since 2009 the Royal Society of Chemistry (RSC) has been delivering access to chemistry data and cheminformatics tools via the ChemSpider database and has garnered a significant community following in terms of usage and contribution to the platform. ChemSpider has focused only on those chemical entities that can be represented as molecular connection tables or, to be more specific, that allow an InChI to be generated from the input structure. As a structure-centric hub, ChemSpider is built around the molecular structure, with other data and links being associated with this structure. As a result, the platform has been limited in the types of data it can manage and in the flexibility of its searches, and it is constrained by its data model. New technologies and approaches, specifically a shift from relational to NoSQL databases and the growing importance of the semantic web, have motivated RSC to rearchitect and create a more generic data repository utilizing these new technologies. This article will provide an overview of our activities in delivering data sharing platforms for the chemistry community, including the development of the new data repository expanding into more extensive domains of chemistry data.

  6. Disposal of spent fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blomeke, J O; Ferguson, D E; Croff, A G

    1978-01-01

    Based on preliminary analyses, spent fuel assemblies are an acceptable form for waste disposal. The following studies appear necessary to bring our knowledge of spent fuel as a final disposal form to a level comparable with that of the solidified wastes from reprocessing: 1. A complete systems analysis is needed of spent fuel disposition from reactor discharge to final isolation in a repository. 2. Since it appears desirable to encase the spent fuel assembly in a metal canister, candidate materials for this container need to be studied. 3. It is highly likely that some "filler" material will be needed between the fuel elements and the can. 4. Leachability, stability, and waste-rock interaction studies should be carried out on the fuels. The major disadvantages of spent fuel as a disposal form are the lower maximum heat loading, 60 kW/acre versus 150 kW/acre for high-level waste from a reprocessing plant; the greater long-term potential hazard due to the larger quantities of plutonium and uranium introduced into a repository; and the possibility of criticality in case the repository is breached. The major advantages are the lower cost and increased near-term safety resulting from eliminating reprocessing and the treatment and handling of the wastes therefrom.

  7. MetaboLights: An Open-Access Database Repository for Metabolomics Data.

    PubMed

    Kale, Namrata S; Haug, Kenneth; Conesa, Pablo; Jayseelan, Kalaivani; Moreno, Pablo; Rocca-Serra, Philippe; Nainala, Venkata Chandrasekhar; Spicer, Rachel A; Williams, Mark; Li, Xuefei; Salek, Reza M; Griffin, Julian L; Steinbeck, Christoph

    2016-03-24

    MetaboLights is the first general purpose, open-access database repository for cross-platform and cross-species metabolomics research at the European Bioinformatics Institute (EMBL-EBI). Based upon the open-source ISA framework, MetaboLights provides Metabolomics Standard Initiative (MSI) compliant metadata and raw experimental data associated with metabolomics experiments. Users can upload their study datasets into the MetaboLights Repository. These studies are then automatically assigned a stable and unique identifier (e.g., MTBLS1) that can be used for publication reference. The MetaboLights Reference Layer associates metabolites with metabolomics studies in the archive and is extensively annotated with data fields such as structural and chemical information, NMR and MS spectra, target species, metabolic pathways, and reactions. The database is manually curated with no specific release schedules. MetaboLights is also recommended by journals for metabolomics data deposition. This unit provides a guide to using MetaboLights, downloading experimental data, and depositing metabolomics datasets using user-friendly submission tools. Copyright © 2016 John Wiley & Sons, Inc.
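
    A minimal sketch of fetching one study record by its stable identifier (MTBLS1, the example cited above); the REST path is our assumption about the MetaboLights web service and should be checked against current EBI documentation.

        import requests

        study_id = "MTBLS1"  # stable identifier format described in the abstract
        # Assumed web-service path; verify against current MetaboLights docs.
        url = f"https://www.ebi.ac.uk/metabolights/ws/studies/{study_id}"
        resp = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
        resp.raise_for_status()
        print(resp.json())  # ISA-based study metadata: assays, protocols, files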

  8. A standard-enabled workflow for synthetic biology.

    PubMed

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce these types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in its publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  9. Initial Radionuclide Inventories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, H

    The purpose of this analysis is to provide an initial radionuclide inventory (in grams per waste package) and associated uncertainty distributions for use in the Total System Performance Assessment for the License Application (TSPA-LA) in support of the license application for the repository at Yucca Mountain, Nevada. This document is intended for use in postclosure analysis only. Bounding waste stream information and data were collected that capture probable limits. For commercially generated waste, this analysis considers alternative waste stream projections to bound the characteristics of wastes likely to be encountered, using arrival scenarios that potentially impact the commercial spent nuclear fuel (CSNF) waste stream. For TSPA-LA, this radionuclide inventory analysis considers U.S. Department of Energy (DOE) high-level radioactive waste (DHLW) glass and two types of spent nuclear fuel (SNF): CSNF and DOE-owned (DSNF). These wastes are placed in two groups of waste packages: the CSNF waste package and the codisposal waste package (CDSP), which are designated to contain DHLW glass and DSNF, or DHLW glass only. The radionuclide inventory for naval SNF is provided separately in the classified "Naval Nuclear Propulsion Program Technical Support Document" for the License Application. As noted previously, the radionuclide inventory data presented here are intended only for TSPA-LA postclosure calculations. They are not applicable to preclosure safety calculations. Safe storage, transportation, and ultimate disposal of these wastes require safety analyses to support the design and licensing of repository equipment and facilities. These analyses will require radionuclide inventories to represent the radioactive source term that must be accommodated during handling, storage and disposition of these wastes. This analysis uses the best available information to identify the radionuclide inventory that is expected at the last year of last emplacement, currently identified as 2030 and 2033, depending on the type of waste. TSPA-LA uses the results of this analysis to decay the inventory to the year of repository closure, projected to be 2060.
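
    The final step described above, decaying the inventory from last emplacement to closure, reduces per nuclide to N(t) = N0 * exp(-ln 2 * t / T_half) when daughter ingrowth is neglected; the sketch below applies exactly that simplification, with illustrative gram quantities and half-lives rather than the TSPA-LA values.

        import math

        # Illustrative per-package inventory (grams) and half-lives (years);
        # these are not the TSPA-LA values, and decay-chain ingrowth is ignored.
        inventory = {"Cs-137": (120.0, 30.1), "Pu-239": (4500.0, 24100.0)}

        def decay(grams, half_life_yr, years):
            """Simple exponential decay with no build-up from parent nuclides."""
            return grams * math.exp(-math.log(2.0) * years / half_life_yr)

        years = 2060 - 2033  # last emplacement year to projected closure
        for nuclide, (g0, t_half) in inventory.items():
            print(f"{nuclide}: {g0:.1f} g -> {decay(g0, t_half, years):.1f} g")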

  10. Applying the institutional review board data repository approach to manage ethical considerations in evaluating and studying medical education

    PubMed Central

    Thayer, Erin K.; Rathkey, Daniel; Miller, Marissa Fuqua; Palmer, Ryan; Mejicano, George C.; Pusic, Martin; Kalet, Adina; Gillespie, Colleen; Carney, Patricia A.

    2016-01-01

    Issue: Medical educators and educational researchers continue to improve their processes for managing medical student and program evaluation data using sound ethical principles. This is becoming even more important as curricular innovations are occurring across undergraduate and graduate medical education. Dissemination of findings from this work is critical, and peer-reviewed journals often require an institutional review board (IRB) determination. Approach: IRB data repositories, originally designed for the longitudinal study of biological specimens, can be applied to medical education research. The benefits of such an approach include obtaining expedited review for multiple related studies within a single IRB application and allowing for more flexibility when conducting complex longitudinal studies involving large datasets from multiple data sources and/or institutions. In this paper, we inform educators and educational researchers on our analysis of the use of the IRB data repository approach to manage ethical considerations as part of best practices for amassing, pooling, and sharing data for educational research, evaluation, and improvement purposes. Implications: Fostering multi-institutional studies while following sound ethical principles in the study of medical education is needed, and the IRB data repository approach has many benefits, especially for longitudinal assessment of complex multi-site data. PMID:27443407

  11. Multi-scale groundwater flow modeling during temperate climate conditions for the safety assessment of the proposed high-level nuclear waste repository site at Forsmark, Sweden

    NASA Astrophysics Data System (ADS)

    Joyce, Steven; Hartley, Lee; Applegate, David; Hoek, Jaap; Jackson, Peter

    2014-09-01

    Forsmark in Sweden has been proposed as the site of a geological repository for spent high-level nuclear fuel, to be located at a depth of approximately 470 m in fractured crystalline rock. The safety assessment for the repository has required a multi-disciplinary approach to evaluate the impact of hydrogeological and hydrogeochemical conditions close to the repository and in a wider regional context. Assessing the consequences of potential radionuclide releases requires quantitative site-specific information concerning the details of groundwater flow on the scale of individual waste canister locations (1-10 m) as well as details of groundwater flow and composition on the scale of groundwater pathways between the facility and the surface (500 m to 5 km). The purpose of this article is to provide an illustration of multi-scale modeling techniques and the results obtained when combining aspects of local-scale flows in fractures around a potential contaminant source with regional-scale groundwater flow and transport subject to natural evolution of the system. The approach set out is novel, as it incorporates both different scales of model and different levels of detail, combining discrete fracture network and equivalent continuous porous medium representations of fractured bedrock.

  12. iT2DMS: a Standard-Based Diabetic Disease Data Repository and its Pilot Experiment on Diabetic Retinopathy Phenotyping and Examination Results Integration.

    PubMed

    Wu, Huiqun; Wei, Yufang; Shang, Yujuan; Shi, Wei; Wang, Lei; Li, Jingjing; Sang, Aimin; Shi, Lili; Jiang, Kui; Dong, Jiancheng

    2018-06-06

    Type 2 diabetes mellitus (T2DM) is a common chronic disease, and the fragmented data collected by separate vendors make continuous management of DM patients difficult. The lack of standardization of these fragmented data also hinders further phenotyping based on them. Traditional T2DM data repositories only support data collection from T2DM patients, lack phenotyping ability, and rely on standalone database designs, limiting the secondary use of these valuable data. To solve these issues, we proposed a novel standards-based T2DM data repository framework. This repository can integrate data from various sources and be used as a standardized record for further data transfer as well as integration. Phenotyping was conducted based on clinical guidelines with a KNIME workflow. To evaluate the phenotyping performance of the proposed system, data were collected from a local community by healthcare providers and then tested using the phenotyping algorithms. The results indicated that the proposed system could detect diabetic retinopathy (DR) cases with an average accuracy of about 82.8%, suggesting its promise for addressing fragmented data. The proposed system has integration and phenotyping abilities that could be used for diabetes research in future studies.
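
    A toy version of the guideline-based phenotyping step, of the kind a KNIME workflow might encode as rules; the field names and thresholds below are invented for illustration and are not the iT2DMS criteria.

        def phenotype_dr(record):
            """Toy guideline-style rules for flagging possible diabetic retinopathy.

            Field names and thresholds are illustrative only.
            """
            if record.get("fundus_exam") == "microaneurysms_present":
                return "DR suspected"
            if record.get("hba1c_percent", 0.0) >= 9.0 and record.get("dm_years", 0) >= 10:
                return "high risk - refer for screening"
            return "no DR evidence in record"

        patients = [
            {"id": 1, "fundus_exam": "microaneurysms_present"},
            {"id": 2, "hba1c_percent": 9.5, "dm_years": 12},
            {"id": 3, "hba1c_percent": 6.8, "dm_years": 2},
        ]
        for p in patients:
            print(p["id"], phenotype_dr(p))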

  13. Native American Art and Culture: Documentary Resources.

    ERIC Educational Resources Information Center

    Lawrence, Deirdre

    1992-01-01

    Presents a brief overview of the evolution of documentary material of Native American cultures and problems confronted by researchers in locating relevant information. Bibliographic sources for research are discussed and a directory of major repositories of Native American art documentation is provided. (EA)

  14. Rolling Deck to Repository (R2R): Collaborative Development of Linked Data for Oceanographic Research

    NASA Astrophysics Data System (ADS)

    Arko, Robert; Chandler, Cynthia; Stocks, Karen; Smith, Shawn; Clark, Paul; Shepherd, Adam; Moore, Carla; Beaulieu, Stace

    2013-04-01

    The Rolling Deck to Repository (R2R) program is developing infrastructure to ensure the underway sensor data from U.S. academic oceanographic research vessels are routinely and consistently documented, preserved in long-term archives, and disseminated to the science community. The entire R2R Catalog is published online as a Linked Data collection, making it easily accessible to encourage discovery and integration with data at other repositories. We are developing the R2R Linked Data collection with specific goals in mind: 1.) We facilitate data access and reuse by publishing the richest possible collection of resources to describe vessels, cruises, instruments, and datasets from the U.S. academic fleet, including data quality assessment results and clean trackline navigation; 2.) We facilitate data citation through the entire lifecycle from field acquisition to shoreside archiving to journal articles and global syntheses, by publishing Digital Object Identifiers (DOIs) for datasets and encoding them directly into our Linked Data resources; and 3.) We facilitate federation with other repositories such as the Biological and Chemical Oceanography Data Management Office (BCO-DMO), InterRidge Vents Database, and Index to Marine and Lacustrine Geological Samples (IMLGS), by reciprocal linking between RDF resources and supporting the RDF Query Language. R2R participates in the Ocean Data Interoperability Platform (ODIP), a joint European-U.S.-Australian partnership to facilitate the sharing of data and documentation across international borders. We publish our controlled vocabularies as a Simple Knowledge Organization System (SKOS) concept collection, and are working toward alignment with SeaDataNet and other community-standard terms using the NERC Vocabulary Server (NVS). http://rvdata.us/
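
    A small sketch of consuming such a Linked Data resource with the DOIs encoded directly into it; the inline RDF snippet, the cruise URI, the DOI value, and the choice of dcterms:identifier are assumptions for illustration, not R2R's actual vocabulary. This assumes the third-party rdflib package.

        from rdflib import Graph  # pip install rdflib

        # Tiny inline snippet standing in for one dereferenced cruise resource.
        data = """
        @prefix dcterms: <http://purl.org/dc/terms/> .
        <http://data.rvdata.us/cruise/EXAMPLE>
            dcterms:identifier "doi:10.1234/example" .
        """
        g = Graph()
        g.parse(data=data, format="turtle")

        # Pull back every (resource, DOI) pair for citation purposes.
        q = """PREFIX dcterms: <http://purl.org/dc/terms/>
               SELECT ?cruise ?doi WHERE { ?cruise dcterms:identifier ?doi }"""
        for cruise, doi in g.query(q):
            print(cruise, doi)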

  15. Articles of Data Confederation: DataONE, the KNB, and a multitude of Metacats- scaling interoperable data discovery and preservation from the lab to the Internet

    NASA Astrophysics Data System (ADS)

    Schildhauer, M.; Jones, M. B.; Jones, C. S.; Tao, J.

    2017-12-01

    The opportunities for synthesis science to advance understanding of the environment have never been greater. Challenges remain, however, with regards to preserving data in discoverable and re-usable formats, to inform new integrative analyses, and support reproducible science. In this talk I will describe one promising solution for data preservation, discovery, and re-use: the Knowledge Network for Biocomplexity, or KNB. The KNB (http://knb.ecoinformatics.org) has been providing a reliable data repository for ecological and environmental researchers for over 15 years. The KNB is a distributed, open-source, web-enabled data repository based upon a formal metadata standard, EML, that is endorsed by several major ecological institutions including the LTER Network and NCEAS. A KNB server, also called a "Metacat", can be set up on very modest hardware, typically within a few hours, requires no expensive or proprietary software, and only moderate systems administration expertise. A tiered architecture allows KNB servers (or "Metacats") to communicate with other KNB servers, to afford greater operational reliability, higher performance, and reductions in potential data loss. The KNB is a strong member of the DataONE "Data Observation Network for Earth" (http://dataone.org) system, which confederates over 35 significant earth science data repositories (and still growing) from around the world through an open and powerful API. DataONE provides for integrated search over member repository holdings that incorporates features based on W3C-compliant semantics through annotations with OWL/RDF vocabularies such as PROV and the Environment Ontology, ENVO. The KNB and DataONE frameworks have given rise to an Open Science software development community that is actively building tools based on software that scientists already use, such as MATLAB and R. These tools can be used both to contribute data to, and to operate upon data within, the KNB and DataONE systems. An active User Community within DataONE assists with prioritizing future features of the framework, and provides for peer-to-peer assistance through chat-rooms and email lists. The challenge of achieving long-term sustainable funding for both the KNB and DataONE is still being addressed, and may stimulate discussion towards the end of my talk, time permitting.

  16. 36 CFR 1206.32 - What type of proposal is eligible for a records grant?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... history. (b) The Commission provides support to historical records repositories and other institutions for: (1) Advancing the state of the art in archival and records management and in the long-term...

  17. 36 CFR 1206.32 - What type of proposal is eligible for a records grant?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... history. (b) The Commission provides support to historical records repositories and other institutions for: (1) Advancing the state of the art in archival and records management and in the long-term...

  18. 36 CFR 1206.32 - What type of proposal is eligible for a records grant?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... history. (b) The Commission provides support to historical records repositories and other institutions for: (1) Advancing the state of the art in archival and records management and in the long-term...

  19. The NASA Ames Life Sciences Data Archive: Biobanking for the Final Frontier

    NASA Technical Reports Server (NTRS)

    Rask, Jon; Chakravarty, Kaushik; French, Alison J.; Choi, Sungshin; Stewart, Helen J.

    2017-01-01

    The NASA Ames Institutional Scientific Collection involves the Ames Life Sciences Data Archive (ALSDA) and a biospecimen repository, which are responsible for archiving information and non-human biospecimens collected from spaceflight and matching ground control experiments. The ALSDA also manages a biospecimen sharing program, performs curation and long-term storage operations, and facilitates distribution of biospecimens for research purposes via a public website (https://lsda.jsc.nasa.gov). As part of our best practices, a tissue viability testing plan has been developed for the repository, which will assess the quality of samples subjected to long-term storage. We expect that the test results will confirm usability of the samples, enable broader science community interest, and verify operational efficiency of the archives. This work will also support NASA open science initiatives and guides development of NASA directives and policy for curation of biological collections.

  20. DataMed - an open source discovery index for finding biomedical datasets.

    PubMed

    Chen, Xiaoling; Gururaj, Anupama E; Ozyurt, Burak; Liu, Ruiling; Soysal, Ergin; Cohen, Trevor; Tiryaki, Firat; Li, Yueling; Zong, Nansu; Jiang, Min; Rogith, Deevakar; Salimi, Mandana; Kim, Hyeon-Eui; Rocca-Serra, Philippe; Gonzalez-Beltran, Alejandra; Farcas, Claudiu; Johnson, Todd; Margolis, Ron; Alter, George; Sansone, Susanna-Assunta; Fore, Ian M; Ohno-Machado, Lucila; Grethe, Jeffrey S; Xu, Hua

    2018-01-13

    Finding relevant datasets is important for promoting data reuse in the biomedical domain, but it is challenging given the volume and complexity of biomedical data. Here we describe the development of an open source biomedical data discovery system called DataMed, with the goal of promoting the building of additional data indexes in the biomedical domain. DataMed, which can efficiently index and search diverse types of biomedical datasets across repositories, is developed through the National Institutes of Health-funded biomedical and healthCAre Data Discovery Index Ecosystem (bioCADDIE) consortium. It consists of 2 main components: (1) a data ingestion pipeline that collects and transforms original metadata information to a unified metadata model, called DatA Tag Suite (DATS), and (2) a search engine that finds relevant datasets based on user-entered queries. In addition to describing its architecture and techniques, we evaluated individual components within DataMed, including the accuracy of the ingestion pipeline, the prevalence of the DATS model across repositories, and the overall performance of the dataset retrieval engine. Our manual review shows that the ingestion pipeline could achieve an accuracy of 90% and that core elements of DATS had varied frequency across repositories. On a manually curated benchmark dataset, the DataMed search engine achieved an inferred average precision of 0.2033 and a precision at 10 (P@10, the proportion of relevant results in the top 10 search results) of 0.6022, by implementing advanced natural language processing and terminology services. Currently, we have made the DataMed system publicly available as an open source package for the biomedical community. © The Author 2018. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
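
    The P@10 figure quoted above is simple to reproduce for any ranked result list. The sketch below shows the computation on invented data; it is illustrative only and uses none of the bioCADDIE benchmark's actual rankings or relevance judgments.

        # P@10 = fraction of the top 10 retrieved datasets judged relevant.
        def precision_at_k(ranked_ids, relevant_ids, k=10):
            top_k = ranked_ids[:k]
            return sum(1 for d in top_k if d in relevant_ids) / k

        ranked = [f"ds{i}" for i in range(1, 21)]            # hypothetical ranking
        relevant = {"ds1", "ds2", "ds5", "ds7", "ds9", "ds12"}
        print(precision_at_k(ranked, relevant))              # -> 0.5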

  1. ArrayWiki: an enabling technology for sharing public microarray data repositories and meta-analyses

    PubMed Central

    Stokes, Todd H; Torrance, JT; Li, Henry; Wang, May D

    2008-01-01

    Background A survey of microarray databases reveals that most of the repository contents and data models are heterogeneous (i.e., data obtained from different chip manufacturers), and that the repositories provide only basic biological keywords linking to PubMed. As a result, it is difficult to find datasets using research context or analysis parameters information beyond a few keywords. For example, to reduce the "curse-of-dimension" problem in microarray analysis, the number of samples is often increased by merging array data from different datasets. Knowing chip data parameters such as pre-processing steps (e.g., normalization, artefact removal, etc.), and knowing any previous biological validation of the dataset, is essential due to the heterogeneity of the data. However, most of the microarray repositories do not have meta-data information in the first place, and do not have a mechanism to add or insert this information. Thus, there is a critical need to create "intelligent" microarray repositories that (1) enable update of meta-data with the raw array data, and (2) provide standardized archiving protocols to minimize bias from the raw data sources. Results To address the problems discussed, we have developed a community-maintained system called ArrayWiki that unites disparate meta-data of microarray meta-experiments from multiple primary sources with four key features. First, ArrayWiki provides a user-friendly knowledge management interface in addition to a programmable interface using standards developed by Wikipedia. Second, ArrayWiki includes automated quality control processes (caCORRECT) and novel visualization methods (BioPNG, Gel Plots), which provide extra information about data quality unavailable in other microarray repositories. Third, it provides a user-curation capability through the familiar Wiki interface. Fourth, ArrayWiki provides users with simple text-based searches across all experiment meta-data, and exposes data to search engine crawlers (Semantic Agents) such as Google to further enhance data discovery. Conclusions Microarray data and meta information in ArrayWiki are distributed and visualized using a novel and compact data storage format, BioPNG. Also, they are open to the research community for curation, modification, and contribution. By making a small investment of time to learn the syntax and structure common to all sites running MediaWiki software, domain scientists and practitioners can all contribute to make better use of microarray technologies in research and medical practices. ArrayWiki is available at . PMID:18541053
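
    The abstract does not specify the internal layout of BioPNG, but the general idea of carrying meta-data inside the image file itself can be sketched with standard PNG text chunks. The example below uses Pillow and invented field values; it is a loose analogy, not ArrayWiki's actual BioPNG format.

        from PIL import Image
        from PIL.PngImagePlugin import PngInfo

        # Embed experiment meta-data in PNG text chunks alongside the image data.
        img = Image.new("L", (64, 64))       # stand-in for a probe-intensity image
        meta = PngInfo()
        meta.add_text("platform", "Affymetrix HG-U133A")   # hypothetical values
        meta.add_text("normalization", "RMA")
        meta.add_text("qc_pipeline", "caCORRECT")
        img.save("experiment.png", pnginfo=meta)

        # Read the meta-data back from the file.
        stored = Image.open("experiment.png")
        print(stored.text)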

  2. Connecting the pieces: Using ORCIDs to improve research impact and repositories.

    PubMed

    Baessa, Mohamed; Lery, Thibaut; Grenz, Daryl; Vijayakumar, J K

    2015-01-01

    Quantitative data are crucial in the assessment of research impact in the academic world. However, as a young university created in 2009, King Abdullah University of Science and Technology (KAUST) needs to aggregate bibliometrics from researchers coming from diverse origins, not necessarily with the proper affiliations. In this context, the University launched an institutional repository in September 2012 with the objective of creating a home for the intellectual outputs of KAUST researchers. Later, the university adopted the first mandated institutional open access policy in the Arab region, effective June 2014. Several projects were then initiated in order to accurately identify the research being done by KAUST authors and bring it into the repository in accordance with the open access policy. Integration with ORCID has been a key element in this process and the best way to ensure data quality for researchers' scientific contributions. It included the systematic inclusion and creation, if necessary, of ORCID identifiers in the existing repository system, an institutional membership in ORCID, and the creation of dedicated integration tools. In addition, in cooperation with the Office of Research Evaluation, the Library worked on implementing a Current Research Information System (CRIS) as a standardized common resource to monitor KAUST research outputs. We will present our findings about the CRIS implementation, the ORCID API, and the repository statistics, as well as our approach to assessing research impact in terms of usage by the global research community.
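
    For readers unfamiliar with the ORCID integration mentioned above, the sketch below pulls a researcher's list of works from ORCID's public API. The endpoint and response structure follow ORCID's documented v3.0 public API; the ORCID iD used is a well-known public example record, and the exact JSON paths should be verified against the current API documentation.

        import requests

        orcid_id = "0000-0003-1419-2405"        # example public ORCID record
        resp = requests.get(
            f"https://pub.orcid.org/v3.0/{orcid_id}/works",
            headers={"Accept": "application/json"},
            timeout=30,
        )
        resp.raise_for_status()
        # Works are grouped by external identifier; print one summary per group.
        for group in resp.json().get("group", []):
            summary = group["work-summary"][0]
            print(summary.get("put-code"), "-", summary["title"]["title"]["value"])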

  3. [Self-archiving of biomedical papers in open access repositories].

    PubMed

    Abad-García, M Francisca; Melero, Remedios; Abadal, Ernest; González-Teruel, Aurora

    2010-04-01

    Open-access literature is digital, online, free of charge, and free of most copyright and licensing restrictions. Self-archiving, or deposit of scholarly outputs in institutional repositories (the open-access "green route"), is increasingly present in the activities of the scientific community. Besides the benefits of open access for the visibility and dissemination of science, funding agencies increasingly require the deposit of papers and other documents in repositories. In the biomedical environment this is even more relevant because of the impact scientific literature can have on public health. However, to make self-archiving feasible, authors should be aware of its meaning and of the terms under which they are allowed to archive their works. Tools such as Sherpa/RoMEO or DULCINEA (both directories of the copyright licences of scientific journals) help authors find out what rights they retain when they publish a paper and whether those rights permit self-archiving. PubMed Central and its British and Canadian counterparts are the main thematic repositories for the biomedical fields. In our country there is no repository of a similar nature, but most universities, as well as the CSIC, have already created their own institutional repositories. The increased visibility of research results, and their impact in the form of greater and earlier citation, is one of the most frequently cited advantages of open access; the removal of economic barriers to information access is a further benefit that breaks down borders between groups.

  4. [Radiobiological Human Tissue repository: progress and perspectives for solving the problems of radiation safety and health protection of personnel and population].

    PubMed

    Kirillova, E N; Romanov, S A; Loffredo, C A; Zakharova, M L; Revina, V S; Sokolova, S N; Goerlitz, D S; Zubkova, O V; Lukianova, T V; Uriadnitzkaia, T I; Pavlova, O S; Slukinova, U V; Kolosova, A V; Muksinova, K N

    2014-01-01

    The Radiobiological Human Tissue Repository was established to obtain and store biological material from Mayak PA workers occupationally exposed to ionizing (α- and/or γ-) radiation over a wide dose range and from residents exposed to long-term radiation due to radiation accidents, and to transfer samples to scientists studying the effects of radiation on people and their offspring. The accumulated biomaterial constitutes an informational and research resource that forms the basis for the work of scientists in different fields of biology and medicine. The repository comprises five sections: tumor and non-tumor tissues obtained in the course of autopsies, biopsies, and surgeries; samples of blood and its components; DNA; induced sputum; saliva; and other material from people exposed or unexposed (controls) to radiation. The biomaterial is stored in formalin, in paraffin blocks, and on slides, as well as in freezers at low temperatures. All the information on the samples and the registrants (medical, dosimetry, demographic, and occupational data) has been entered into an electronic database. A continuously updated website was developed so that scientists from Russia and abroad can become acquainted with the material and apply for biosamples. Some data obtained in the course of scientific research based on biomaterial from the repository are briefly reviewed.

  5. Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure

    DTIC Science & Technology

    2016-05-09

    Appel, Amanda S.; McDonough, John H.; Joseph D. ... In this study, hair was evaluated as a long-term repository of nerve agent hydrolysis products. Pinacolyl methylphosphonic acid (PMPA; hydrolysis product of soman) and isopropyl methylphosphonic acid (IMPA; hydrolysis product of sarin) were extracted from hair samples with N,N...

  6. SeeStar: an open-source, low-cost imaging system for subsea observations

    NASA Astrophysics Data System (ADS)

    Cazenave, F.; Kecy, C. D.; Haddock, S.

    2016-02-01

    Scientists and engineers at the Monterey Bay Aquarium Research Institute (MBARI) have collaborated to develop SeeStar, a modular, lightweight, self-contained, low-cost subsea imaging system for short- to long-term monitoring of marine ecosystems. SeeStar is composed of separate camera, battery, and LED lighting modules. Two versions of the system exist: one rated to 300 meters depth, the other rated to 1500 meters. Users can download plans and instructions from an online repository and build the system using low-cost off-the-shelf components. The system utilizes an easily programmable Arduino-based controller and the widely distributed GoPro camera. It can be deployed in a variety of scenarios, taking still images and video, and can be operated either autonomously or tethered on a range of platforms, including ROVs, AUVs, landers, piers, and moorings. Several SeeStar systems have been built and used for scientific studies and engineering tests. The long-term goal of this project is a widely distributed marine imaging network across thousands of locations, to develop baselines of biological information.

  7. Establishment and Maintenance of Primary Fibroblast Repositories for Rare Diseases-Friedreich's Ataxia Example.

    PubMed

    Li, Yanjie; Polak, Urszula; Clark, Amanda D; Bhalla, Angela D; Chen, Yu-Yun; Li, Jixue; Farmer, Jennifer; Seyer, Lauren; Lynch, David; Butler, Jill S; Napierala, Marek

    2016-08-01

    Friedreich's ataxia (FRDA) represents a rare neurodegenerative disease caused by expansion of GAA trinucleotide repeats in the first intron of the FXN gene. The number of GAA repeats in FRDA patients varies from approximately 60 to <1000 and is tightly correlated with age of onset and severity of the disease symptoms. The heterogeneity of Friedreich's ataxia stresses the need for a large cohort of patient samples to conduct studies addressing the mechanism of disease pathogenesis or evaluating novel therapeutic candidates. Herein, we report the establishment and characterization of an FRDA fibroblast repository, which currently includes 50 primary cell lines derived from FRDA patients and seven lines from mutation carriers. These cells are also a source for generating induced pluripotent stem cell (iPSC) lines by reprogramming, as well as disease-relevant neuronal, cardiac, and pancreatic cells that can then be differentiated from the iPSCs. All FRDA and carrier lines are derived using a standard operating procedure and characterized to confirm mutation status, as well as expression of FXN mRNA and protein. Consideration and significance of creating disease-focused cell line and tissue repositories, especially in the context of rare and heterogeneous disorders, are presented. Although the economic aspect of creating and maintaining such repositories is important, the benefits of easy access to a collection of well-characterized cell lines for the purpose of drug discovery or disease mechanism studies overshadow the associated costs. Importantly, all FRDA fibroblast cell lines collected in our repository are available to the scientific community.

  8. Establishment and Maintenance of Primary Fibroblast Repositories for Rare Diseases—Friedreich's Ataxia Example

    PubMed Central

    Li, Yanjie; Polak, Urszula; Clark, Amanda D.; Bhalla, Angela D.; Chen, Yu-Yun; Li, Jixue; Farmer, Jennifer; Seyer, Lauren; Lynch, David

    2016-01-01

    Friedreich's ataxia (FRDA) represents a rare neurodegenerative disease caused by expansion of GAA trinucleotide repeats in the first intron of the FXN gene. The number of GAA repeats in FRDA patients varies from approximately 60 to <1000 and is tightly correlated with age of onset and severity of the disease symptoms. The heterogeneity of Friedreich's ataxia stresses the need for a large cohort of patient samples to conduct studies addressing the mechanism of disease pathogenesis or evaluating novel therapeutic candidates. Herein, we report the establishment and characterization of an FRDA fibroblast repository, which currently includes 50 primary cell lines derived from FRDA patients and seven lines from mutation carriers. These cells are also a source for generating induced pluripotent stem cell (iPSC) lines by reprogramming, as well as disease-relevant neuronal, cardiac, and pancreatic cells that can then be differentiated from the iPSCs. All FRDA and carrier lines are derived using a standard operating procedure and characterized to confirm mutation status, as well as expression of FXN mRNA and protein. Consideration and significance of creating disease-focused cell line and tissue repositories, especially in the context of rare and heterogeneous disorders, are presented. Although the economic aspect of creating and maintaining such repositories is important, the benefits of easy access to a collection of well-characterized cell lines for the purpose of drug discovery or disease mechanism studies overshadow the associated costs. Importantly, all FRDA fibroblast cell lines collected in our repository are available to the scientific community. PMID:27002638

  9. EPA’s SPECIATE 4.4 Database - Development and Uses

    EPA Science Inventory

    SPECIATE is the EPA's repository of TOG, PM, and Other Gases speciation profiles of air pollution sources. It includes weight fractions of both organic species and PM and provides data in consistent units. Species include metals, ions, elements, and organic and inorganic compound...

  10. Scrubchem: Building Bioactivity Datasets from Pubchem Bioassay Data (SOT)

    EPA Science Inventory

    The PubChem Bioassay database is a non-curated public repository with data from 64 sources, including: ChEMBL, BindingDb, DrugBank, EPA Tox21, NIH Molecular Libraries Screening Program, and various other academic, government, and industrial contributors. Methods for extracting th...

  11. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  12. Analog earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  13. HEPData: a repository for high energy physics data

    NASA Astrophysics Data System (ADS)

    Maguire, Eamonn; Heinrich, Lukas; Watt, Graeme

    2017-10-01

    The Durham High Energy Physics Database (HEPData) has been built up over the past four decades as a unique open-access repository for scattering data from experimental particle physics papers. It comprises data points underlying several thousand publications. Over the last two years, the HEPData software has been completely rewritten using modern computing technologies as an overlay on the Invenio v3 digital library framework. The software is open source with the new site available at https://hepdata.net now replacing the previous site at http://hepdata.cedar.ac.uk. In this write-up, we describe the development of the new site and explain some of the advantages it offers over the previous platform.

  14. SchizConnect: Mediating Neuroimaging Databases on Schizophrenia and Related Disorders for Large-Scale Integration

    PubMed Central

    Wang, Lei; Alpert, Kathryn I.; Calhoun, Vince D.; Cobia, Derin J.; Keator, David B.; King, Margaret D.; Kogan, Alexandr; Landis, Drew; Tallis, Marcelo; Turner, Matthew D.; Potkin, Steven G.; Turner, Jessica A.; Ambite, Jose Luis

    2015-01-01

    SchizConnect (www.schizconnect.org) was built to address the issue of multiple data repositories in schizophrenia neuroimaging studies. It includes a level of mediation—translating across data sources—so that the user can place one query, e.g. for diffusion images from male individuals with schizophrenia, and find out from across participating data sources how many matching datasets there are, as well as download the imaging and related data. The current version handles the Data Usage Agreements across different studies and interprets database-specific terminologies into a common framework. New data repositories can also be mediated to bring immediate access to existing datasets. Compared with centralized, upload-based data sharing models, SchizConnect is a unique virtual database, with a focus on schizophrenia and related disorders, that can mediate live data as information is updated at each data source. It is our hope that SchizConnect can facilitate the testing of new hypotheses through aggregated datasets, promoting discovery related to the mechanisms underlying schizophrenic dysfunction. PMID:26142271
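
    A toy sketch of the mediation idea described above: one query phrased in a common vocabulary is translated into each source's local terminology before being dispatched. All mappings and source names below are invented for illustration and are not SchizConnect's actual code or schema.

        # Map common-vocabulary query terms to per-source local filters.
        COMMON_TO_LOCAL = {
            "diagnosis:schizophrenia": {
                "source_a": "dx_code = 'SCZ'",
                "source_b": "diagnosis_label = 'Schizophrenia, strict'",
            },
            "modality:DTI": {
                "source_a": "scan_type = 'DTI'",
                "source_b": "protocol LIKE '%diffusion%'",
            },
        }

        def mediate(common_terms):
            """Translate common terms into a per-source query plan."""
            plans = {}
            for term in common_terms:
                for source, local in COMMON_TO_LOCAL[term].items():
                    plans.setdefault(source, []).append(local)
            return plans

        print(mediate(["diagnosis:schizophrenia", "modality:DTI"]))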

  15. A large-scale solar dynamics observatory image dataset for computer vision applications.

    PubMed

    Kucuk, Ahmet; Banda, Juan M; Angryk, Rafal A

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Solar Dynamics Observatory (SDO) mission has given us unprecedented insight into the Sun's activity. By capturing approximately 70,000 images a day, this mission has created one of the richest and largest repositories of solar image data available to mankind. With such massive amounts of information, researchers have been able to produce great advances in detecting solar events. In this resource, we compile SDO solar data into a single repository in order to provide the computer vision community with a standardized and curated large-scale dataset of several hundred thousand solar events found on high-resolution solar images. This publicly available resource, along with the generation source code, will accelerate computer vision research on NASA's solar image data by reducing the amount of time spent performing data acquisition and curation from the multiple sources we have compiled. By improving the quality of the data with thorough curation, we anticipate wider adoption and interest from the computer vision and solar physics communities.

  16. Optimal operation management of fuel cell/wind/photovoltaic power sources connected to distribution networks

    NASA Astrophysics Data System (ADS)

    Niknam, Taher; Kavousifard, Abdollah; Tabatabaei, Sajad; Aghaei, Jamshid

    2011-10-01

    In this paper a new multiobjective modified honey bee mating optimization (MHBMO) algorithm is presented to investigate the distribution feeder reconfiguration (DFR) problem considering renewable energy sources (RESs) (photovoltaics, fuel cells and wind energy) connected to the distribution network. The objective functions to be minimized are the electrical active power losses, the voltage deviations, the total electrical energy costs and the total emissions of RESs and substations. During the optimization process, the proposed algorithm finds a set of non-dominated (Pareto) optimal solutions which are stored in an external memory called the repository. Since the objective functions are of different natures, a fuzzy clustering algorithm is utilized to keep the size of the repository within specified limits. Moreover, a fuzzy-based decision maker is adopted to select the 'best' compromise solution among the non-dominated optimal solutions of the multiobjective optimization problem. To demonstrate the feasibility and effectiveness of the proposed algorithm, two standard distribution test systems are used as case studies.
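
    The external repository described above holds only non-dominated solutions. Below is a minimal sketch of that bookkeeping, assuming all objectives are minimized and using invented objective vectors (losses, voltage deviation, cost):

        def dominates(a, b):
            """a dominates b if it is no worse in every objective and strictly
            better in at least one (minimization)."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def update_repository(solutions):
            """Keep only the non-dominated (Pareto) members of a solution set."""
            return [s for s in solutions
                    if not any(dominates(t, s) for t in solutions if t is not s)]

        candidates = [(0.8, 0.02, 110.0), (0.7, 0.03, 120.0), (0.9, 0.02, 115.0)]
        print(update_repository(candidates))   # third vector is dominated by the first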

  17. 36 CFR 1206.32 - What type of proposal is eligible for a records grant?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... history. (b) The Commission provides support to historical records repositories and other institutions for: (1) Advancing the state of the art in archival and records management and in the long-term...

  18. Natural analogues for processes affecting disposal of high-level radioactive waste in the vadose zone

    NASA Astrophysics Data System (ADS)

    Stuckless, J. S.

    2003-04-01

    Natural analogues can contribute to understanding and predicting the performance of subsystems and processes affecting a mined geologic repository for high-level radioactive waste in several ways. Most importantly, analogues provide tests for various aspects of systems of a repository at dimensional scales and time spans that cannot be attained by experimental study. In addition, they provide a means for the general public to judge the predicted performance of a potential high-level nuclear waste repository in familiar terms, such that the average person can assess the anticipated long-term performance and other scientific conclusions. Hydrologists working on the Yucca Mountain Project (currently the U.S. Department of Energy's Office of Repository Development) have modeled the flow of water through the vadose zone at Yucca Mountain, Nevada, and particularly the interaction of vadose-zone water with mined openings. Analogues from both natural and anthropogenic examples confirm the prediction that most of the water moving through the vadose zone will move through the host rock and around tunnels. This can be seen both quantitatively, where direct comparison between seepage and net infiltration has been made, and qualitatively, in the excellent degree of preservation of archaeologic artifacts in underground openings. The latter include Paleolithic cave paintings in southwestern Europe, murals and artifacts in Egyptian tombs, painted subterranean Buddhist temples in India and China, and painted underground churches in Cappadocia, Turkey. Natural analogues also suggest that this diversion mechanism is more effective in porous media than in fractured media. Observations from natural analogues are also consistent with the modeled decrease in the percentage of infiltration that becomes seepage with a decrease in the amount of infiltration. Finally, analogues, such as tombs that have been partially filled by mud flows, suggest that the same capillary forces that keep water in the rock around underground openings will draw water towards buried waste packages if they are encased in backfill. Analogue work in support of the U.S. repository program continues in the U.S. Geological Survey, in cooperation with the U.S. Department of Energy.

  19. Master Metadata Repository and Metadata-Management System

    NASA Technical Reports Server (NTRS)

    Armstrong, Edward; Reed, Nate; Zhang, Wen

    2007-01-01

    A master metadata repository (MMR) software system manages the storage and searching of metadata pertaining to data from national and international satellite sources of the Global Ocean Data Assimilation Experiment (GODAE) High Resolution Sea Surface Temperature Pilot Project (GHRSST-PP). These sources produce a total of hundreds of data files daily, each file classified as one of more than ten data products representing global sea-surface temperatures. The MMR is a relational database wherein the metadata are divided into granule-level records [denoted file records (FRs)] for individual satellite files and collection-level records [denoted data set descriptions (DSDs)] that describe metadata common to all the files from a specific data product. FRs and DSDs adhere to the NASA Directory Interchange Format (DIF). The FRs and DSDs are contained in separate subdatabases linked by a common field. The MMR is configured in MySQL database software with custom Practical Extraction and Reporting Language (PERL) programs to validate and ingest the metadata records. The database contents are converted into the Federal Geographic Data Committee (FGDC) standard format by use of the Extensible Markup Language (XML). A Web interface enables users to search for availability of data from all sources.
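
    The two linked sub-databases described above can be sketched as a pair of tables joined on a common field. The sketch below uses SQLite for self-containment and invented column names; the real MMR runs on MySQL with Perl ingest scripts, and its actual schema is not given in the abstract.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE dsd (                 -- collection-level data set descriptions
            product_id   TEXT PRIMARY KEY, -- common linking field
            title        TEXT,
            source       TEXT
        );
        CREATE TABLE fr (                  -- granule-level file records
            file_id      TEXT PRIMARY KEY,
            product_id   TEXT REFERENCES dsd(product_id),
            start_time   TEXT,
            spatial_bbox TEXT
        );
        """)
        con.execute("INSERT INTO dsd VALUES ('GOES12-SST', 'GOES-12 SST', 'NOAA')")
        con.execute("INSERT INTO fr VALUES ('f001', 'GOES12-SST', "
                    "'2007-01-01T00:00Z', '-180,-90,180,90')")
        print(con.execute(
            "SELECT fr.file_id, dsd.title FROM fr JOIN dsd USING (product_id)"
        ).fetchall())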

  20. The igmspec database of public spectra probing the intergalactic medium

    NASA Astrophysics Data System (ADS)

    Prochaska, J. X.

    2017-04-01

    We describe v02 of igmspec, a database of publicly available ultraviolet, optical, and near-infrared spectra that probe the intergalactic medium (IGM). This database, a child of the specdb repository in the specdb github organization, comprises 403 277 unique sources and 434 686 spectra obtained with the world's greatest observatories. All of these data are distributed in a single ≈ 25GB HDF5 file maintained at the University of California Observatories and the University of California, Santa Cruz. The specdb software package includes Python scripts and modules for searching the source catalog and spectral datasets, and software links to the linetools package for spectral analysis. The repository also includes software to generate private spectral datasets that are compliant with International Virtual Observatory Alliance (IVOA) protocols and a Python-based interface for IVOA Simple Spectral Access queries. Future versions of igmspec will ingest other sources (e.g. gamma-ray burst afterglows) and other surveys as they become publicly available. The overall goal is to include every spectrum that effectively probes the IGM. Future databases of specdb may include publicly available galaxy spectra (exgalspec) and published supernovae spectra (snspec). The community is encouraged to join the effort on github: https://github.com/specdb.
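
    Because the release ships as a single HDF5 file, it can be inspected with generic tools even without the specdb package. The sketch below uses h5py; the file name and the internal group/dataset names (e.g. a top-level "catalog" table) are assumptions to check against the specdb documentation, whose own query classes are the supported interface.

        import h5py

        # Explore an igmspec-style HDF5 release with plain h5py.
        with h5py.File("igmspec_v02.hdf5", "r") as f:
            f.visit(print)                    # walk and print the group hierarchy
            if "catalog" in f:
                catalog = f["catalog"][:]     # structured array of source entries
                print(catalog.dtype.names)    # column names, e.g. coordinates, redshift
                print("sources:", len(catalog))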

  1. The Astrophysics Source Code Library: Where Do We Go from Here?

    NASA Astrophysics Data System (ADS)

    Allen, A.; Berriman, B.; DuPrie, K.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallen, J.

    2014-05-01

    The Astrophysics Source Code Library (ASCL), started in 1999, has in the past three years grown from a repository for 40 codes to a registry of over 700 codes that are now indexed by ADS. What comes next? We examine the future of the ASCL, the challenges facing it, the rationale behind its practices, and the need to balance what we might do with what we have the resources to accomplish.

  2. EPA’s SPECIATE 4.4 Database: Development and Uses

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency's (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles for air pollution sources. EPA released SPECIATE 4.4 in early 2014 and, in total, the SPECIATE 4.4 database includes 5,728 PM, VOC, total...

  3. EPA’s SPECIATE 4.4 Database: Development and Uses

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency's (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles for air pollution sources. EPA released SPECIATE 4.4 in early 2014 and, in total, the SPECIATE 4.4 database includes 5,728 PM, VOC, total...

  4. 17 CFR 45.6 - Legal entity identifiers

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... applied to swap data repositories by part 49 of this chapter. (4) Open Source. The schema for the legal... Section 45.6 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP DATA... to the jurisdiction of the Commission shall be identified in all recordkeeping and all swap data...

  5. 17 CFR 45.6 - Legal entity identifiers

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... applied to swap data repositories by part 49 of this chapter. (4) Open Source. The schema for the legal... Section 45.6 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP DATA... to the jurisdiction of the Commission shall be identified in all recordkeeping and all swap data...

  6. ACToR: Aggregated Computational Toxicology Resource (T)

    EPA Science Inventory

    The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has 3 main goals: (1) To serve as a repository of public toxicology information ...

  7. 10 CFR 60.3 - License required.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false License required. 60.3 Section 60.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES General Provisions § 60.3 License required. (a) DOE shall not receive or possess source, special nuclear, or...

  8. 10 CFR 60.3 - License required.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false License required. 60.3 Section 60.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES General Provisions § 60.3 License required. (a) DOE shall not receive or possess source, special nuclear, or...

  9. 10 CFR 60.3 - License required.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false License required. 60.3 Section 60.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES General Provisions § 60.3 License required. (a) DOE shall not receive or possess source, special nuclear, or...

  10. SPECIATE 4.0: SPECIATION DATABASE DEVELOPMENT DOCUMENTATION--FINAL REPORT

    EPA Science Inventory

    SPECIATE is the U.S. EPA's repository of total organic compounds (TOC) and particulate matter (PM) speciation profiles of air pollution sources. This report documents how EPA developed the SPECIATE 4.0 database that replaces the prior version, SPECIATE 3.2. SPECIATE 4.0 includes ...

  11. Use of groundwater lifetime expectancy for the performance assessment of a deep geologic waste repository: 1. Theory, illustrations, and implications

    NASA Astrophysics Data System (ADS)

    Cornaton, F. J.; Park, Y.-J.; Normani, S. D.; Sudicky, E. A.; Sykes, J. F.

    2008-04-01

    Long-term solutions for the disposal of toxic wastes usually involve isolation of the wastes in a deep subsurface geologic environment. In the case of spent nuclear fuel, if radionuclide leakage occurs from the engineered barrier, the geological medium represents the ultimate barrier that is relied upon to ensure safety. Consequently, an evaluation of radionuclide travel times from a repository to the biosphere is critically important in a performance assessment analysis. In this study, we develop a travel time framework based on the concept of groundwater lifetime expectancy as a safety indicator. Lifetime expectancy characterizes the time that radionuclides will spend in the subsurface after their release from the repository and prior to discharging into the biosphere. The probability density function of lifetime expectancy is computed throughout the host rock by solving the backward-in-time solute transport adjoint equation subject to a properly posed set of boundary conditions. It can then be used to define optimal repository locations. The risk associated with selected sites can be evaluated by simulating an appropriate contaminant release history. The utility of the method is illustrated by means of analytical and numerical examples, which focus on the effect of fracture networks on the uncertainty of evaluated lifetime expectancy.
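
    For orientation, the backward-in-time equation referred to above can be written compactly. This is a sketch of the standard adjoint advection-dispersion form used for lifetime expectancy in the groundwater literature, with assumed notation (g the lifetime-expectancy density, theta the porosity, q the Darcy flux, D the dispersion tensor); the paper's exact formulation and boundary conditions should be taken from the original text:

        % g(x, tau): probability density of lifetime expectancy tau at location x
        \theta \, \frac{\partial g}{\partial \tau}
            = \mathbf{q} \cdot \nabla g
            + \nabla \cdot \big( \theta \mathbf{D} \, \nabla g \big),
        \qquad
        \int_0^{\infty} g(\mathbf{x}, \tau) \, d\tau = 1,

    solved backward in time with absorbing conditions on the discharge (biosphere) boundaries.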

  12. Potential impact of Andrassy bentonite microbial diversity in the long-term performance of a deep nuclear waste repository

    NASA Astrophysics Data System (ADS)

    Tadza, M. Y. Mohd; Tadza, M. A. Mohd; Bag, R.; Harith, N. S. H.

    2018-01-01

    Copper and steel canisters and a bentonite buffer are normally foreseen as the primary containment components of a deep nuclear waste repository. The distribution of microbes in subsurface environments has been found to be extensive, and microbes may directly or indirectly influence waste canister corrosion and the mobility of radionuclides. Understanding clay and microbial interactions with radionuclides will be useful in predicting microbial impacts on the performance of waste repositories. The present work characterizes the culture-dependent microbial diversity of Andrassy bentonite recovered from the Tawau clay deposits. The evaluation of microbial populations shows the presence of a number of cultivable microbes (e.g. Staphylococcus, Micrococcus, Achromobacter, Bacillus, Paecilomyces, Trichoderma, and Fusarium). Additionally, a pigmented yeast strain, Rhodotorula mucilaginosa, was also recovered from the formation. Both Bacillus and Rhodotorula mucilaginosa have high tolerance towards uranium radiation and toxicity. The presence of Rhodotorula mucilaginosa in Andrassy bentonite might be able to change the speciation of radionuclides (e.g. uranium) in a future deep repository. However, Fe(III)-reducing microbes such as Bacillus, also found in the formation, raise the concern that they could promote corrosion of copper and steel canisters and affect the overall performance of the containment system.

  13. Resurrecting Legacy Code Using Ontosoft Knowledge-Sharing and Digital Object Management to Revitalize and Reproduce Software for Groundwater Management Research

    NASA Astrophysics Data System (ADS)

    Kwon, N.; Gentle, J.; Pierce, S. A.

    2015-12-01

    Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a genuine problem in the 21st century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be made openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase for the Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a Java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful both for practical applications, as a teaching tool and case study for groundwater management, and for informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded OntoSoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance for the original case study to serve as a live demonstration of the decision support tool, assuring the original version is usable, and 2) an open version of the codebase, executable installation files, and developer guide available via an open repository, assuring the source for the application is accessible with version control and potential for new branch developments. Finally, metadata about the software has been completed within the OntoSoft portal to provide descriptive curation, make GWDSS searchable, and complete documentation of the scientific software lifecycle.

  14. Rolling Deck to Repository (R2R): Supporting Global Data Access Through the Ocean Data Interoperability Platform (ODIP)

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Stocks, K.; Chandler, C. L.; Smith, S. R.; Miller, S. P.; Maffei, A. R.; Glaves, H. M.; Carbotte, S. M.

    2013-12-01

    The U.S. National Science Foundation supports a fleet of academic research vessels operating throughout the world's oceans. In addition to supporting the mission-specific goals of each expedition, these vessels routinely deploy a suite of underway environmental sensors, operating like mobile observatories. Recognizing that the data from these instruments have value beyond each cruise, NSF funded R2R in 2009 to ensure that these data are routinely captured, cataloged and described, and submitted to the appropriate national repository for long-term public access. In 2013, R2R joined the Ocean Data Interoperability Platform (ODIP; http://odip.org/). The goal of ODIP is to remove barriers to the effective sharing of data across scientific domains and international boundaries, by providing a forum to harmonize diverse regional systems. To advance this goal, ODIP organizes international workshops to foster the development of common standards and develop prototypes to evaluate and test potential standards and interoperability solutions. ODIP includes major organizations engaged in ocean data stewardship in the EU, US, and Australia, supported by the International Oceanographic Data and Information Exchange (IODE). Within the broad scope of ODIP, R2R focuses on contributions in 4 key areas: ● Implement a 'Linked Open Data' approach to disseminate data and documentation, using existing World Wide Web Consortium (W3C) specifications and machine-readable formats. Exposing content as Linked Open Data will provide a simple mechanism for ODIP collaborators to browse and compare data sets among repositories. ● Map key vocabularies used by R2R to their European and Australian counterparts. The existing heterogeneity among terms inhibits data discoverability, as a user searching on the term with which s/he is familiar may not find all data of interest. Mapping key terms across the different ODIP partners, relying on the backbone thesaurus provided by the NERC Vocabulary Server (http://vocab.nerc.ac.uk/), is a first step towards wider data discoverability. ● Upgrade existing R2R ISO metadata records to be compatible with the new SeaDataNet II Cruise Summary Report (CSR) profile, and publish the records in a standards-compliant Web portal, built on the GeoNetwork open-source package. ● Develop the future workforce. R2R is enlisting and exposing a group of five students to new informatics technologies and international collaboration. Students are undertaking a coordinated series of projects in 2013 and 2014 at each of the R2R partner institutions, combined with travel to selected meetings where they will engage in the ODIP Workshop process; present results; and exchange ideas with working scientists and software developers in Europe and Australia. Students work closely with staff at the R2R partner institutions, in projects that build on the R2R-ODIP technical work components described above.
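
    The 'Linked Open Data' approach in the first bullet above can be illustrated with a few triples. The sketch below uses Python's rdflib with the W3C DCAT and WGS84 vocabularies; the cruise URI, position, and property choices are hypothetical examples, not R2R's published data model.

        from rdflib import Graph, Literal, Namespace, RDF, URIRef

        DCAT = Namespace("http://www.w3.org/ns/dcat#")
        DCT = Namespace("http://purl.org/dc/terms/")
        GEO = Namespace("http://www.w3.org/2003/01/geo/wgs84_pos#")

        g = Graph()
        cruise = URIRef("https://example.org/cruise/RR1310")  # invented URI
        g.add((cruise, RDF.type, DCAT.Dataset))
        g.add((cruise, DCT.title, Literal("R/V Roger Revelle cruise RR1310")))
        g.add((cruise, GEO.lat, Literal(32.87)))              # invented position
        g.add((cruise, GEO.long, Literal(-117.25)))
        print(g.serialize(format="turtle"))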

  15. Accessing and Integrating Data and Knowledge for Biomedical Research

    PubMed Central

    Burgun, A.; Bodenreider, O.

    2008-01-01

    Summary Objectives To review the issues that have arisen with the advent of translational research in terms of integration of data and knowledge, and to survey current efforts to address these issues. Methods Using examples from the biomedical literature, we identified new trends in biomedical research and their impact on bioinformatics. We analyzed the requirements for effective knowledge repositories and studied issues in the integration of biomedical knowledge. Results New diagnostic and therapeutic approaches based on gene expression patterns have brought about new issues in the statistical analysis of data, and new workflows are needed to support translational research. Interoperable data repositories based on standard annotations, infrastructures and services are needed to support the pooling and meta-analysis of data, as well as their comparison to earlier experiments. High-quality, integrated ontologies and knowledge bases serve as a source of prior knowledge used in combination with traditional data mining techniques and contribute to the development of more effective data analysis strategies. Conclusion As biomedical research evolves from traditional clinical and biological investigations towards omics sciences and translational research, specific needs have emerged, including integrating data collected in research studies with patient clinical data, linking omics knowledge with medical knowledge, modeling the molecular basis of diseases, and developing tools that support in-depth analysis of research data. As such, translational research illustrates the need to bridge the gap between bioinformatics and medical informatics, and opens new avenues for biomedical informatics research. PMID:18660883

  16. Is the National Guideline Clearinghouse a Trustworthy Source of Practice Guidelines for Child and Youth Anxiety and Depression?

    PubMed

    Duda, Stephanie; Fahim, Christine; Szatmari, Peter; Bennett, Kathryn

    2017-07-01

    Innovative strategies that facilitate the use of high-quality practice guidelines (PGs) are needed. Accordingly, repositories designed to simplify access to PGs have been proposed as a critical component of the network of linked interventions needed to drive increased PG implementation. The National Guideline Clearinghouse (NGC) is a free, international online repository. We investigated whether it is a trustworthy source of child and youth anxiety and depression PGs. English-language PGs published between January 2009 and February 2016 relevant to anxiety or depression in children and adolescents (≤ 18 years of age) were eligible. Two trained raters assessed PG quality using the Appraisal of Guidelines for Research and Evaluation (AGREE II) instrument. Scores on at least three AGREE II domains (stakeholder involvement, rigor of development, and editorial independence) were used to designate PGs as: i) minimum quality (≥ 50%); and ii) high quality (≥ 70%). Eight eligible PGs were identified (depression, n=6; anxiety and depression, n=1; social anxiety disorder, n=1). Four of the eight PGs met minimum quality criteria; three of those four met high quality criteria. At present, NGC users without the time and special skills required to evaluate PG quality may unknowingly choose flawed PGs to guide decisions about child and youth anxiety and depression. The recent NGC decision to explore the inclusion of PG quality profiles based on Institute of Medicine standards provides needed leadership that can strengthen PG repositories, prevent harm and wasted resources, and build PG developer capacity.
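
    For readers unfamiliar with AGREE II scoring, the sketch below computes a scaled domain score and applies the two thresholds used in this study. The 1-7 item scale and the scaled-score formula follow the published AGREE II user manual; the ratings themselves are invented, not the study's data.

        def scaled_domain_score(ratings_by_appraiser):
            """ratings_by_appraiser: one list of 1-7 item ratings per appraiser."""
            n_appraisers = len(ratings_by_appraiser)
            n_items = len(ratings_by_appraiser[0])
            obtained = sum(sum(r) for r in ratings_by_appraiser)
            minimum = 1 * n_items * n_appraisers
            maximum = 7 * n_items * n_appraisers
            return 100 * (obtained - minimum) / (maximum - minimum)

        # Two raters scoring the 8-item "rigour of development" domain:
        score = scaled_domain_score([[6, 5, 6, 7, 5, 6, 6, 5],
                                     [5, 5, 6, 6, 5, 6, 5, 5]])
        label = "high" if score >= 70 else "minimum" if score >= 50 else "below threshold"
        print(f"{score:.1f}% -> {label} quality")   # 76.0% -> high quality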

  17. Development of DKB ETL module in case of data conversion

    NASA Astrophysics Data System (ADS)

    Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.

    2018-05-01

    Modern scientific experiments produce huge volumes of data, requiring new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a substantial amount of additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy nuclear physics (HENP) is a complex issue that requires looking for solutions outside the box. One of the tasks is to integrate metadata from different repositories into a central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. The Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, aims to provide fast and easy access to significant information about LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of particular pipelines arranging data flow from data sources to the main DKB storage. The data transformation process, represented by a single pipeline, can be considered as a number of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of DKB.
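
    A minimal sketch of the pipeline-of-modules pattern described above: each transformation step is a self-contained callable, and a pipeline is just their sequence. The step names and record fields are invented for illustration and are not DKB's actual modules.

        def aggregate(record):
            """Gather metadata from a raw source payload."""
            record["n_files"] = len(record.pop("files", []))
            return record

        def to_model(record):
            """Map source fields onto the unified metadata model."""
            return {"dataset": record["name"], "files": record["n_files"]}

        def run_pipeline(records, steps, storage):
            """Push each record through the steps, then into storage."""
            for rec in records:
                for step in steps:
                    rec = step(rec)
                storage.append(rec)

        storage = []
        run_pipeline([{"name": "data15_13TeV.X", "files": ["a", "b"]}],
                     [aggregate, to_model], storage)
        print(storage)   # [{'dataset': 'data15_13TeV.X', 'files': 2}]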

  18. Generic Argillite/Shale Disposal Reference Case

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Liange; Colon, Carlos Jové; Bianchi, Marco

    Radioactive waste disposal in a deep subsurface repository hosted in clay/shale/argillite is a subject of widespread interest given the desirable isolation properties, geochemically reduced conditions, and widespread geologic occurrence of this rock type (Hansen 2010; Bianchi et al. 2013). Bianchi et al. (2013) provides a description of diffusion in a clay-hosted repository based on single-phase flow and full saturation using parametric data from documented studies in Europe (e.g., ANDRA 2005). The predominance of diffusive transport and sorption phenomena in this clay media are key attributes to impede radionuclide mobility, making clay rock formations target sites for disposal of high-level radioactive waste. The reports by Hansen et al. (2010) and those from numerous studies in clay-hosted underground research laboratories (URLs) in Belgium, France and Switzerland outline the extensive scientific knowledge obtained to assess long-term clay/shale/argillite repository isolation performance of nuclear waste. In the past several years under the UFDC, various kinds of models have been developed for the argillite repository to demonstrate the model capability, understand the spatial and temporal alteration of the repository, and evaluate different scenarios. These models include the coupled Thermal-Hydrological-Mechanical (THM) and Thermal-Hydrological-Mechanical-Chemical (THMC) models (e.g. Liu et al. 2013; Rutqvist et al. 2014a; Zheng et al. 2014a) that focus on THMC processes in the Engineered Barrier System (EBS) bentonite and argillite host rock, the large-scale hydrogeologic model (Bianchi et al. 2014) that investigates the hydraulic connection between an emplacement drift and surrounding hydrogeological units, and Disposal Systems Evaluation Framework (DSEF) models (Greenberg et al. 2013) that evaluate thermal evolution in the host rock approximated as a thermal conduction process to facilitate the analysis of design options. However, the assumptions and the properties (parameters) used in these models are different, which not only makes inter-model comparisons difficult, but also compromises the applicability of the lessons learned from one model to another. The establishment of a reference case would therefore be helpful to set up a baseline for model development. A generic salt repository reference case was developed in Freeze et al. (2013), and the generic argillite repository reference case is presented in this report. The definition of a reference case requires the characterization of the waste inventory, waste form, waste package, repository layout, EBS backfill, host rock, and biosphere. This report mainly documents the processes in EBS bentonite and host rock that are potentially important for performance assessment and the properties that are needed to describe these processes, with brief descriptions of other components such as waste inventory, waste form, waste package, repository layout, aquifer, and biosphere. A thorough description of the generic argillite repository reference case will be given in Jové Colon et al. (2014).

  19. Moderate-temperature zeolitic alteration in a cooling pyroclastic deposit

    USGS Publications Warehouse

    Levy, S.S.; O'Neil, J.R.

    1989-01-01

    The locally zeolitized Topopah Spring Member of the Paintbrush Tuff (13 Myr), Yucca Mountain, Nevada, U.S.A., is part of a thick sequence of zeolitized pyroclastic units. Most of the zeolitized units are nonwelded tuffs that were altered during low-temperature diagenesis, but the distribution and textural setting of zeolite (heulandite-clinoptilolite) and smectite in the densely welded Topopah Spring tuff suggest that these hydrous minerals formed while the tuff was still cooling after pyroclastic emplacement and welding. The hydrous minerals are concentrated within a transition zone between devitrified tuff in the central part of the unit and underlying vitrophyre. Movement of liquid and convected heat along fractures from the devitrified tuff to the vitrophyre caused local devitrification and hydrous mineral crystallization. Oxygen isotope geothermometry of cogenetic quartz confirms the nondiagenetic, moderate-temperature origin of the hydrous minerals at temperatures of approximately 40-100 °C, assuming a meteoric water source. The Topopah Spring tuff is under consideration for emplacement of a high-level nuclear waste repository. The natural rock alteration of the cooling pyroclastic deposit may be a good natural analog for repository-induced hydrothermal alteration. As a result of repository thermal loading, temperatures in the Topopah Spring vitrophyre may rise sufficiently to duplicate the inferred temperatures of natural zeolitic alteration. Heated water moving downward from the repository into the vitrophyre may contribute to new zeolitic alteration. © 1989.

  20. Deep Boreholes Seals Subjected to High P,T conditions - Proposed Experimental Studies

    NASA Astrophysics Data System (ADS)

    Caporuscio, F.

    2015-12-01

    Deep borehole experimental work will constrain the P,T conditions that "seal" materials will experience in deep borehole crystalline-rock repositories. The rocks of interest to this study include mafic (amphibolite) and silicic (granitic gneiss) end members. The experiments will systematically add components to capture discrete changes in both water and EBS component chemistries. Experiments in the system wall rock-clay-concrete-groundwater will evaluate interactions among components, including mineral phase stability, metal corrosion rates and thermal limits. Based on engineered barrier studies, experimental investigations will move forward with three focuses. First, evaluation of interaction between "seal" materials and repository wall rock (crystalline) under fluid-saturated conditions over long-term (i.e., six-month) experiments, which reproduce the thermal pulse event of a repository. Second, experiments to determine the stability of zeolite minerals (analcime-wairakite solid solution) under repository conditions. Both sets of experiments are critically important for understanding the mineral paragenesis (zeolite and/or clay transformations) associated with "seals" in contact with wall rock at elevated temperatures. Third, mineral growth at the metal interface is a principal control on the survivability (i.e., corrosion) of waste canisters in a repository. The objective of this planned experimental work is to evaluate physico-chemical processes for 'seal' components and materials relevant to deep borehole disposal. These evaluations will encompass multi-laboratory efforts for the development of seals concepts and the application of Thermal-Mechanical-Chemical (TMC) modeling to assess barrier material interactions with subsurface fluids and other barrier materials, their stability at high temperatures, and the implications of these processes for the evaluation of thermal limits.

  1. An Investigation of Generic Structures of Pakistani Doctoral Thesis Acknowledgements

    ERIC Educational Resources Information Center

    Rofess, Sakander; Mahmood, Muhammad Asim

    2015-01-01

    This paper investigates Pakistani doctoral thesis acknowledgements from a genre analysis perspective. A corpus of 235 PhD thesis acknowledgements written in English was taken from Pakistani doctoral theses collected from eight different disciplines. The HEC Research Repository of Pakistan was used as a data source. The theses written by Pakistani…

  2. US EPA's SPECIATE 4.4 Database: Development and Uses

    EPA Science Inventory

    SPECIATE is the U.S. Environmental Protection Agency’s (EPA) repository of volatile organic gas and particulate matter (PM) speciation profiles of air pollution sources. EPA released SPECIATE 4.4 in early 2014 and, in total, the SPECIATE 4.4 database includes 5,728 PM, volatile o...

  3. 36 CFR 1290.3 - Sources of assassination records and additional records and information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... service with a government agency, office, or entity; (f) Persons, including individuals and corporations... Government; (b) Agencies, offices, and entities of the executive, legislative, and judicial branches of state and local governments; (c) Record repositories and archives of Federal, state, and local governments...

  4. The Amistad Research Center: Documenting the African American Experience.

    ERIC Educational Resources Information Center

    Chepesiuk, Ron

    1993-01-01

    Describes the Amistad Research Center housed at Tulane University which is a repository of primary documents on African-American history. Topics addressed include the development and growth of the collection; inclusion of the American Missionary Association archives; sources of support; civil rights; and collecting for the future. (LRW)

  5. AN OPEN-SOURCE COMMUNITY WEB SITE TO SUPPORT GROUND-WATER MODEL TESTING

    EPA Science Inventory

    A community wiki wiki web site has been created as a resource to support ground-water model development and testing. The Groundwater Gourmet wiki is a repository for user supplied analytical and numerical recipes, how-to's, and examples. Members are encouraged to submit analyti...

  6. OER Use in Intermediate Language Instruction: A Case Study

    ERIC Educational Resources Information Center

    Godwin-Jones, Robert

    2017-01-01

    This paper reports on a case study in the experimental use of Open Educational Resources (OERs) in intermediate level language instruction. The resources come from three sources: the instructor, the students, and open content repositories. The objective of this action research project was to provide student-centered learning materials, enhance…

  7. Intelligent resource discovery using ontology-based resource profiles

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Dan; Kelly, Sean; Crichton, Jerry; Tran, Thuy

    2004-01-01

    Successful resource discovery across heterogeneous repositories is strongly dependent on the semantic and syntactic homogeneity of the associated resource descriptions. Ideally, resource descriptions are easily extracted from pre-existing standardized sources, expressed using standard syntactic and semantic structures, and managed and accessed within a distributed, flexible, and scalable software framework.

  8. Resistance to phytophthora and graft compatibility with persian walnut among seedlings of chinese wingnut from different sources

    USDA-ARS?s Scientific Manuscript database

    Seedlings from seven open-pollinated selections of Chinese wingnut (Pterocarya stenoptera) (WN) representing collections of the USDA-ARS National Clonal Germplasm Repository at Davis CA and the University of California at Davis were evaluated as rootstocks for resistance to Phytophthora cinnamomi an...

  9. Tank 241-AY-102 Leak Assessment Supporting Documentation: Miscellaneous Reports, Letters, Memoranda, And Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engeman, J. K.; Girardot, C. L.; Harlow, D. G.

    2012-12-20

    This report contains reference materials cited in RPP-ASMT-53793, Tank 241-AY-102 Leak Assessment Report, that were obtained from the National Archives Federal Records Repository in Seattle, Washington, or from other sources including the Hanford Site's Integrated Data Management System database (IDMS).

  10. DISTRIBUTION OF PESTICIDES AND POLYCYCLIC AROMATIC HYDROCARBONS IN HOUSE DUST AS A FUNCTION OF PARTICLE SIZE

    EPA Science Inventory

    House dust is a repository for environmental pollutants that may accumulate indoors from both internal and external sources over long periods of time. Dust and tracked-in soil accumulate most efficiently in carpets, and the pollutants associated with them may present an exposure...

  11. Practical management of heterogeneous neuroimaging metadata by global neuroimaging data repositories

    PubMed Central

    Neu, Scott C.; Crawford, Karen L.; Toga, Arthur W.

    2012-01-01

    Rapidly evolving neuroimaging techniques are producing unprecedented quantities of digital data at the same time that many research studies are evolving into global, multi-disciplinary collaborations between geographically distributed scientists. While networked computers have made it almost trivial to transmit data across long distances, collecting and analyzing this data requires extensive metadata if the data is to be maximally shared. Though it is typically straightforward to encode text and numerical values into files and send content between different locations, it is often difficult to attach context and implicit assumptions to the content. As the number of and geographic separation between data contributors grows to national and global scales, the heterogeneity of the collected metadata increases and conformance to a single standardization becomes implausible. Neuroimaging data repositories must then not only accumulate data but must also consolidate disparate metadata into an integrated view. In this article, using specific examples from our experiences, we demonstrate how standardization alone cannot achieve full integration of neuroimaging data from multiple heterogeneous sources and why a fundamental change in the architecture of neuroimaging data repositories is needed instead. PMID:22470336
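
    The consolidation step described above can be pictured as per-source adapters that map each contributor's metadata fields onto one integrated view. A minimal sketch, with sites, field names, and units all invented for illustration rather than drawn from the article:

      # Hypothetical example of consolidating heterogeneous site metadata
      # into one integrated view; field names and units are invented.

      def from_site_a(rec):
          # Site A reports age in years and modality in lowercase.
          return {"subject_id": rec["subj"], "age_years": rec["age"],
                  "modality": rec["scan_type"].upper()}

      def from_site_b(rec):
          # Site B reports age in months and modality pre-capitalized.
          return {"subject_id": rec["participant"],
                  "age_years": rec["age_months"] / 12.0,
                  "modality": rec["modality"]}

      adapters = {"site_a": from_site_a, "site_b": from_site_b}

      def consolidate(records):
          """Normalize per-source records into one integrated list."""
          return [adapters[source](rec) for source, rec in records]

      records = [("site_a", {"subj": "S01", "age": 34, "scan_type": "mri"}),
                 ("site_b", {"participant": "P07", "age_months": 420,
                             "modality": "MRI"})]
      print(consolidate(records))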

  12. Practical management of heterogeneous neuroimaging metadata by global neuroimaging data repositories.

    PubMed

    Neu, Scott C; Crawford, Karen L; Toga, Arthur W

    2012-01-01

    Rapidly evolving neuroimaging techniques are producing unprecedented quantities of digital data at the same time that many research studies are evolving into global, multi-disciplinary collaborations between geographically distributed scientists. While networked computers have made it almost trivial to transmit data across long distances, collecting and analyzing this data requires extensive metadata if the data is to be maximally shared. Though it is typically straightforward to encode text and numerical values into files and send content between different locations, it is often difficult to attach context and implicit assumptions to the content. As the number of and geographic separation between data contributors grows to national and global scales, the heterogeneity of the collected metadata increases and conformance to a single standardization becomes implausible. Neuroimaging data repositories must then not only accumulate data but must also consolidate disparate metadata into an integrated view. In this article, using specific examples from our experiences, we demonstrate how standardization alone cannot achieve full integration of neuroimaging data from multiple heterogeneous sources and why a fundamental change in the architecture of neuroimaging data repositories is needed instead.

  13. Bayesian approach to transforming public gene expression repositories into disease diagnosis databases.

    PubMed

    Huang, Haiyan; Liu, Chun-Chi; Zhou, Xianghong Jasmine

    2010-04-13

    The rapid accumulation of gene expression data has offered unprecedented opportunities to study human diseases. The National Center for Biotechnology Information Gene Expression Omnibus is currently the largest database that systematically documents the genome-wide molecular basis of diseases. However, thus far, this resource has been far from fully utilized. This paper describes the first study to transform public gene expression repositories into an automated disease diagnosis database. Particularly, we have developed a systematic framework, including a two-stage Bayesian learning approach, to achieve the diagnosis of one or multiple diseases for a query expression profile along a hierarchical disease taxonomy. Our approach, including standardizing cross-platform gene expression data and heterogeneous disease annotations, allows analyzing both sources of information in a unified probabilistic system. A high level of overall diagnostic accuracy was shown by cross validation. It was also demonstrated that the power of our method can increase significantly with the continued growth of public gene expression repositories. Finally, we showed how our disease diagnosis system can be used to characterize complex phenotypes and to construct a disease-drug connectivity map.
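
    The hierarchical part of the approach can be illustrated with a toy descent of a disease taxonomy, scoring the query profile against a Gaussian class model at each level and keeping the best child. The taxonomy, expression values, and parameters below are invented for illustration and are not the authors' trained two-stage model:

      import math

      # Toy disease taxonomy with per-node gene-expression means (invented).
      taxonomy = {
          "disease": ["cancer", "infection"],
          "cancer": ["leukemia", "lymphoma"],
          "infection": [], "leukemia": [], "lymphoma": [],
      }
      class_means = {
          "cancer": [2.0, 0.5], "infection": [-1.0, 1.5],
          "leukemia": [2.5, 0.0], "lymphoma": [1.5, 1.0],
      }

      def log_likelihood(profile, mean, sigma=1.0):
          """Gaussian log-likelihood of a query profile under a class model."""
          return sum(-0.5 * ((x - m) / sigma) ** 2
                     - math.log(sigma * math.sqrt(2 * math.pi))
                     for x, m in zip(profile, mean))

      def diagnose(profile, node="disease"):
          """Descend the taxonomy, keeping the best-scoring child per level."""
          path = []
          while taxonomy.get(node):
              node = max(taxonomy[node],
                         key=lambda c: log_likelihood(profile, class_means[c]))
              path.append(node)
          return path

      print(diagnose([2.2, 0.3]))  # e.g. ['cancer', 'leukemia']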

  14. 10 CFR 63.304 - Reasonable expectation.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Postclosure Public Health and Environmental Standards § 63.304... uncertainties in making long-term projections of the performance of the Yucca Mountain disposal system; (3) Does... the full range of defensible and reasonable parameter distributions rather than only upon extreme...

  15. 10 CFR 63.304 - Reasonable expectation.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Postclosure Public Health and Environmental Standards § 63.304... uncertainties in making long-term projections of the performance of the Yucca Mountain disposal system; (3) Does... the full range of defensible and reasonable parameter distributions rather than only upon extreme...

  16. 10 CFR 63.304 - Reasonable expectation.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Postclosure Public Health and Environmental Standards § 63.304... uncertainties in making long-term projections of the performance of the Yucca Mountain disposal system; (3) Does... the full range of defensible and reasonable parameter distributions rather than only upon extreme...

  17. 10 CFR 63.304 - Reasonable expectation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Postclosure Public Health and Environmental Standards § 63.304... uncertainties in making long-term projections of the performance of the Yucca Mountain disposal system; (3) Does... the full range of defensible and reasonable parameter distributions rather than only upon extreme...

  18. 10 CFR 63.304 - Reasonable expectation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Postclosure Public Health and Environmental Standards § 63.304... uncertainties in making long-term projections of the performance of the Yucca Mountain disposal system; (3) Does... the full range of defensible and reasonable parameter distributions rather than only upon extreme...

  19. The SpeX Prism Library for Ultracool Dwarfs: A Resource for Stellar, Exoplanet and Galactic Science and Student-Led Research

    NASA Astrophysics Data System (ADS)

    Burgasser, Adam

    The NASA Infrared Telescope Facility's (IRTF) SpeX spectrograph has been an essential tool in the discovery and characterization of ultracool dwarf (UCD) stars, brown dwarfs and exoplanets. Over ten years of SpeX data have been collected on these sources, and a repository of low-resolution (R ~ 100) SpeX prism spectra has been maintained by the PI at the SpeX Prism Spectral Libraries website since 2008. As the largest existing collection of NIR UCD spectra, this repository has facilitated a broad range of investigations in UCD, exoplanet, Galactic and extragalactic science, contributing to over 100 publications in the past 6 years. However, this repository remains highly incomplete, has not been uniformly calibrated, lacks sufficient contextual data for observations and sources, and most importantly provides no data visualization or analysis tools for the user. To fully realize the scientific potential of these data for community research, we propose a two-year program to (1) calibrate and expand existing repository and archival data, and make it virtual-observatory compliant; (2) serve the data through a searchable web archive with basic visualization tools; and (3) develop and distribute an open-source, Python-based analysis toolkit for users to analyze the data. These resources will be generated through an innovative, student-centered research model, with undergraduate and graduate students building and validating the analysis tools through carefully designed coding challenges and research validation activities. The resulting data archive, the SpeX Prism Library, will be a legacy resource for IRTF and SpeX, and will facilitate numerous investigations using current and future NASA capabilities. These include deep/wide surveys of UCDs to measure Galactic structure and chemical evolution, and probe UCD populations in satellite galaxies (e.g., JWST, WFIRST); characterization of directly imaged exoplanet spectra (e.g., FINESSE), and development of low-temperature theoretical models of UCD and exoplanet atmospheres. Our program will also serve to validate the IRTF data archive during its development, by reducing and disseminating non-proprietary archival observations of UCDs to the community. The proposed program directly addresses NASA's strategic goals of exploring the origin and evolution of stars and planets that make up our universe, and discovering and studying planets around other stars.
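
    As a flavor of the kind of analysis such a toolkit would support, the sketch below measures a simple flux-ratio index from a two-column spectrum; the file name, band windows, and index definition are all hypothetical, not part of the proposed toolkit's actual interface:

      import numpy as np

      def spectral_index(wave, flux, band_num, band_den):
          """Median-flux ratio between two wavelength windows (microns)."""
          num = np.median(flux[(wave >= band_num[0]) & (wave <= band_num[1])])
          den = np.median(flux[(wave >= band_den[0]) & (wave <= band_den[1])])
          return num / den

      # Hypothetical two-column ASCII spectrum: wavelength (micron), flux
      wave, flux = np.loadtxt("ucd_spectrum.txt", unpack=True)

      # Invented H2O-band index: depressed flux near 1.4 micron vs the J-band peak
      print(spectral_index(wave, flux, (1.38, 1.42), (1.26, 1.30)))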

  20. The microbiology of the Maqarin site, Jordan -- A natural analogue for cementitious radioactive waste repositories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, J.M.; Coombs, P.; Gardner, S.J.

    1995-12-31

    The Maqarin site, Jordan, is being studied as a natural analogue of a cementitious radioactive waste repository. Its microbiology has been studied, and diverse microbial populations capable of tolerating alkaline pH were detected at all sampling localities. Dissolved organic carbon was identified as the potentially most important reductant and sulfate as the main oxidant, both supplying energy for microbial life. Calculations of upper limits on microbial numbers were made with a microbiology code (MGSE) using existing information, but the results are overestimates when compared with field observations. This indicates that the model is very conservative and that more information on, for example, carbon sources is required.

  1. Data Preservation, Information Preservation, and Lifecyle of Information Management at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Khayat, Mo; Kempler, Steve; Deshong, Barbara; Johnson, James; Gerasimov, Irina; Esfandiari, Ed; Berganski, Michael; Wei, Jennifer

    2014-01-01

    Data lifecycle management awareness is common today; planners are more likely to consider lifecycle issues at mission start. NASA remote sensing missions are typically subject to life cycle management plans of the Distributed Active Archive Center (DAAC), and NASA invests in these national centers for the long-term safeguarding and benefit of future generations. As stewards of older missions, it is incumbent upon us to ensure that a comprehensive enough set of information is being preserved to prevent the risk of information loss. This risk is greater when the original data experts have moved on or are no longer available. Items like documentation related to processing algorithms, pre-flight calibration data, or input-output configuration parameters used in product generation are examples of digital artifacts that are sometimes not fully preserved. This is the grey area of information preservation; the importance of these items is not always clear and requires careful consideration. Missing important metadata about intermediate steps used to derive a product could lead to serious challenges in the reproducibility of results or conclusions. Organizations are rapidly recognizing that the focus of life-cycle preservation needs to be enlarged from strictly raw data to the more encompassing arena of information lifecycle management. By understanding what constitutes information, and the complexities involved, we are better equipped to deliver longer-lasting value from the original data and the knowledge (information) derived from them. The NASA Earth Science Data Preservation Content Specification is an attempt to define the content necessary for long-term preservation. It requires a new lifecycle infrastructure approach along with content repositories that accommodate artifacts other than just raw data. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) set up an open-source Preservation System capable of long-term archiving of digital content to augment its raw data holdings. This repository is being used for such missions as HIRDLS, UARS, TOMS, and OMI, among others. We will provide a status of this implementation; report on challenges and lessons learned; and detail our plans for future evolution to include other missions and services.

  2. Data Preservation, Information Preservation, and life-cycle of information management at NASA GES DISC

    NASA Astrophysics Data System (ADS)

    Khayat, M. G.; Deshong, B.; Esfandiari, A. E.; Gerasimov, I. V.; Johnson, J. E.; Kempler, S. J.; Wei, J. C.

    2014-12-01

    Data lifecycle management awareness is common today; planners are more likely to consider lifecycle issues at mission start. NASA remote sensing missions are typically subject to life cycle management plans of the Distributed Active Archive Center (DAAC), and NASA invests in these national centers for the long-term safeguarding and benefit of future generations. As stewards of older missions, it is incumbent upon us to ensure that a comprehensive enough set of information is being preserved to prevent the risk of "information loss". This risk is greater when the original data experts have moved on or are no longer available. Items like documentation related to processing algorithms, pre-flight calibration data, or input/output configuration parameters used in product generation are examples of digital artifacts that are sometimes not fully preserved. This is the grey area of "information preservation"; the importance of these items is not always clear and requires careful consideration. Missing important "metadata" about intermediate steps used to derive a product could lead to serious challenges in the reproducibility of results or conclusions. Organizations are rapidly recognizing that the focus of life-cycle preservation needs to be enlarged from strictly raw data to the more encompassing arena of "information lifecycle management". By understanding what constitutes information, and the complexities involved, we are better equipped to deliver longer-lasting value from the original data and the knowledge (information) derived from them. The "NASA Earth Science Data Preservation Content Specification" is an attempt to define the content necessary for long-term preservation. It requires a new lifecycle infrastructure approach along with content repositories that accommodate artifacts other than just raw data. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) set up an open-source Preservation System capable of long-term archiving of digital content to augment its raw data holdings. This repository is being used for such missions as HIRDLS, UARS, TOMS, and OMI, among others. We will provide a status of this implementation; report on challenges and lessons learned; and detail our plans for future evolution to include other missions and services.

  3. Nature Research journals reproducibility policies and initiatives in the Earth sciences

    NASA Astrophysics Data System (ADS)

    VanDecar, J. C.

    2016-12-01

    The Nature Research journals strongly support the long-term endeavour by funders, institutions, researchers and publishers toward increasing the reliability and reproducibility of published research. In the Earth, space and environmental sciences this mainly takes the form of ensuring that underlying data and methods in each manuscript are made as transparent and accessible as possible. Supporting data must be made available to editors and peer reviewers at the time of submission for the purposes of evaluating each manuscript. But the preferred way to share data sets is via public repositories. When appropriate community repositories are available, we strongly encourage authors to deposit their data prior to publication. We also now require that a statement be included in each manuscript, under the heading "Data availability", indicating whether and how the data can be accessed, including any restrictions to access. To allow authors to describe their experimental design and methods in as much detail as necessary, the Nature Research journals have effectively abolished space restrictions on online methods sections. To further increase transparency, we also encourage authors to provide tables of the data behind graphs and figures as Source Data. This builds on our established data-deposition policy for specific experiments and large data sets. The Source Data is made available directly from the figure legend, for easy access. We also require that details of geological samples and palaeontological specimens include clear provenance information to ensure full transparency of the research methods. Palaeontological and type specimens must be deposited in a recognised museum or collection to permit free access by other researchers in perpetuity. Finally, authors must make available upon request, to editors and reviewers, any previously unreported custom computer code used to generate results that are reported in the paper and central to its main claims. For all studies using custom code that is deemed central to the conclusions, a statement must be included, under the heading "Code availability", indicating whether and how the code can be accessed, including any restrictions to access.

  4. Long-term Science Data Curation Using a Digital Object Model and Open-Source Frameworks

    NASA Astrophysics Data System (ADS)

    Pan, J.; Lenhardt, W.; Wilson, B. E.; Palanisamy, G.; Cook, R. B.

    2010-12-01

    Scientific digital content, including Earth Science observations and model output, has become more heterogeneous in format and more distributed across the Internet. In addition, data and metadata are becoming necessarily linked internally and externally on the Web. As a result, such content has become more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, it is increasingly harder to deliver relevant metadata and data processing lineage information along with the actual content consistently. Readme files, data quality information, production provenance, and other descriptive metadata are often separated at the storage level as well as in the data search and retrieval interfaces available to a user. Critical archival metadata, such as audit trails and integrity checks, are often even more difficult for users to access, if they exist at all. We investigate the use of several open-source software frameworks to address these challenges. We use the Fedora Commons Framework and its digital object abstraction as the repository, the Drupal CMS as the user interface, and the Islandora module as the connector from Drupal to the Fedora repository. With the digital object model, metadata describing data content and provenance can be associated with the data in a formal manner, as can external references and other arbitrary auxiliary information. Changes to an object are formally audited, and digital contents are versioned and have checksums automatically computed. Further, relationships among objects are formally expressed with RDF triples. Data replication, recovery, and metadata export are supported with standard protocols, such as OAI-PMH. We provide a tentative comparative analysis of the chosen software stack against the Open Archival Information System (OAIS) reference model, along with our initial results with the existing terrestrial ecology data collections at NASA's ORNL Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC).
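
    The RDF expression of inter-object relationships mentioned above can be sketched with the rdflib package, here using PROV-style terms; the object identifiers and predicate choices are illustrative assumptions, not the repository's actual schema:

      from rdflib import Graph, Literal, Namespace

      # Hypothetical object identifiers and PROV-style vocabulary.
      EX = Namespace("http://example.org/objects/")
      PROV = Namespace("http://www.w3.org/ns/prov#")

      g = Graph()
      g.bind("prov", PROV)

      # Assert that a derived product came from a source object.
      g.add((EX["biomass_v2"], PROV.wasDerivedFrom, EX["biomass_v1"]))
      g.add((EX["biomass_v2"], PROV.generatedAtTime, Literal("2010-06-01")))

      # Walk back to the origin of the derived product.
      for ancestor in g.objects(EX["biomass_v2"], PROV.wasDerivedFrom):
          print(ancestor)

      print(g.serialize(format="turtle"))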

  5. ACToR: Aggregated Computational Toxicology Resource (T) ...

    EPA Pesticide Factsheets

    The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has 3 main goals: (1) to serve as a repository of public toxicology information on chemicals of interest to the EPA, and in particular to be a central source for the testing data on all chemicals regulated by all EPA programs; (2) to be a source of in vivo training data sets for building in vitro to in vivo computational models; and (3) to serve as a central source of chemical structure and identity information for the ToxCast and Tox21 programs. There are 4 main databases, all linked through a common set of chemical information and a common structure linking chemicals to assay data: the public ACToR system (available at http://actor.epa.gov); the ToxMiner database holding ToxCast and Tox21 data, along with results from statistical analyses on these data; the Tox21 chemical repository, which manages the ordering and sample tracking process for the larger Tox21 project; and the public version of ToxRefDB. The public ACToR system contains information on ~500K compounds with toxicology, exposure and chemical property information from >400 public sources. The web site is visited by ~1,000 unique users per month and generates ~1,000 page requests per day on average. The databases are built on open source technology, which has allowed us to export them to a number of col
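
    The hub-and-spoke linkage described above (one chemical record joined to many assay rows through a shared identifier) can be sketched with sqlite3; the table layout and column names are invented for illustration and are not ACToR's actual schema:

      import sqlite3

      # Invented miniature schema: chemicals joined to assay results by id.
      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE chemicals (chemical_id INTEGER PRIMARY KEY,
                              casrn TEXT, name TEXT);
      CREATE TABLE assay_results (chemical_id INTEGER, assay TEXT,
                                  value REAL, source TEXT);
      INSERT INTO chemicals VALUES (1, '80-05-7', 'Bisphenol A');
      INSERT INTO assay_results VALUES (1, 'ER_agonist_AC50_uM', 0.9,
                                        'toxcast_demo');
      """)

      # One chemical record, many linked assay rows, as in the common structure.
      rows = con.execute("""
          SELECT c.name, a.assay, a.value, a.source
          FROM chemicals c JOIN assay_results a USING (chemical_id)
      """).fetchall()
      print(rows)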

  6. Inert matrix fuel neutronic, thermal-hydraulic, and transient behavior in a light water reactor

    NASA Astrophysics Data System (ADS)

    Carmack, W. J.; Todosow, M.; Meyer, M. K.; Pasamehmetoglu, K. O.

    2006-06-01

    Currently, commercial power reactors in the United States operate on a once-through or open cycle, with the spent nuclear fuel eventually destined for long-term storage in a geologic repository. Since the fissile and transuranic (TRU) elements in the spent nuclear fuel present a proliferation risk, limit the repository capacity, and are the major contributors to the long-term toxicity and dose from the repository, methods and systems are needed to reduce the amount of TRU that will eventually require long-term storage. An option to achieve a reduction in the amount, and modify the isotopic composition of TRU requiring geological disposal is 'burning' the TRU in commercial light water reactors (LWRs) and/or fast reactors. Fuel forms under consideration for TRU destruction in light water reactors (LWRs) include mixed-oxide (MOX), advanced mixed-oxide, and inert matrix fuels. Fertile-free inert matrix fuel (IMF) has been proposed for use in many forms and studied by several researchers. IMF offers several advantages relative to MOX, principally it provides a means for reducing the TRU in the fuel cycle by burning the fissile isotopes and transmuting the minor actinides while producing no new TRU elements from fertile isotopes. This paper will present and discuss the results of a four-bundle, neutronic, thermal-hydraulic, and transient analyses of proposed inert matrix materials in comparison with the results of similar analyses for reference UOX fuel bundles. The results of this work are to be used for screening purposes to identify the general feasibility of utilizing specific inert matrix fuel compositions in existing and future light water reactors. Compositions identified as feasible using the results of these analyses still require further detailed neutronic, thermal-hydraulic, and transient analysis study coupled with rigorous experimental testing and qualification.

  7. Office of Science and Technology & International Year End Report - 2005

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bodvarsson, G.S.

    2005-10-27

    Source Term, Materials Performance, Radionuclide Getters, Natural Barriers, and Advanced Technologies, a brief introduction in each section describes the overall organization and goals of each program area. All of these areas have great potential for improving our understanding of the safety performance of the proposed Yucca Mountain repository, as processes within these areas are generally very conservatively represented in the Total System Performance Assessment. In addition, some of the technology thrust areas in particular may enhance system efficiency and reduce risk to workers. Thus, rather modest effort in the S&T Program could lead to large savings in the lifetime repository total cost and significantly enhanced understanding of the behavior of the proposed Yucca Mountain repository, without safety being compromised, and in some instances being enhanced. An overall strength of the S&T Program is the significant amount of integration that has already been achieved after two years of research. As an example (illustrated in Figure 1), our understanding of the behavior of the total waste isolation system has been enhanced through integration of the Source Term, Materials Performance, and Natural Barriers Thrust areas. All three thrust areas contribute to the integration of different processes in the in-drift environment. These processes include seepage into the drift, dust accumulation on the waste package, brine formation and precipitation on the waste package, mass transfer through the fuel cladding, changes in the seepage-water chemical composition, and transport of released radionuclides through the invert and natural barriers. During FY2005, each of our program areas assembled a team of external experts to conduct an independent review of their respective projects, research directions, and emphasis. In addition, the S&T Program as a whole was independently reviewed by the S&T Programmatic Evaluation Panel. As a result of these reviews, adjustments to the S&T Program will be implemented in FY2006 to ensure that the Program is properly aligned with OCRWM's priorities. Also during FY2005, several programmatic documents were published, including the Science and Technology Program Strategic Plan, the Science and Technology Program Management Plan, and the Science and Technology Program Plan. These and other communication products are available on the OCRWM web site under the Science and Technology section (http://www.ocrwm.doe.gov/osti/index.shtml).

  8. Developing the Tools for Geologic Repository Monitoring - Andra's Monitoring R and D Program - 12045

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buschaert, S.; Lesoille, S.; Bertrand, J.

    2012-07-01

    The French Safety Guide recommends that Andra develop a monitoring program to be implemented during repository construction and conducted until (and possibly after) closure, in order to confirm expected behavior and enhance knowledge of relevant processes. To achieve this, Andra has developed an overall monitoring strategy and identified specific technical objectives to inform disposal process management on evolutions relevant to both the long-term safety and reversible, pre-closure management of the repository. Andra has launched an ambitious R and D program to ensure that reliable, durable, metrologically qualified and tested monitoring systems will be available at the time of repository construction in order to respond to monitoring objectives. After four years of a specific R and D program, first observations are described and recommendations are proposed. The results derived from 4 years of Andra's R and D program allow three main observations to be shared. First, while other industries also invest in monitoring equipment, their obvious emphasis will always be on their specific requirements and needs, thus often only providing a partial match with repository requirements. Examples can be found for all available sensors, which are generally not resistant to radiation. Second, the very close scrutiny anticipated for the geologic disposal process is likely to place an unprecedented emphasis on the quality of monitoring results. It therefore seems important to emphasize specific developments with an aim at providing metrologically qualified systems. Third, adapting existing technology to specific repository needs, and providing adequate proof of its worth, is a lengthy process. In conclusion, it therefore seems prudent to plan ahead and to invest wisely in the adequate development of those monitoring tools that will likely be needed in the repository to respond to the implementers' and regulators' requirements, including those agreed and developed to respond to potential stakeholder expectations. (authors)

  9. Data Publishing and Sharing Via the THREDDS Data Repository

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Caron, J.; Davis, E.; Baltzer, T.

    2007-12-01

    The terms "Team Science" and "Networked Science" have been coined to describe a virtual organization of researchers tied via some intellectual challenge, but often located in different organizations and locations. A critical component to these endeavors is publishing and sharing of content, including scientific data. Imagine pointing your web browser to a web page that interactively lets you upload data and metadata to a repository residing on a remote server, which can then be accessed by others in a secure fasion via the web. While any content can be added to this repository, it is designed particularly for storing and sharing scientific data and metadata. Server support includes uploading of data files that can subsequently be subsetted, aggregrated, and served in NetCDF or other scientific data formats. Metadata can be associated with the data and interactively edited. The THREDDS Data Repository (TDR) is a server that provides client initiated, on demand, location transparent storage for data of any type that can then be served by the THREDDS Data Server (TDS). The TDR provides functionality to: * securely store and "own" data files and associated metadata * upload files via HTTP and gridftp * upload a collection of data as single file * modify and restructure repository contents * incorporate metadata provided by the user * generate additional metadata programmatically * edit individual metadata elements The TDR can exist separately from a TDS, serving content via HTTP. Also, it can work in conjunction with the TDS, which includes functionality to provide: * access to data in a variety of formats via -- OPeNDAP -- OGC Web Coverage Service (for gridded datasets) -- bulk HTTP file transfer * a NetCDF view of datasets in NetCDF, OPeNDAP, HDF-5, GRIB, and NEXRAD formats * serving of very large volume datasets, such as NEXRAD radar * aggregation into virtual datasets * subsetting via OPeNDAP and NetCDF Subsetting services This talk will discuss TDR/TDS capabilities as well as how users can install this software to create their own repositories.

  10. Brine and Gas Flow Patterns Between Excavated Areas and Disturbed Rock Zone in the 1996 Performance Assessment for the Waste Isolation Pilot Plant for a Single Drilling Intrusion that Penetrates Repository and Castile Brine Reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ECONOMY,KATHLEEN M.; HELTON,JON CRAIG; VAUGHN,PALMER

    1999-10-01

    The Waste Isolation Pilot Plant (WIPP), which is located in southeastern New Mexico, is being developed for the geologic disposal of transuranic (TRU) waste by the U.S. Department of Energy (DOE). Waste disposal will take place in panels excavated in a bedded salt formation approximately 2000 ft (610 m) below the land surface. The BRAGFLO computer program, which solves a system of nonlinear partial differential equations for two-phase flow, was used to investigate brine and gas flow patterns in the vicinity of the repository for the 1996 WIPP performance assessment (PA). The present study examines the implications of modeling assumptions used in conjunction with BRAGFLO in the 1996 WIPP PA that affect brine and gas flow patterns involving two waste regions in the repository (i.e., a single waste panel and the remaining nine waste panels), a disturbed rock zone (DRZ) that lies just above and below these two regions, and a borehole that penetrates the single waste panel and a brine pocket below this panel. The two waste regions are separated by a panel closure. The following insights were obtained from this study. First, the impediment to flow between the two waste regions provided by the panel closure model is reduced by the permeable and areally extensive nature of the DRZ adopted in the 1996 WIPP PA, which results in the DRZ becoming an effective pathway for gas and brine movement around the panel closures and thus between the two waste regions. Brine and gas flow between the two waste regions via the DRZ causes pressures in the two to equilibrate rapidly, with the result that processes in the intruded waste panel are not isolated from the rest of the repository. Second, the connection between intruded and unintruded waste panels provided by the DRZ increases the time required for repository pressures to equilibrate with the overlying and/or underlying units subsequent to a drilling intrusion. Third, the large and areally extensive DRZ void volume is a significant source of brine to the repository; this brine is consumed in the corrosion of iron and thus contributes to increased repository pressures. Fourth, the DRZ itself lowers repository pressures by providing storage for gas and access to additional gas storage in areas of the repository. Fifth, given the pathway that the DRZ provides for gas and brine to flow around the panel closures, isolation of the waste panels by the panel closures was not essential to compliance with the U.S. Environmental Protection Agency's regulations in the 1996 WIPP PA.

  11. Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2015-12-01

    Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and environmental stewardship by 2030. These efforts suggest the need for a holistic approach towards improving and implementing strategies, policies, and practices that will ensure long-term sustainability and interoperability of scientific data repositories and networks across multiple scientific domains.

  12. Review breathes life into Yucca site

    NASA Astrophysics Data System (ADS)

    Gwynne, Peter

    2014-12-01

    A review by the Nuclear Regulatory Commission (NRC) on the long-term safety of the Yucca Mountain repository for nuclear waste in Nevada has improved the chances that it may go ahead, despite being mothballed by the administration of US president Barack Obama back in 2010.

  13. LingoBee--Crowd-Sourced Mobile Language Learning in the Cloud

    ERIC Educational Resources Information Center

    Petersen, Sobah Abbas; Procter-Legg, Emma; Cacchione, Annamaria

    2013-01-01

    This paper describes three case studies, where language learners were invited to use "LingoBee" as a means of supporting their language learning. LingoBee is a mobile app that provides user-generated language content in a cloud-based shared repository. Assuming that today's students are mobile savvy and "Digital Natives" able…

  14. LingoBee: Engaging Mobile Language Learners through Crowd-Sourcing

    ERIC Educational Resources Information Center

    Petersen, Sobah Abbas; Procter-Legg, Emma; Cacchione, Annamaria

    2014-01-01

    This paper describes three case studies, where language learners were invited to use "LingoBee" as a means of supporting their language learning. LingoBee is a mobile app that provides user-generated language content in a cloud-based shared repository. Assuming that today's students are mobile savvy and "Digital Natives" able…

  15. A Culture Model as Mediator and Repository Source for Innovation

    ERIC Educational Resources Information Center

    Mohammadisadr, Mohammad; Siadat, Seyed Ali; Azizollah, Arbabisarjou; Ebrahim, Ebrahimitabas

    2012-01-01

    As innovation has become one of the most important competitive advantages, academic and practitioner interest in the matter has increased. But the question of why many organizations fail in their efforts to be innovative remains unanswered. Among the many factors influencing the innovation capacity of an organization;…

  16. Genomewide association study of ionomic traits on diverse soybean populations from germplasm collections

    USDA-ARS?s Scientific Manuscript database

    The elemental content of a soybean seed is determined by both genetic and environmental factors and is an important component of its nutritional value. The elemental content is stable, making the samples stored in germplasm repositories an intriguing source of experimental material. To test the ef...

  17. Community based research for an urban recreation application of benefits-based management

    Treesearch

    William T. Borrie; Joseph W. Roggenbuck

    1995-01-01

    Benefits-based management is an approach to park and recreation management that focuses on the positive outcomes of engaging in recreational experiences. Because one class of possible benefits accrues to the community, a philosophical framework is discussed suggesting that communities are themselves the primary sources, generators, and repositories of knowledge....

  18. National History Day in Arizona 2003 Theme Supplement: Rights and Responsibilities.

    ERIC Educational Resources Information Center

    Goen, Wendi, Comp.; Devine, Laurie, Comp.

    Arizona's archives, libraries, and museums contain a wealth of source material that can be applied to local, regional, and national topics pertaining to the 2003 National History Day theme, rights and responsibilities. Repositories from around the state share ideas and resources that are available to National History Day students. So that…

  19. 10 CFR 2.103 - Action on applications for byproduct, source, special nuclear material, facility and operator...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... nuclear material, facility and operator licenses. (a) If the Director, Office of Nuclear Reactor... repository operations area under parts 60 or 63 of this chapter, the Director, Office of Nuclear Reactor Regulation, Director, Office of New Reactors, Director, Office of Nuclear Material Safety and Safeguards, or...

  20. 10 CFR 2.103 - Action on applications for byproduct, source, special nuclear material, facility and operator...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... nuclear material, facility and operator licenses. (a) If the Director, Office of Nuclear Reactor... repository operations area under parts 60 or 63 of this chapter, the Director, Office of Nuclear Reactor Regulation, Director, Office of New Reactors, Director, Office of Nuclear Material Safety and Safeguards, or...

  1. Finite-Length Line Source Superposition Model (FLLSSM)

    NASA Astrophysics Data System (ADS)

    1980-03-01

    A linearized thermal conduction model was developed to economically determine media temperatures in geologic repositories for nuclear wastes. Individual canisters containing either high-level waste or spent fuel assemblies were represented as finite-length line sources in a continuous medium. The combined effects of multiple canisters in a representative storage pattern were established at selected points of interest by superposition of the temperature rises calculated for each canister. The methodology is outlined, and the computer code FLLSSM, which performs the required numerical integrations and superposition operations, is described.
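
    The superposition scheme can be sketched from the classical finite line-source conduction solution, obtained by integrating the continuous point-source kernel (q/(4*pi*k*r)) * erfc(r/sqrt(4*alpha*t)) along each canister axis and summing contributions over canisters. The kernel is the standard constant-release conduction result; the geometry and material numbers below are invented and are not FLLSSM's actual inputs:

      import numpy as np
      from scipy.integrate import quad
      from scipy.special import erfc

      def line_source_dT(point, canister_xy, z0, length, q_per_m, k, alpha, t):
          """Temperature rise at `point` from one finite-length line source.

          Integrates the continuous point-source solution
          dT = q/(4*pi*k*r) * erfc(r / sqrt(4*alpha*t)) along the canister axis.
          """
          px, py, pz = point
          cx, cy = canister_xy

          def kernel(z):
              r = np.sqrt((px - cx) ** 2 + (py - cy) ** 2 + (pz - z) ** 2)
              return erfc(r / np.sqrt(4.0 * alpha * t)) / r

          val, _ = quad(kernel, z0, z0 + length)
          return q_per_m * val / (4.0 * np.pi * k)

      # Invented storage pattern: 3 canisters on 10 m spacing, mid-plane point
      canisters = [(-10.0, 0.0), (0.0, 0.0), (10.0, 0.0)]
      dT = sum(line_source_dT((2.0, 3.0, 0.0), c, z0=-2.0, length=4.0,
                              q_per_m=250.0, k=2.5, alpha=1.1e-6,
                              t=3.15e8)  # ~10 years
               for c in canisters)
      print(f"superposed temperature rise: {dT:.1f} K")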

  2. Preservation of Earth Science Data History with Digital Content Repository Technology

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Pan, J.; Shrestha, B.; Cook, R. B.

    2011-12-01

    An increasing need for derived and on-demand data products in Earth Science research makes digital content more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, this increasing need presents additional challenges in managing data processing history information and delivering such information to end users. For example, the North American Carbon Program (NACP) Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) chose a modified SYNMAP land cover data set as one of the input driver data sets for participating terrestrial biospheric models. The global 1 km resolution SYNMAP land cover data was created by harmonizing 3 remote sensing-based land cover products: GLCC, GLC2000, and the MODIS land cover product. The original SYNMAP land cover data was aggregated into half and quarter degree resolution. It was then enhanced with more detailed grassland and cropland types. Currently, there is no effective mechanism to convey this data processing information to different modeling teams so they can determine whether a data product meets their needs; the process still relies heavily on offline human interaction. The NASA-sponsored ORNL DAAC has leveraged contemporary digital object repository technology to promote the representation, management, and delivery of data processing history and provenance information. Within a digital object repository, different data products are managed as objects, with metadata as attributes and content delivery and management services as dissemination methods. Derivation relationships among data products can be semantically referenced between digital objects. Within the repository, data users can easily track a derived data product back to its origin, explore metadata and documents about each intermediate data product, and discover processing details involved in each derivation step. Coupled with the Drupal Web Content Management System, the digital repository interface was enhanced to provide an intuitive graphic representation of the data processing history. Each data product is also associated with a formal metadata record in the FGDC standard, and the main fields of the FGDC record are indexed for search and displayed as attributes of the data product. These features enable data users to better understand and consume a data product. The representation of data processing history in a digital repository can further promote long-term data preservation. Lineage information is a major factor in making digital data understandable and usable long into the future. Derivation references can be set up between digital objects not only within a single digital repository, but also across multiple distributed digital repositories. Along with emerging identification mechanisms, such as the Digital Object Identifier (DOI), a flexible distributed digital repository network can be set up to better preserve digital content. In this presentation, we describe how digital content repository technology can be used to manage, preserve, and deliver digital data processing history information in the Earth Science research domain, with selected data archived at the ORNL DAAC and the Model and Synthesis Thematic Data Center (MAST-DC) as testing targets.
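
    The SYNMAP lineage above amounts to a short derivation chain that a repository can expose to users. A minimal sketch of walking such a chain; the record structure is invented for illustration, with product names and steps paraphrased loosely from the paragraph:

      # Invented derivation records keyed by product; parents mirror the
      # SYNMAP story sketched in the abstract above.
      derivations = {
          "synmap_1km": {"parents": ["GLCC", "GLC2000", "MODIS_landcover"],
                         "step": "harmonized three land cover products"},
          "synmap_half_deg": {"parents": ["synmap_1km"],
                              "step": "aggregated to half-degree resolution"},
          "mstmip_driver": {"parents": ["synmap_half_deg"],
                            "step": "enhanced with grassland/cropland types"},
      }

      def lineage(product):
          """Yield (product, step) pairs back to the original sources."""
          stack = [product]
          while stack:
              node = stack.pop()
              rec = derivations.get(node)
              if rec:
                  yield node, rec["step"]
                  stack.extend(rec["parents"])

      for node, step in lineage("mstmip_driver"):
          print(f"{node}: {step}")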

  3. 10 CFR Appendix I to Part 960 - NRC and EPA Requirements for Postclosure Repository Performance

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... multiplied by 10. The basis for Table 2 is an upper limit on long term risks of 1,000 health effects over 10... areas; and design of disposal systems to allow future recovery of wastes. The guidelines will be revised...

  4. 10 CFR Appendix I to Part 960 - NRC and EPA Requirements for Postclosure Repository Performance

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... multiplied by 10. The basis for Table 2 is an upper limit on long term risks of 1,000 health effects over 10... areas; and design of disposal systems to allow future recovery of wastes. The guidelines will be revised...

  5. 10 CFR Appendix I to Part 960 - NRC and EPA Requirements for Postclosure Repository Performance

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... multiplied by 10. The basis for Table 2 is an upper limit on long term risks of 1,000 health effects over 10... areas; and design of disposal systems to allow future recovery of wastes. The guidelines will be revised...

  6. Use of Groundwater Lifetime Expectancy for the Performance Assessment of Deep Geologic Radioactive Waste Repositories.

    NASA Astrophysics Data System (ADS)

    Cornaton, F.; Park, Y.; Normani, S.; Sudicky, E.; Sykes, J.

    2005-12-01

    Long-term solutions for the disposal of toxic wastes usually involve isolation of the wastes in a deep subsurface geologic environment. In the case of spent nuclear fuel, the safety of the host repository depends on two main barriers: the engineered barrier and the natural geological barrier. If radionuclide leakage occurs from the engineered barrier, the geological medium represents the ultimate barrier that is relied upon to ensure safety. Consequently, an evaluation of radionuclide travel times from the repository to the biosphere is critically important in a performance assessment analysis. In this study, we develop a travel time framework based on the concept of groundwater lifetime expectancy as a safety indicator. Lifetime expectancy characterizes the time radionuclides will spend in the subsurface after their release from the repository and prior to discharging into the biosphere. The probability density function of lifetime expectancy is computed throughout the host rock by solving the backward-in-time solute transport equation subject to a properly posed set of boundary conditions. It can then be used to define optimal repository locations. In a second step, the risk associated with selected sites can be evaluated by simulating an appropriate contaminant release history. The proposed methodology is applied in the context of a typical Canadian Shield environment. Based on a statistically generated three-dimensional network of fracture zones embedded in the granitic host rock, the sensitivity and uncertainty of lifetime expectancy with respect to the hydraulic and dispersive properties of the fracture network, including the impact of conditioning via their surface expressions, are computed in order to demonstrate the utility of the methodology.
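
    Schematically, the lifetime-expectancy density g(x, tau) solved for above obeys an adjoint (backward-in-time) form of the advection-dispersion equation, in which the advective term changes sign relative to the forward equation. A LaTeX sketch of the pair, following the usual adjoint construction rather than quoting the paper's exact formulation (phi is porosity, q the Darcy flux, D the dispersion tensor):

      % Forward transport of concentration c, and the adjoint (backward-in-time)
      % equation for the lifetime-expectancy density g; the sign of the
      % advective term is reversed in the adjoint problem.
      \begin{align*}
        \phi\,\frac{\partial c}{\partial t} &= -\nabla\cdot(\mathbf{q}\,c)
            + \nabla\cdot(\phi\,\mathbf{D}\,\nabla c), \\
        \phi\,\frac{\partial g}{\partial \tau} &= \mathbf{q}\cdot\nabla g
            + \nabla\cdot(\phi\,\mathbf{D}\,\nabla g),
      \end{align*}
      % with g driven by a unit impulse on the discharge (biosphere) boundary.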

  7. Reconsolidated Salt as a Geotechnical Barrier

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Francis D.; Gadbury, Casey

    Salt as a geologic medium has several attributes favorable to long-term isolation of waste placed in mined openings. Salt formations are largely impermeable and induced fractures heal as stress returns to equilibrium. Permanent isolation also depends upon the ability to construct geotechnical barriers that achieve nearly the same high-performance characteristics attributed to the native salt formation. Salt repository seal concepts often include elements of reconstituted granular salt. As a specific case in point, the Waste Isolation Pilot Plant recently received regulatory approval to change the disposal panel closure design from an engineered barrier constructed of a salt-based concrete to one that employs simple run-of-mine salt and temporary bulkheads for isolation from ventilation. The Waste Isolation Pilot Plant is a radioactive waste disposal repository for defense-related transuranic elements mined from the Permian evaporite salt beds in southeast New Mexico. Its approved shaft seal design incorporates barrier components comprising salt-based concrete, bentonite, and substantial depths of crushed salt compacted to enhance reconsolidation. This paper will focus on crushed salt behavior when applied as drift closures to isolate disposal rooms during operations. Scientific aspects of salt reconsolidation have been studied extensively. The technical basis for geotechnical barrier performance has been strengthened by recent experimental findings and analogue comparisons. The panel closure change was accompanied by recognition that granular salt will return to a physical state similar to the halite surrounding it. Use of run-of-mine salt ensures physical and chemical compatibility with the repository environment and simplifies ongoing disposal operations. Our current knowledge and expected outcome of research can be assimilated with lessons learned to put forward designs and operational concepts for the next generation of salt repositories. Mined salt repositories have the potential to isolate permanently vast inventories of radioactive and hazardous wastes.

  8. Digital Rocks Portal: a Sustainable Platform for Data Management, Analysis and Remote Visualization of Volumetric Images of Porous Media

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Ketcham, R. A.

    2017-12-01

    Nanometer- to centimeter-scale imaging such as (focused ion beam) scanning electron microscopy, magnetic resonance imaging, and X-ray (micro)tomography has since the 1990s introduced 2D and 3D datasets of rock microstructure that allow investigation of nonlinear flow and mechanical phenomena on length scales that are otherwise inaccessible to laboratory measurements. The numerical approaches that use such images produce various upscaled parameters required by subsurface flow and deformation simulators. All of this has revolutionized our knowledge about grain-scale phenomena. However, a lack of data-sharing infrastructure among research groups makes it difficult to integrate different length scales. We have developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (https://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of engineering or geosciences researchers not necessarily trained in computer science or data analysis. The Digital Rocks Portal (NSF EarthCube Grant 1541008) is the first repository for imaged porous microstructure data. It is implemented within the reliable, 24/7-maintained high-performance computing infrastructure supported by the Texas Advanced Computing Center (University of Texas at Austin). Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative. We show how the data can be documented, referenced in publications via digital object identifiers, visualized, searched for, and linked to other repositories. We show the recently implemented integration of remote parallel visualization, bulk upload for large datasets, as well as a preliminary flow simulation workflow with the pore structures currently stored in the repository. We discuss the issues of collecting correct metadata, data discoverability, and repository sustainability.

  9. Geoscience Digital Data Resource and Repository Service

    NASA Astrophysics Data System (ADS)

    Mayernik, M. S.; Schuster, D.; Hou, C. Y.

    2017-12-01

    The open availability and wide accessibility of digital data sets are becoming the norm for geoscience research. The National Science Foundation (NSF) instituted a data management planning requirement in 2011, and many scientific publishers, including the American Geophysical Union and the American Meteorological Society, have recently implemented data archiving and citation policies. Many disciplinary data facilities exist around the community to provide a high level of technical support and expertise for archiving data of particular kinds, or for particular projects. However, a significant number of geoscience research projects do not have the same level of data facility support due to a combination of several factors, including the research project's size, funding limitations, or a topic scope that has no clear facility match. These projects typically manage data on an ad hoc basis, with limited long-term management and preservation procedures. The NSF is supporting a workshop, to be held in summer 2018, to develop requirements and expectations for a Geoscience Digital Data Resource and Repository Service (GeoDaRRS). The vision for the prospective GeoDaRRS is to complement existing NSF-funded data facilities by providing: 1) data management planning support resources for the general community, and 2) repository services for researchers who have data that do not fit in any existing repository. Functionally, the GeoDaRRS would support NSF-funded researchers in meeting data archiving requirements set by the NSF and publishers for the geosciences, thereby ensuring the availability of digital data for use and reuse in scientific research going forward. This presentation will engage the AGU community in discussion about the needs for a new digital data repository service, specifically to inform the forthcoming GeoDaRRS workshop.

  10. Long-term safety assessment of trench-type surface repository at Chernobyl, Ukraine - computer model and comparison with results from simplified models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haverkamp, B.; Krone, J.; Shybetskyi, I.

    2013-07-01

    The Radioactive Waste Disposal Facility (RWDF) Buryakovka was constructed in 1986 as part of the intervention measures after the accident at Chernobyl NPP (ChNPP). Today, the surface repository for solid low- and intermediate-level waste (LILW) is still being operated, but its maximum capacity is nearly reached. Long-existing plans for increasing the capacity of the facility are to be implemented in the framework of the European Commission INSC Programme (Instrument for Nuclear Safety Co-operation). Within the first phase of this project, DBE Technology GmbH prepared a safety analysis report of the facility in its current state (SAR) and a preliminary safety analysis report (PSAR) for a future extended facility based on the planned enlargement. In addition to a detailed mathematical model, simplified models were developed to verify the results of the former and enhance confidence in them. Comparison of the results shows that, depending on the boundary conditions, simplifications such as modeling the multi-trench repository as one generic trench may have very limited influence on the overall results compared with the general uncertainties associated with such long-term calculations. Besides their value in verifying more complex models, which is important for increasing confidence in the overall results, such simplified models also make it possible to carry out time-consuming analyses, such as probabilistic calculations or detailed sensitivity studies, in an economic manner. (authors)
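
    A toy illustration (not the DBE Technology model) of why a lumped "generic trench" can approximate a multi-trench repository: with first-order leaching and decay, the summed per-trench release rate and the release rate of a single trench holding the summed inventory stay close when the leach rates are similar. All parameter values below are invented.

    # Compare a multi-trench release model with a lumped single-trench
    # model, assuming first-order leaching and radioactive decay. Units
    # and numbers are arbitrary assumptions for illustration only.
    import math

    def release_rate(inventory0, leach_rate, decay_rate, t):
        """Activity release rate at time t (first-order leach + decay)."""
        return leach_rate * inventory0 * math.exp(-(leach_rate + decay_rate) * t)

    trenches = [(4.0e12, 1e-3), (2.5e12, 2e-3), (6.0e12, 1e-3)]  # (Bq, 1/yr)
    decay = 2.3e-2   # 1/yr, assumed decay constant
    t = 100.0        # years

    multi = sum(release_rate(a0, k, decay, t) for a0, k in trenches)
    # Lumped model: one trench, summed inventory, inventory-weighted leach rate.
    a_tot = sum(a0 for a0, _ in trenches)
    k_avg = sum(a0 * k for a0, k in trenches) / a_tot
    lumped = release_rate(a_tot, k_avg, decay, t)
    print(f"multi-trench: {multi:.3e} Bq/yr, lumped: {lumped:.3e} Bq/yr")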

  11. Canada's Deep Geological Repository for Used Nuclear Fuel - Geo-scientific Site Evaluation Process - 13117

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blyth, Alec; Ben Belfadhel, Mahrez; Hirschorn, Sarah

    2013-07-01

    The Nuclear Waste Management Organization (NWMO) is responsible for implementing Adaptive Phased Management (APM), the approach selected by the Government of Canada for long-term management of used nuclear fuel generated by Canadian nuclear reactors. The ultimate objective of APM is the centralized containment and isolation of Canada's used nuclear fuel in a Deep Geological Repository in a suitable rock formation at a depth of approximately 500 meters (m) (1,640 feet [ft]). In May 2010, the NWMO published a nine-step site selection process that serves as the road map to decision-making on the location for the deep geological repository. The safety and appropriateness of any potential site will be assessed against a number of factors, both technical and social in nature. The selected site will be one that can be demonstrated to be able to safely contain and isolate used nuclear fuel, protecting humans and the environment over the very long term. The geo-scientific suitability of potential candidate sites will be assessed in a stepwise manner following a progressive and thorough site evaluation process that addresses a series of geo-scientific factors revolving around five safety functions. The geo-scientific site evaluation process includes: Initial Screenings; Preliminary Assessments; and Detailed Site Evaluations. As of November 2012, 22 communities have entered the site selection process (three in northern Saskatchewan and 18 in northwestern and southwestern Ontario). (authors)

  12. Performance assessments of nuclear waste repositories--A dialogue on their value and limitations

    USGS Publications Warehouse

    Ewing, Rodney C.; Tierney, Martin S.; Konikow, Leonard F.; Rechard, Rob P.

    1999-01-01

    Performance Assessment (PA) is the use of mathematical models to simulate the long-term behavior of engineered and geologic barriers in a nuclear waste repository; methods of uncertainty analysis are used to assess the effects of parametric and conceptual uncertainties associated with the model system upon the uncertainty in outcomes of the simulation. PA is required by the U.S. Environmental Protection Agency as part of its certification process for geologic repositories for nuclear waste. This paper is a dialogue to explore the value and limitations of PA. Two “skeptics” acknowledge the utility of PA in organizing the scientific investigations that are necessary for confident siting and licensing of a repository; however, they maintain that the PA process, at least as it is currently implemented, is an essentially unscientific process with shortcomings that may provide results of limited use in evaluating actual effects on public health and safety. Conceptual uncertainties in a PA analysis can be so great that results can be confidently applied only over short time ranges, the antithesis of the purpose behind long-term, geologic disposal. Two “proponents” of PA agree that performance assessment is unscientific, but only in the sense that PA is an engineering analysis that uses existing scientific knowledge to support public policy decisions, rather than an investigation intended to increase fundamental knowledge of nature; PA has different goals and constraints than a typical scientific study. The “proponents” describe an ideal, six-step process for conducting generalized PA, here called probabilistic systems analysis (PSA); they note that virtually all scientific content of a PA is introduced during the model-building steps of a PSA, and they contend that a PA based on simple but scientifically acceptable mathematical models can provide useful and objective input to regulatory decision makers. The value of the results of any PA must lie between these two views and will depend on the level of knowledge of the site, the degree to which models capture actual physical and chemical processes, the time over which extrapolations are made, and the proper evaluation of health risks attending implementation of the repository. The challenge is in evaluating whether the quality of the PA matches the needs of decision makers charged with protecting the health and safety of the public.
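
    A toy sketch of the parametric-uncertainty step the dialogue describes: sample uncertain parameters, push them through a consequence model, and summarize the spread of outcomes. The model and distributions here are invented for illustration and stand in for the far more elaborate models used in a real PA.

    # Monte Carlo propagation of parametric uncertainty through a
    # deliberately simple, invented consequence model.
    import random, statistics

    def toy_dose_model(leach_rate, velocity, retardation):
        """Stand-in consequence model: faster leaching/flow -> higher dose."""
        travel_time = 5000.0 * retardation / velocity           # years, toy geometry
        return leach_rate * 1e4 * 2.0 ** (-travel_time / 24100.0)  # toy decay term

    random.seed(1)
    doses = []
    for _ in range(10_000):
        doses.append(toy_dose_model(
            leach_rate=random.lognormvariate(-7, 0.5),   # assumed distribution
            velocity=random.uniform(0.1, 10.0),          # m/yr, assumed
            retardation=random.uniform(50, 500)))        # assumed
    print(f"median {statistics.median(doses):.2e}, "
          f"95th pct {sorted(doses)[int(0.95 * len(doses))]:.2e}")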

  13. The chemical behavior of the transuranic elements and the barrier function in natural aquifer systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jewett, J.R.

    1997-09-17

    In a geological repository for long-lived radioactive wastes, such as actinides and certain fission products, most of the stored radionuclides remain immobile in the particular geological formation. If any of these could possibly become mobile, only trace concentrations of a few radionuclides would result. Nevertheless, with an inventory in the repository of many tonnes of transuranic elements, the amounts that could disperse cannot be neglected. A critical assessment of the chemical behavior of these nuclides, especially their migration properties in the aquifer system around the repository site, is mandatory for analysis of the long-term safety. The chemistry required for this includes many geochemical multicomponent reactions that are so far only partially understood and therefore can be quantified only incompletely. A few of these reactions are discussed in this paper based on present knowledge. Where a comprehensive discussion of the subject is impossible because of this lack of information, an attempt is made to emphasize the importance of the predominant geochemical reactions of the transuranic elements in various aquifer systems.

  14. A Fruit of Yucca Mountain: The Remote Waste Package Closure System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin Skinner; Greg Housley; Colleen Shelton-Davis

    2011-11-01

    Was the death of the Yucca Mountain repository the fate of a technical lemon or a political lemon? Without caution, this debate could lure us away from capitalizing on the fruits of the project. In March 2009, Idaho National Laboratory (INL) successfully demonstrated the Waste Package Closure System, a full-scale prototype system for closing waste packages that were to be entombed in the now abandoned Yucca Mountain repository. This article describes the system, which INL designed and built, to weld the closure lids on the waste packages, nondestructively examine the welds using four different techniques, repair the welds if necessary, mitigate crack-initiating stresses in the surfaces of the welds, evacuate and backfill the packages with an inert gas, and perform all of these tasks remotely. As a nation, we now have a proven method for securely sealing nuclear waste packages for long-term storage, regardless of whether or not the future destination for these packages will be an underground repository. Additionally, many of the system's features and concepts may benefit other remote nuclear applications.

  15. Nuclear Waste Facing the Test of Time: The Case of the French Deep Geological Repository Project.

    PubMed

    Poirot-Delpech, Sophie; Raineau, Laurence

    2016-12-01

    The purpose of this article is to consider the socio-anthropological issues raised by the deep geological repository project for high-level, long-lived nuclear waste. It is based on fieldwork at a candidate site for a deep storage project in eastern France, where an underground laboratory has been studying the feasibility of the project since 1999. A project of this nature, based on the possibility of very long containment (hundreds of thousands of years, if not longer), involves a singular form of time. By linking project performance to geology's very long timescale, the project attempts to "jump" in time, focusing on a far distant future without understanding it in terms of generations. Yet generations remain the measure of time on the surface, where the issue of remembering or forgetting the repository comes to the fore. The nuclear waste geological storage project raises questions that neither politicians nor scientists, nor civil society, have ever confronted before. It attempts to address a problem that exists on a very long timescale, one which involves our responsibility toward generations in the far future.

  16. Uranium (VI) solubility in carbonate-free ERDA-6 brine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucchini, Jean-francois; Khaing, Hnin; Reed, Donald T

    2010-01-01

    When present, uranium is usually an element of importance in a nuclear waste repository. In the Waste Isolation Pilot Plant (WIPP), uranium is the most prevalent actinide component by mass, with about 647 metric tons to be placed in the repository. Therefore, the chemistry of uranium, and especially its solubility under WIPP conditions, needs to be well determined. Long-term experiments were performed to measure the solubility of uranium (VI) in carbonate-free ERDA-6 brine, a simulated WIPP brine, at pCH+ values between 8 and 12.5. These data, obtained from the over-saturation approach, were the first repository-relevant data for the actinide (VI) oxidation state. The solubility trends observed pointed towards low uranium solubility in WIPP brines and a lack of amphotericity. At the expected pCH+ in the WIPP (~9.5), measured uranium solubility approached 10^-7 M. The objective of these experiments was to establish a baseline solubility to further investigate the effects of carbonate complexation on uranium solubility in WIPP brines.

  17. Trace element storage capacity of sediments in dead Posidonia oceanica mat from a chronically contaminated marine ecosystem.

    PubMed

    Di Leonardo, Rossella; Mazzola, Antonio; Cundy, Andrew B; Tramati, Cecilia Doriana; Vizzini, Salvatrice

    2017-01-01

    Posidonia oceanica mat is considered a long-term bioindicator of contamination. Storage and sequestration of trace elements and organic carbon (C org ) were assessed in dead P. oceanica mat and bare sediments from a highly polluted coastal marine area (Augusta Bay, central Mediterranean). Sediment elemental composition and sources of organic matter have been altered since the 1950s. Dead P. oceanica mat displayed a greater ability to bury and store trace elements and C org than nearby bare sediments, acting as a long-term contaminant sink over the past 120 yr. Trace elements, probably associated with the mineral fraction, were stabilized and trapped despite die-off of the overlying P. oceanica meadow. Mat deposits registered historic contamination phases well, confirming their role as natural archives for recording trace element trends in marine coastal environments. This sediment typology is enriched with seagrass-derived refractory organic matter, which acts mainly as a diluent of trace elements. Bare sediments showed evidence of inwash of contaminated sediments via reworking; more rapid and irregular sediment accumulation; and, because of the high proportions of labile organic matter, a greater capacity to store trace elements. Through different processes, both sediment typologies represent a repository for chemicals and may pose a risk to the marine ecosystem as a secondary source of contaminants in the case of sediment dredging or erosion. Environ Toxicol Chem 2017;36:49-58. © 2016 SETAC.

  18. The MIMIC Code Repository: enabling reproducibility in critical care research.

    PubMed

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
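
    A minimal sketch of the kind of reproducible extraction the repository enables, assuming a local PostgreSQL build of MIMIC-III under the standard mimiciii schema (as produced by the repository's build scripts); connection credentials are placeholders.

    # Query a local PostgreSQL build of MIMIC-III for ICU stays with a
    # length of stay over two days. Table and column names follow the
    # standard MIMIC-III schema; credentials are placeholders.
    import psycopg2

    conn = psycopg2.connect(dbname="mimic", user="postgres",
                            password="...", host="localhost")
    cur = conn.cursor()
    cur.execute("""
        SELECT i.subject_id, i.icustay_id, i.los
        FROM mimiciii.icustays i
        WHERE i.los > 2
        ORDER BY i.subject_id, i.intime
    """)
    for subject_id, icustay_id, los in cur.fetchmany(5):
        print(subject_id, icustay_id, round(los, 1))
    conn.close()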

  19. Cryopreservation in fish: current status and pathways to quality assurance and quality control in repository development

    PubMed Central

    Torres, Leticia; Hu, E.; Tiersch, Terrence R.

    2017-01-01

    Cryopreservation in aquatic species in general has been constrained to research activities for more than 60 years. Although the need for application and commercialisation pathways has become clear, the lack of comprehensive quality assurance and quality control programs has impeded the progress of the field, delaying the establishment of germplasm repositories and commercial-scale applications. In this review we focus on the opportunities for standardisation in the practices involved in the four main stages of the cryopreservation process: (1) source, housing and conditioning of fish; (2) sample collection and preparation; (3) freezing and cryogenic storage of samples; and (4) egg collection and use of thawed sperm samples. In addition, we introduce some key factors that would assist the transition to commercial-scale, high-throughput application. PMID:26739583

  20. Using GO-WAR for mining cross-ontology weighted association rules.

    PubMed

    Agapito, Giuseppe; Cannataro, Mario; Guzzi, Pietro Hiram; Milano, Marianna

    2015-07-01

    The Gene Ontology (GO) is a structured repository of concepts (GO terms) that are associated to one or more gene products. The process of association is referred to as annotation. The relevance and the specificity of both GO terms and annotations are evaluated by a measure defined as information content (IC). The analysis of annotated data is thus an important challenge for bioinformatics. Among the different approaches to such analysis, the use of association rules (AR) may provide useful knowledge, and it has been used in some applications, e.g. improving the quality of annotations. Nevertheless, classical association-rule algorithms take into account neither the source of an annotation nor its importance, yielding candidate rules with low IC. This paper presents GO-WAR (Gene Ontology-based Weighted Association Rules), a methodology for extracting weighted association rules. GO-WAR can extract association rules with a high level of IC without loss of support and confidence from a dataset of annotated data. A case study applying GO-WAR to publicly available GO annotation datasets demonstrates that our method outperforms current state-of-the-art approaches. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
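
    A short sketch of the information-content measure the method builds on, using one common definition, IC(t) = -log2 p(t), where p(t) is the fraction of all annotations involving term t (or its descendants); the annotation counts are invented, and GO-WAR's exact weighting scheme may differ.

    # Compute a simple information-content score for GO terms from toy
    # annotation counts. Rules whose terms fall below an IC threshold
    # would be pruned as too generic.
    import math

    annotation_counts = {           # term -> number of annotations (toy data)
        "GO:0008150": 10000,        # biological_process (root, very general)
        "GO:0006915": 120,          # apoptotic process
        "GO:0043065": 15,           # a much more specific term
    }
    total = sum(annotation_counts.values())

    def ic(term: str) -> float:
        return -math.log2(annotation_counts[term] / total)

    for term in annotation_counts:
        print(term, f"IC = {ic(term):.2f}")   # rarer terms score higher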

  1. Separated by a common language: awareness of term usage differences between languages and disciplines in biopreparedness.

    PubMed

    Andersson, M Gunnar; Tomuzia, Katharina; Löfström, Charlotta; Appel, Bernd; Bano, Luca; Keremidis, Haralampos; Knutsson, Rickard; Leijon, Mikael; Lövgren, Susanna Ekströmer; De Medici, Dario; Menrath, Andrea; van Rotterdam, Bart J; Wisselink, Henk J; Barker, Gary C

    2013-09-01

    Preparedness for bioterrorism is based on communication between people in organizations who are educated and trained in several disciplines, including law enforcement, health, and science. Various backgrounds, cultures, and vocabularies generate difficulties in understanding and interpreting terms and concepts, which may impair communication. This is especially true in emergency situations, in which the need for clarity and consistency is vital. The EU project AniBioThreat initiated methods and made a rough estimate of the terms and concepts that are crucial for an incident, and a pilot database with key terms and definitions has been constructed. Analysis of collected terms and sources has shown that many of the participating organizations use various international standards in their area of expertise. The same term often represents different concepts in the standards from different sectors, or, alternatively, different terms were used to represent the same or similar concepts. The use of conflicting terminology can be problematic for decision makers and communicators in planning and prevention or when handling an incident. Since the CBRN area has roots in multiple disciplines, each with its own evolving terminology, it may not be realistic to achieve unequivocal communication through a standardized vocabulary and joint definitions for words from common language. We suggest that a communication strategy should include awareness of alternative definitions and ontologies and the ability to talk and write without relying on the implicit knowledge underlying specialized jargon. Consequently, cross-disciplinary communication skills should be part of training of personnel in the CBRN field. In addition, a searchable repository of terms and definitions from relevant organizations and authorities would be a valuable addition to existing glossaries for improving awareness concerning bioterrorism prevention planning.

  2. Separated by a Common Language: Awareness of Term Usage Differences Between Languages and Disciplines in Biopreparedness

    PubMed Central

    Tomuzia, Katharina; Löfström, Charlotta; Appel, Bernd; Bano, Luca; Keremidis, Haralampos; Knutsson, Rickard; Leijon, Mikael; Lövgren, Susanna Ekströmer; De Medici, Dario; Menrath, Andrea; van Rotterdam, Bart J.; Wisselink, Henk J.; Barker, Gary C.

    2013-01-01

    Preparedness for bioterrorism is based on communication between people in organizations who are educated and trained in several disciplines, including law enforcement, health, and science. Various backgrounds, cultures, and vocabularies generate difficulties in understanding and interpreting terms and concepts, which may impair communication. This is especially true in emergency situations, in which the need for clarity and consistency is vital. The EU project AniBioThreat initiated methods and made a rough estimate of the terms and concepts that are crucial for an incident, and a pilot database with key terms and definitions has been constructed. Analysis of collected terms and sources has shown that many of the participating organizations use various international standards in their area of expertise. The same term often represents different concepts in the standards from different sectors, or, alternatively, different terms were used to represent the same or similar concepts. The use of conflicting terminology can be problematic for decision makers and communicators in planning and prevention or when handling an incident. Since the CBRN area has roots in multiple disciplines, each with its own evolving terminology, it may not be realistic to achieve unequivocal communication through a standardized vocabulary and joint definitions for words from common language. We suggest that a communication strategy should include awareness of alternative definitions and ontologies and the ability to talk and write without relying on the implicit knowledge underlying specialized jargon. Consequently, cross-disciplinary communication skills should be part of training of personnel in the CBRN field. In addition, a searchable repository of terms and definitions from relevant organizations and authorities would be a valuable addition to existing glossaries for improving awareness concerning bioterrorism prevention planning. PMID:23971818

  3. Looking for Skeletons in the Data Centre `Cupboard': How Repository Certification Can Help

    NASA Astrophysics Data System (ADS)

    Sorvari, S.; Glaves, H.

    2017-12-01

    There has been a national geoscience repository at the British Geological Survey (or one of its previous incarnations) almost since its inception in 1835. This longevity has resulted in vast amounts of analogue material and, more recently, digital data, some of which has been collected by our scientists but much more of which has been acquired either through various legislative obligations or donated from various sources. However, the role and operation of the UK National Geoscience Data Centre (NGDC) in the 21st Century is very different to that of the past, with new systems and procedures dealing with predominantly digital data. A web-based ingestion portal allows users to submit their data directly to the NGDC, while online services provide discovery and access to data and derived products. Increasingly we are also required to implement an array of standards (e.g. ISO, OGC, W3C), best practices (e.g. FAIR) and legislation (e.g. the EU INSPIRE Directive), whilst at the same time needing to justify our very existence to our funding agency and hosting organisation. External pressures to demonstrate that we can be recognised as a trusted repository by researchers, various funding agencies, publishers and other related entities have forced us to look at how we function, and to benchmark our operations against those of other organisations and current relevant standards such as those laid down by different repository certification processes. Following an assessment of the various options, the WDS/DSA certification process was selected as the most appropriate route for accreditation of the NGDC as a trustworthy repository. It provided a suitable framework for reviewing current systems, procedures and best practices. Undertaking this process allowed us to identify where the NGDC already has robust systems in place and where there were gaps and deficiencies in current practices. The WDS/DSA assessment process also helped to reinforce best practice throughout the NGDC and demonstrated that many of the recognised and required procedures and standards for recognition as a trusted repository were already in place, even if they were not always followed!

  4. Data Sharing in Astrobiology: the Astrobiology Habitable Environments Database (AHED)

    NASA Astrophysics Data System (ADS)

    Bristow, T.; Lafuente Valverde, B.; Keller, R.; Stone, N.; Downs, R. T.; Blake, D. F.; Fonda, M.; Pires, A.

    2016-12-01

    Astrobiology is a multidisciplinary area of scientific research focused on studying the origins of life on Earth and the conditions under which life might have emerged elsewhere in the universe. Understanding complex questions in astrobiology requires integration and analysis of data spanning a range of disciplines including biology, chemistry, geology, astronomy and planetary science. However, the lack of a centralized repository makes it difficult for astrobiology teams to share data and benefit from resultant synergies. Moreover, in recent years federal agencies have begun requiring that the results of any federally funded scientific research be available and useful to the public and the science community. Astrobiology, like any other scientific discipline, needs to respond to these mandates. The Astrobiology Habitable Environments Database (AHED) is a central, high-quality, long-term searchable repository designed to help the community by promoting the integration and sharing of all the data generated by these diverse disciplines. AHED provides public, open access to astrobiology-related research data through a user-managed web portal implemented using the open-source software The Open Data Repository's (ODR) Data Publisher [1]. ODR-DP provides a user-friendly interface that research teams or individual scientists can use to design, populate and manage their own databases or laboratory notebooks according to the characteristics of their data. AHED is then a collection of databases housed in the ODR framework that store information about samples, along with associated measurements, analyses, and contextual information about the field sites where samples were collected, the instruments or equipment used for analysis, and the people and institutions involved in their collection. Advanced graphics are implemented together with advanced online tools for data analysis (e.g. R, MATLAB, Project Jupyter - http://jupyter.org). A permissions system will be put in place so that, while data are being actively collected and interpreted, they remain proprietary. A citation system will allow research data to be used and appropriately referenced by other researchers after the data are made public. This project is supported by SERA and NASA NNX11AP82A, MSL. [1] Stone et al. (2016) AGU, submitted.

  5. Data Citation Concept for CMIP6

    NASA Astrophysics Data System (ADS)

    Stockhause, M.; Toussaint, F.; Lautenschlager, M.; Lawrence, B.

    2015-12-01

    There is a broad consensus among data centers and scientific publishers on Force 11's 'Joint Declaration of Data Citation Principles'. Putting these principles into operation, however, is not always straightforward. The focus for CMIP6 data citations lies on the citation of data created by others and used in an analysis underlying an article; for such source data, an article by the data creators is usually not available ('stand-alone data publication'). The planned data citation granularities are model data (data collections containing all datasets provided for the project by a single model) and experiment data (data collections containing all datasets for a scientific experiment run by a single model). In the case of large international projects or activities like CMIP, the data is commonly stored and disseminated by multiple repositories in a federated data infrastructure such as the Earth System Grid Federation (ESGF). The individual repositories are subject to different institutional and national policies; a Data Management Plan (DMP) will define a common standard for the repositories, including data-handling procedures. Another aspect of CMIP data relevant for data citations is its dynamic nature. For such large data collections, datasets are added, revised and retracted for years before the data collection becomes stable enough for a data citation entity covering all model or simulation data. Thus, a critical issue for ESGF is data consistency, requiring thorough dataset versioning to enable identification of the data collection in the cited version. Currently, the ESGF is designed for accessing the latest dataset versions; data citation introduces the need to support older and retracted dataset versions by storing metadata even beyond data availability (data unpublished in ESGF). Apart from ESGF, other infrastructure components exist for CMIP that provide information to be connected to the CMIP6 data, e.g. ES-DOC, providing information on models and simulations, and the IPCC Data Distribution Centre (DDC), storing a subset of the data together with available metadata (ES-DOC) for long-term reuse by the interdisciplinary community. Other connections exist to standard project vocabularies, to personal identifiers (e.g. ORCID), and to data products (including provenance information).

  6. Associating clinical archetypes through UMLS Metathesaurus term clusters.

    PubMed

    Lezcano, Leonardo; Sánchez-Alonso, Salvador; Sicilia, Miguel-Angel

    2012-06-01

    Clinical archetypes are modular definitions of clinical data, expressed using standard or open constraint-based data models such as CEN EN13606 and openEHR. Increasing archetype-specification activity raises the need for techniques that associate archetypes, to support better management and user navigation in archetype repositories. This paper reports on a computational technique to generate tentative archetype associations by mapping archetypes through term clusters obtained from the UMLS Metathesaurus. The terms are used to build a bipartite graph model, and graph-connectivity measures are used to derive the associations.
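
    A minimal sketch of the bipartite-graph idea, assuming archetypes have already been mapped to UMLS term clusters; the archetype names and cluster identifiers below are illustrative only, not taken from the paper.

    # Link archetypes to the term clusters they mention, then derive
    # tentative archetype-archetype associations from shared clusters.
    from itertools import combinations
    from collections import defaultdict

    archetype_terms = {                         # archetype -> term clusters (toy)
        "openEHR-EHR-OBSERVATION.blood_pressure": {"C0005823", "C0871470"},
        "openEHR-EHR-OBSERVATION.pulse":          {"C0232117", "C0871470"},
        "openEHR-EHR-EVALUATION.diagnosis":       {"C0011900"},
    }

    associations = defaultdict(set)
    for (a1, t1), (a2, t2) in combinations(archetype_terms.items(), 2):
        shared = t1 & t2
        if shared:                              # simple connectivity measure
            associations[(a1, a2)] = shared

    for pair, shared in associations.items():
        print(pair, "linked via", shared)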

  7. Six methodological steps to build medical data warehouses for research.

    PubMed

    Szirbik, N B; Pelletier, C; Chaussalet, T

    2006-09-01

    We propose a simple methodology for heterogeneous data collection and central repository-style database design in healthcare. Our method can be used with or without other software development frameworks, and we argue that its application can save a considerable amount of implementation effort. We also believe that the method can be used in other fields of research, especially those with a strong interdisciplinary nature. The idea emerged during a healthcare research project that involved, among other things, grouping information from heterogeneous and distributed information sources. We developed the methodology from the lessons learned while building a data repository containing information about the flows of elderly patients in the UK's long-term care (LTC) system, and we explain in detail the aspects that shaped it. The methodology is defined by six steps, which can be aligned with various iterative development frameworks; here we describe its alignment with the RUP (Rational Unified Process) framework. The methodology emphasizes current trends such as early identification of critical requirements, data modelling, close and timely interaction with users and stakeholders, ontology building, quality management, and exception handling. Of special interest is the ontological-engineering aspect, which had the highest-impact effects after the project: it helped stakeholders to perform better collaborative negotiations, which brought better solutions for the overall system investigated. Insight into the problems faced by others helps lead negotiators to win-win situations. We consider that this should be the social result of any project that collects data for better decision making, leading ultimately to enhanced global outcomes.

  8. Measuring Trust: Standards for Trusted Digital Repositories

    ERIC Educational Resources Information Center

    Dryden, Jean

    2011-01-01

    Ensuring the long-term preservation and use of born-digital and digitized records of enduring value has preoccupied archivists and their cultural heritage siblings for several decades. The professional literature of the 1980s and 1990s bemoans the challenges posed by rapid technological change (and the concomitant obsolescence of hardware and…

  9. Detecting people of interest from internet data sources

    NASA Astrophysics Data System (ADS)

    Cardillo, Raymond A.; Salerno, John J.

    2006-04-01

    In previous papers, we have documented success in determining the key people of interest from a large corpus of real-world evidence. Our recent efforts focus on exploring additional domains and data sources. Internet data sources such as email, web pages, and news feeds make it easier to gather a large corpus of documents for various domains, but detecting people of interest in these sources introduces new challenges. Analyzing these massive sources magnifies entity resolution problems, and demands a storage management strategy that supports efficient algorithmic analysis and visualization techniques. This paper discusses the techniques we used in order to analyze the ENRON email repository, which are also applicable to analyzing web pages returned from our "Buddy" meta-search engine.

  10. A systematic review of online resources to support patient decision-making for full-thickness rectal prolapse surgery.

    PubMed

    Fowler, G E; Baker, D M; Lee, M J; Brown, S R

    2017-11-01

    The internet is becoming an increasingly popular resource to support patient decision-making outside of the clinical encounter. The quality of online health information is variable and largely unregulated. The aim of this study was to assess the quality of online resources to support patient decision-making for full-thickness rectal prolapse surgery. This systematic review was registered on the PROSPERO database (CRD42017058319). Searches were performed on Google and specialist decision aid repositories using a pre-defined search strategy. Sources were analysed according to three measures: (1) readability, using the Flesch-Kincaid Reading Ease score; (2) the DISCERN score; and (3) the International Patient Decision Aids Standards (IPDAS) minimum standards criteria score (IPDASi, v4.0). Overall, 95 sources were retrieved from Google and the specialist decision aid repositories. Fifty-three duplicates were removed, and 18 sources did not meet the pre-defined eligibility criteria, leaving 24 sources in the full-text analysis. The mean Flesch-Kincaid Reading Ease score (48.8 ± 15.6, range 25.2-85.3) indicated reading difficulty above that recommended for patient education materials. The overall quality of sources supporting patient decision-making for full-thickness rectal prolapse surgery was poor (median DISCERN score 1/5 ± 1.18, range 1-5), and no sources met minimum decision-making standards (median IPDASi score 5/12 ± 2.01, range 1-8). Currently, easily accessible online health information to support patient decision-making for rectal prolapse surgery is of poor quality, difficult to read and does not support shared decision-making. It is recommended that professional bodies and medical professionals seek to develop decision aids to support decision-making for full-thickness rectal prolapse surgery.
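
    For reference, the Flesch-Kincaid Reading Ease score used in the review is computed as FKRE = 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words), with higher scores indicating easier text; the sketch below uses a crude syllable heuristic rather than a validated readability tool.

    # Compute the Flesch-Kincaid Reading Ease score for a text sample.
    import re

    def count_syllables(word: str) -> int:
        # Crude heuristic: count groups of consecutive vowels.
        groups = re.findall(r"[aeiouy]+", word.lower())
        return max(1, len(groups))

    def flesch_reading_ease(text: str) -> float:
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z]+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (len(words) / sentences) \
                       - 84.6 * (syllables / len(words))

    print(round(flesch_reading_ease(
        "The surgeon repairs the prolapse. Recovery takes several weeks."), 1))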

  11. The Function Biomedical Informatics Research Network Data Repository

    PubMed Central

    Keator, David B.; van Erp, Theo G.M.; Turner, Jessica A.; Glover, Gary H.; Mueller, Bryon A.; Liu, Thomas T.; Voyvodic, James T.; Rasmussen, Jerod; Calhoun, Vince D.; Lee, Hyo Jong; Toga, Arthur W.; McEwen, Sarah; Ford, Judith M.; Mathalon, Daniel H.; Diaz, Michele; O’Leary, Daniel S.; Bockholt, H. Jeremy; Gadde, Syam; Preda, Adrian; Wible, Cynthia G.; Stern, Hal S.; Belger, Aysenil; McCarthy, Gregory; Ozyurt, Burak; Potkin, Steven G.

    2015-01-01

    The Function Biomedical Informatics Research Network (FBIRN) developed methods and tools for conducting multi-scanner functional magnetic resonance imaging (fMRI) studies. Method and tool development were based on two major goals: 1) to assess the major sources of variation in fMRI studies conducted across scanners, including instrumentation, acquisition protocols, challenge tasks, and analysis methods, and 2) to provide a distributed network infrastructure and an associated federated database to host and query large, multi-site, fMRI and clinical datasets. In the process of achieving these goals the FBIRN test bed generated several multi-scanner brain imaging data sets to be shared with the wider scientific community via the BIRN Data Repository (BDR). The FBIRN Phase 1 dataset consists of a traveling subject study of 5 healthy subjects, each scanned on 10 different 1.5 to 4 Tesla scanners. The FBIRN Phase 2 and Phase 3 datasets consist of subjects with schizophrenia or schizoaffective disorder along with healthy comparison subjects scanned at multiple sites. In this paper, we provide concise descriptions of FBIRN’s multi-scanner brain imaging data sets and details about the BIRN Data Repository instance of the Human Imaging Database (HID) used to publicly share the data. PMID:26364863

  12. HepSEQ: International Public Health Repository for Hepatitis B

    PubMed Central

    Gnaneshan, Saravanamuttu; Ijaz, Samreen; Moran, Joanne; Ramsay, Mary; Green, Jonathan

    2007-01-01

    HepSEQ is a repository for an extensive library of public health and molecular data relating to hepatitis B virus (HBV) infection collected from international sources. It is hosted by the Centre for Infections, Health Protection Agency (HPA), England, United Kingdom. This repository has been developed as a web-enabled, quality-controlled database to act as a tool for surveillance, HBV case management and research. The format of the database system allows comprehensive molecular, clinical and epidemiological data to be deposited into a functional database; the stored data can be searched and manipulated, and information on epidemiological, virological, clinical, nucleotide-sequence and mutational aspects of HBV infection can be extracted and visualized through the web front-end. Specific tools built into the database can be utilized to analyse deposited data and provide information on HBV genotype, identify mutations with known clinical significance (e.g. vaccine-escape, precore and antiviral-resistant mutations) and carry out sequence homology searches against other deposited strains. Further mechanisms are also in place to allow specific tailored searches of the database to be undertaken. PMID:17130143

  13. mdFoam+: Advanced molecular dynamics in OpenFOAM

    NASA Astrophysics Data System (ADS)

    Longshaw, S. M.; Borg, M. K.; Ramisetti, S. B.; Zhang, J.; Lockerby, D. A.; Emerson, D. R.; Reese, J. M.

    2018-03-01

    This paper introduces mdFoam+, an MPI-parallelised molecular dynamics (MD) solver implemented entirely within the OpenFOAM software framework. It is open-source and released under the same GNU General Public License (GPL) as OpenFOAM. The source code is released as a publicly open software repository that includes detailed documentation and tutorial cases. Since mdFoam+ is designed entirely within the OpenFOAM C++ object-oriented framework, it inherits a number of key features. The code is designed for extensibility and flexibility, so it is intended first and foremost as an MD research tool in which new models and test cases can be developed and tested rapidly. Implementing mdFoam+ in OpenFOAM also enables easier development of hybrid methods that couple MD with continuum-based solvers. Setting up MD cases follows the standard OpenFOAM format, as mdFoam+ also relies upon the OpenFOAM dictionary-based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available, even though the fully Lagrangian nature of an MD simulation is not typical of most OpenFOAM applications. Results show that mdFoam+ compares well with a well-known MD code (LAMMPS) on benchmark problems, while offering additional functionality that does not exist in other open-source MD codes.

  14. Creativity and Mobile Language Learning Using LingoBee

    ERIC Educational Resources Information Center

    Petersen, Sobah Abbas; Procter-Legg, Emma; Cacchione, Annamaria

    2013-01-01

    In this paper, the authors explore the ideas of mobility and creativity through the use of LingoBee, a mobile app for situated language learning. LingoBee is based on ideas from crowd-sourcing and social networking to support language learners. Learners are able to create their own content and share it with other learners through a repository. The…

  15. Synthesis of lower treeline limber pine (Pinus flexilis) woodland knowledge, research needs, and management considerations

    Treesearch

    Robert E. Means

    2011-01-01

    Lower treeline limber pine woodlands have received little attention in peer-reviewed literature and in management strategies. These ecologically distinct systems are thought to be seed repositories between discontinuous populations in the northern and central Rocky Mountains, serving as seed sources for bird dispersal between distinct mountain ranges. Their position on...

  16. Heterogeneous redox conditions, arsenic mobility, and groundwater flow in a fractured-rock aquifer near a waste repository site in New Hampshire, USA

    EPA Science Inventory

    Anthropogenic sources of carbon from landfill or waste leachate can promote reductive dissolution of in situ arsenic (As) and enhance the mobility of As in groundwater. Groundwater from residential-supply wells in a fractured crystalline-rock aquifer adjacent to a Superfund site ...

  17. ScrubChem: Cleaning of PubChem Bioassay Data to Create Diverse and Massive Bioactivity Datasets for Use in Modeling Applications (SOT)

    EPA Science Inventory

    The PubChem Bioassay database is a non-curated public repository with bioactivity data from 64 sources, including: ChEMBL, BindingDb, DrugBank, Tox21, NIH Molecular Libraries Screening Program, and various academic, government, and industrial contributors. However, this data is d...

  18. DASMiner: discovering and integrating data from DAS sources

    PubMed Central

    2009-01-01

    Background: DAS is a widely adopted protocol for providing syntactic interoperability among biological databases. The popularity of DAS is due to a simplified and elegant mechanism for data exchange in which sources expose RESTful interfaces for data access. As a growing number of DAS services become available for molecular biology resources, there is an incentive to explore this protocol in order to advance data discovery and integration among these resources. Results: We developed DASMiner, a Matlab toolkit for querying DAS data sources that enables creation of integrated biological models using the information available in DAS-compliant repositories. DASMiner is composed of a browser application and an API that work together to facilitate gathering of data from different DAS sources, which can be used for creating enriched datasets from multiple sources. The browser is used to formulate queries and navigate data contained in DAS sources. Users can execute queries against these sources in an intuitive fashion, without needing to know the specific DAS syntax for a particular source. Using the source metadata provided by the DAS Registry, the browser's layout adapts to expose only the set of commands and coordinate systems supported by the specific source. For this reason, the browser can interrogate any DAS source, independently of the type of data being served. The API component of DASMiner may be used for programmatic access to DAS sources from programs in Matlab. Once the desired data is found during navigation, the query is exported in the format of an API call to be used within any Matlab application. We illustrate the use of DASMiner by creating integrative models of histone modification maps and protein-protein interaction networks. These enriched datasets were built by retrieving and integrating distributed genomic and proteomic DAS sources using the API. Conclusion: Support for the DAS protocol allows hundreds of molecular biology databases to be treated as a federated, online collection of resources. DASMiner enables full exploration of these resources, and can be used to deploy applications and create integrated views of biological systems using the information deposited in DAS repositories. PMID:19919683
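
    A minimal sketch of a raw DAS features request of the kind DASMiner wraps, following the standard DAS URL pattern {server}/das/{source}/features?segment={id}:{start},{stop}; the server URL and segment below are placeholders, not a live source.

    # Issue a DAS features request and list the returned feature elements
    # from the DASGFF XML response.
    import requests
    import xml.etree.ElementTree as ET

    url = "https://example.org/das/hg38/features"          # placeholder source
    resp = requests.get(url, params={"segment": "1:1000000,1005000"})
    resp.raise_for_status()

    root = ET.fromstring(resp.text)
    for feature in root.iter("FEATURE"):                   # DASGFF feature tags
        print(feature.get("id"), feature.get("label"))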

  19. The NCAR Digital Asset Services Hub (DASH): Implementing Unified Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Stott, D.; Worley, S. J.; Hou, C. Y.; Nienhouse, E.

    2017-12-01

    The National Center for Atmospheric Research (NCAR) Directorate created the Data Stewardship Engineering Team (DSET) to plan and implement an integrated single entry point for uniform digital asset discovery and access across the organization, in order to improve the efficiency of access, reduce costs, and establish the foundation for interoperability with other federated systems. This effort supports new policies included in federal funding mandates, NSF data management requirements, and journal citation recommendations. An inventory during the early planning stage identified diverse asset types across the organization, including publications, datasets, metadata, models, images, and software tools and code. The NCAR Digital Asset Services Hub (DASH) is being developed and phased in this year to improve the quality of users' experiences in finding and using these assets. DASH serves to provide engagement, training, search, and support through the following four nodes. DASH Metadata: DASH provides resources for creating and cataloging metadata to the NCAR Dialect, a subset of ISO 19115. NMDEdit, an editor based on a European open-source application, has been configured for manual entry of NCAR metadata. CKAN, an open-source data portal platform, harvests these XML records (along with records output directly from databases) from a Web Accessible Folder (WAF) on GitHub for validation. DASH Search: The NCAR Dialect metadata drives cross-organization search and discovery through CKAN, which provides the display interface of search results. DASH Search will establish interoperability by facilitating metadata sharing with other federated systems. DASH Consulting: The DASH Data Curation & Stewardship Coordinator assists with Data Management (DM) Plan preparation and advises on Digital Object Identifiers. The coordinator arranges training sessions on the DASH metadata tools and DM planning, and provides one-on-one assistance as requested. DASH Repository: A repository is under development for NCAR datasets currently not held in existing lab-managed archives. The DASH repository will be under NCAR governance and meet Trustworthy Repositories Audit & Certification (TRAC) requirements. This poster will highlight the processes, lessons learned, and current status of the DASH effort at NCAR.
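
    A minimal sketch of querying a CKAN-backed catalog such as the DASH search node, using CKAN's standard package_search action; the site URL and query string are placeholders, not the live DASH endpoint.

    # Search a CKAN catalog via its Action API and print matching titles.
    import requests

    def search_datasets(site: str, query: str, rows: int = 5):
        resp = requests.get(f"{site}/api/3/action/package_search",
                            params={"q": query, "rows": rows})
        resp.raise_for_status()
        result = resp.json()["result"]
        return result["count"], [pkg["title"] for pkg in result["results"]]

    count, titles = search_datasets("https://data.example.edu", "reanalysis")
    print(count, titles)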

  20. Generating a focused view of disease ontology cancer terms for pan-cancer data integration and analysis

    PubMed Central

    Wu, Tsung-Jung; Schriml, Lynn M.; Chen, Qing-Rong; Colbert, Maureen; Crichton, Daniel J.; Finney, Richard; Hu, Ying; Kibbe, Warren A.; Kincaid, Heather; Meerzaman, Daoud; Mitraka, Elvira; Pan, Yang; Smith, Krista M.; Srivastava, Sudhir; Ward, Sari; Yan, Cheng; Mazumder, Raja

    2015-01-01

    Bio-ontologies provide terminologies for the scientific community to describe biomedical entities in a standardized manner. There are multiple initiatives that are developing biomedical terminologies for the purpose of providing better annotation, data integration and mining capabilities. Terminology resources devised for multiple purposes inherently diverge in content and structure. A major issue of biomedical data integration is the development of overlapping terms, ambiguous classifications and inconsistencies represented across databases and publications. The disease ontology (DO) was developed over the past decade to address data integration, standardization and annotation issues for human disease data. We have established a DO cancer project to be a focused view of cancer terms within the DO. The DO cancer project mapped 386 cancer terms from the Catalogue of Somatic Mutations in Cancer (COSMIC), The Cancer Genome Atlas (TCGA), International Cancer Genome Consortium, Therapeutically Applicable Research to Generate Effective Treatments, Integrative Oncogenomics and the Early Detection Research Network into a cohesive set of 187 DO terms represented by 63 top-level DO cancer terms. For example, the COSMIC term ‘kidney, NS, carcinoma, clear_cell_renal_cell_carcinoma’ and TCGA term ‘Kidney renal clear cell carcinoma’ were both grouped to the term ‘Disease Ontology Identification (DOID):4467 / renal clear cell carcinoma’ which was mapped to the TopNodes_DOcancerslim term ‘DOID:263 / kidney cancer’. Mapping of diverse cancer terms to DO and the use of top level terms (DO slims) will enable pan-cancer analysis across datasets generated from any of the cancer term sources where pan-cancer means including or relating to all or multiple types of cancer. The terms can be browsed from the DO web site (http://www.disease-ontology.org) and downloaded from the DO’s Apache Subversion or GitHub repositories. Database URL: http://www.disease-ontology.org PMID:25841438
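
    A minimal sketch of the two-level mapping described above, using the kidney-cancer example from the abstract; the dictionaries are trimmed to that single example for illustration.

    # Map source-specific cancer terms to a DO term, then roll up to a
    # top-level "DO slim" term for pan-cancer grouping.
    source_to_doid = {
        ("COSMIC", "kidney, NS, carcinoma, clear_cell_renal_cell_carcinoma"):
            "DOID:4467",   # renal clear cell carcinoma
        ("TCGA", "Kidney renal clear cell carcinoma"):
            "DOID:4467",
    }
    doid_to_slim = {"DOID:4467": "DOID:263"}   # kidney cancer (top-level slim)

    def pan_cancer_group(source: str, term: str) -> str:
        return doid_to_slim[source_to_doid[(source, term)]]

    print(pan_cancer_group("TCGA", "Kidney renal clear cell carcinoma"))
    # -> DOID:263, so the COSMIC and TCGA records land in the same slim bucket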

  1. Open-source tools for data mining.

    PubMed

    Zupan, Blaz; Demsar, Janez

    2008-03-01

    With a growing volume of biomedical databases and repositories, the need to develop a set of tools to address their analysis and support knowledge discovery is becoming acute. The data mining community has developed a substantial set of techniques for computational treatment of these data. In this article, we discuss the evolution of open-source toolboxes that data mining researchers and enthusiasts have developed over the span of a few decades and review several currently available open-source data mining suites. The approaches we review are diverse in data mining methods and user interfaces and also demonstrate that the field and its tools are ready to be fully exploited in biomedical research.

  2. Salmonid Gamete Preservation in the Snake River Basin, Annual Report 2002.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, William; Kucera, Paul

    2003-07-01

    In spite of an intensive management effort, chinook salmon (Oncorhynchus tshawytscha) and steelhead (Oncorhynchus mykiss) populations in the Northwest have not recovered and are currently listed as threatened species under the Endangered Species Act. In addition to the loss of diversity from stocks that have already gone extinct, decreased genetic diversity resulting from genetic drift and inbreeding is a major concern. Reduced population and genetic variability diminishes the environmental adaptability of individual species and entire ecological communities. The Nez Perce Tribe (NPT), in cooperation with Washington State University and the University of Idaho, established a germplasm repository in 1992 in order to preserve the remaining salmonid diversity in the region. The germplasm repository provides long-term storage for cryopreserved gametes. Although only male gametes can be cryopreserved, conserving the male component of genetic diversity will maintain future management options for species recovery. NPT efforts have focused on preserving salmon and steelhead gametes from the major river subbasins in the Snake River basin. However, the repository is available for all management agencies to contribute gamete samples from other regions and species. In 2002 a total of 570 viable semen samples were added to the germplasm repository. This included the gametes of 287 chinook salmon from the Lostine River, Catherine Creek, upper Grande Ronde River, Imnaha River (Lookingglass Hatchery), Lake Creek, South Fork Salmon River, Johnson Creek, Big Creek, Capehorn Creek, Marsh Creek, Pahsimeroi River (Pahsimeroi Hatchery), and upper Salmon River (Sawtooth Hatchery) and the gametes of 280 steelhead from the North Fork Clearwater River (Dworshak Hatchery), Fish Creek, Little Sheep Creek, Pahsimeroi River (Pahsimeroi Hatchery) and Snake River (Oxbow Hatchery). In addition, gametes from 60 Yakima River spring chinook and 34 Wenatchee River coho salmon were added to the repository by the Washington Department of Fish and Wildlife and the Columbia River Intertribal Fish Commission, respectively. To date, a total of 3,928 Columbia River salmon and steelhead gamete samples and three Kootenai River white sturgeon samples are preserved in the repository. Samples are stored in independent locations at the University of Idaho (UI) and Washington State University (WSU).

  3. The Belgian repository of fundamental atomic data and stellar spectra (BRASS). I. Cross-matching atomic databases of astrophysical interest

    NASA Astrophysics Data System (ADS)

    Laverick, M.; Lobel, A.; Merle, T.; Royer, P.; Martayan, C.; David, M.; Hensberge, H.; Thienpont, E.

    2018-04-01

    Context. Fundamental atomic parameters, such as oscillator strengths, play a key role in modelling and understanding the chemical composition of stars in the Universe. Despite the significant work underway to produce these parameters for many astrophysically important ions, uncertainties in these parameters remain large and can propagate throughout the entire field of astronomy. Aims: The Belgian repository of fundamental atomic data and stellar spectra (BRASS) aims to provide the largest systematic and homogeneous quality assessment of atomic data to date in terms of wavelength, atomic and stellar parameter coverage. To prepare for it, we first compiled multiple literature occurrences of many individual atomic transitions from several atomic databases of astrophysical interest and assessed their agreement. In a second step, synthetic spectra will be compared against extremely high-quality observed spectra for a large number of BAFGK spectral-type stars, in order to critically evaluate the atomic data of a large number of important stellar lines. Methods: Several atomic repositories were searched and their data retrieved and formatted in a consistent manner. Data entries from all repositories were cross-matched against our initial BRASS atomic line list to find multiple occurrences of the same transition. Where possible we used a new non-parametric cross-match depending only on electronic configurations and total angular momentum values. We also checked for duplicate entries of the same physical transition within each retrieved repository, using the non-parametric cross-match. Results: We report on the number of cross-matched transitions for each repository and compare their fundamental atomic parameters. We find differences in log(gf) values of up to 2 dex or more. We also find and report that 2% of our line list and Vienna Atomic Line Database retrievals are composed of duplicate transitions. Finally we provide a number of examples of atomic spectral lines with different retrieved literature log(gf) values, and discuss the impact of these uncertain log(gf) values on quantitative spectroscopy. All cross-matched atomic data and duplicate transition pairs are available to download at http://brass.sdf.org
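
    A minimal sketch of a non-parametric cross-match of this kind, keyed only on electronic configurations and total angular momentum (J) values and then comparing log(gf); the transition records below are invented for illustration.

    # Key transitions on (lower config, upper config, J_lower, J_upper)
    # and compare log(gf) across two repositories.
    from collections import defaultdict

    def key(t):
        return (t["config_lower"], t["config_upper"], t["J_lower"], t["J_upper"])

    repo_a = [{"config_lower": "3d6.4s2", "config_upper": "3d6.4s.4p",
               "J_lower": 4.0, "J_upper": 5.0, "loggf": -1.32}]
    repo_b = [{"config_lower": "3d6.4s2", "config_upper": "3d6.4s.4p",
               "J_lower": 4.0, "J_upper": 5.0, "loggf": -1.10}]

    index = defaultdict(list)
    for t in repo_b:
        index[key(t)].append(t)

    for t in repo_a:
        for match in index[key(t)]:
            print(f"matched transition, delta log(gf) = "
                  f"{abs(t['loggf'] - match['loggf']):.2f}")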

  4. The SeaView EarthCube project: Lessons Learned from Integrating Across Repositories

    NASA Astrophysics Data System (ADS)

    Diggs, S. C.; Stocks, K. I.; Arko, R. A.; Kinkade, D.; Shepherd, A.; Olson, C. J.; Pham, A.

    2017-12-01

    SeaView is an NSF-funded EarthCube Integrative Activity Project working with 5 existing data repositories* to provide oceanographers with highly integrated thematic data collections in user-requested formats. The project has three complementary goals: Supporting Scientists: SeaView targets scientists' need for easy access to data of interest that are ready to import into their preferred tool. Strengthening Repositories: By integrating data from multiple repositories for science use, SeaView is helping the ocean data repositories align their data and processes and make ocean data more accessible and easily integrated. Informing EarthCube (earthcube.org): SeaView's experience as an integration demonstration can inform the larger NSF EarthCube architecture and design effort. The challenges faced in this small-scale effort are informative to geosciences cyberinfrastructure more generally. Here we focus on the lessons learned that may inform other data facilities and integrative architecture projects. (The SeaView data collections will be presented at the Ocean Sciences 2018 meeting.) One example is the importance of shared semantics, with persistent identifiers, for key integration elements across the data sets (e.g., cruise, parameter, and project/program). These must allow for revision through time and should have an agreed authority or process for resolving conflicts: aligning identifiers and correcting errors were time-consuming and often required both deep domain knowledge and "back end" knowledge of the data facilities. Another example is the need for robust provenance, and for tools that support automated or semi-automated data transform pipelines that capture provenance. Multiple copies and versions of data are now flowing into repositories, and onward to long-term archives such as NOAA NCEI and umbrella portals such as DataONE. Exact copies can be identified with hashes (for those that have the skills), but it can be painfully difficult to understand the processing or format changes that differentiate versions. As more sensors are deployed and data re-use increases, this will only become more challenging. We will discuss these and additional lessons learned, as well as invite discussion and solutions from others doing similar work. * BCO-DMO, CCHDO, OBIS, OOI, R2R
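
    As a minimal sketch of the hash-based identification of exact copies mentioned above (not SeaView's actual tooling), the following computes a SHA-256 digest per file: byte-identical copies share a digest, while any processing or format change yields a different one, which is exactly why hashes alone cannot explain how versions differ.

```python
# Compute a content digest per file so exact duplicates can be recognised
# across repositories without comparing the files byte by byte.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so arbitrarily large data files fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage: identical digests imply exact copies; differing digests say nothing
# about *what* changed -- recovering that requires provenance metadata.
# sha256_of("cruise_ctd_profile.nc")  # hypothetical file name
```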

  5. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  6. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  7. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  8. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  9. Improved data retrieval from TreeBASE via taxonomic and linguistic data enrichment

    PubMed Central

    Anwar, Nadia; Hunt, Ela

    2009-01-01

    Background TreeBASE, the only data repository for phylogenetic studies, is not being used effectively because it does not meet the taxonomic data retrieval requirements of the systematics community. We show, through an examination of the queries performed on TreeBASE, that data retrieval using taxon names is unsatisfactory. Results We report on a new wrapper supporting taxon queries on TreeBASE by utilising a Taxonomy and Classification Database (TCl-Db) we created. TCl-Db holds merged and consolidated taxonomic names from multiple data sources and can be used to translate hierarchical, vernacular and synonym queries into specific query terms in TreeBASE. The query expansion supported by TCl-Db yields a very significant improvement in information retrieval quality. The wrapper can be accessed at the URL. The methodology we developed is scalable and can be applied to new data as they become available in the future. Conclusion Significantly improved data retrieval quality is shown for all queries, and additional flexibility is achieved via user-driven taxonomy selection. PMID:19426482
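
    TCl-Db itself is not reproduced here; the toy sketch below illustrates the general idea of query expansion, translating vernacular, synonym and hierarchical queries into the canonical taxon names a repository indexes. The mapping tables are invented for illustration.

```python
# Toy query expansion: map a user's query to canonical taxon names before
# querying the repository, so vernacular names, synonyms and higher taxa
# all retrieve the relevant studies.
SYNONYMS = {
    "fruit fly": ["Drosophila melanogaster"],
    "felis domesticus": ["Felis catus"],
}
HIERARCHY = {
    "felidae": ["Felis catus", "Panthera leo"],  # family -> member taxa
}

def expand(query):
    q = query.strip().lower()
    terms = set(SYNONYMS.get(q, [])) | set(HIERARCHY.get(q, []))
    return sorted(terms) if terms else [query]

print(expand("fruit fly"))  # ['Drosophila melanogaster']
print(expand("Felidae"))    # ['Felis catus', 'Panthera leo']
```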

  10. iAnn: an event sharing platform for the life sciences.

    PubMed

    Jimenez, Rafael C; Albar, Juan P; Bhak, Jong; Blatter, Marie-Claude; Blicher, Thomas; Brazas, Michelle D; Brooksbank, Cath; Budd, Aidan; De Las Rivas, Javier; Dreyer, Jacqueline; van Driel, Marc A; Dunn, Michael J; Fernandes, Pedro L; van Gelder, Celia W G; Hermjakob, Henning; Ioannidis, Vassilios; Judge, David P; Kahlem, Pascal; Korpelainen, Eija; Kraus, Hans-Joachim; Loveland, Jane; Mayer, Christine; McDowall, Jennifer; Moran, Federico; Mulder, Nicola; Nyronen, Tommi; Rother, Kristian; Salazar, Gustavo A; Schneider, Reinhard; Via, Allegra; Villaveces, Jose M; Yu, Ping; Schneider, Maria V; Attwood, Teresa K; Corpas, Manuel

    2013-08-01

    We present iAnn, an open source community-driven platform for dissemination of life science events, such as courses, conferences and workshops. iAnn allows automatic visualisation and integration of customised event reports. A central repository lies at the core of the platform: curators add submitted events, and these are subsequently accessed via web services. Thus, once an iAnn widget is incorporated into a website, it permanently shows timely, relevant information as if it were native to the remote site. At the same time, announcements submitted to the repository are automatically disseminated to all portals that query the system. To facilitate the visualisation of announcements, iAnn provides powerful filtering options and views, integrated with Google Maps and Google Calendar. All iAnn widgets are freely available at http://iann.pro/iannviewer. Contact: manuel.corpas@tgac.ac.uk.

  11. An infrastructure for ontology-based information systems in biomedicine: RICORDO case study.

    PubMed

    Wimalaratne, Sarala M; Grenon, Pierre; Hoehndorf, Robert; Gkoutos, Georgios V; de Bono, Bernard

    2012-02-01

    The article presents an infrastructure for supporting the semantic interoperability of biomedical resources based on the management (storing and inference-based querying) of their ontology-based annotations. This infrastructure consists of: (i) a repository to store and query ontology-based annotations; (ii) a knowledge base server with an inference engine to support the storage of, and reasoning over, ontologies used in the annotation of resources; (iii) a set of applications and services allowing interaction with the integrated repository and knowledge base. The infrastructure is being prototyped, developed and evaluated by the RICORDO project in support of the knowledge management of biomedical resources, including physiology and pharmacology models and associated clinical data. The RICORDO toolkit and its source code are freely available from http://ricordo.eu/relevant-resources. Contact: sarala@ebi.ac.uk.

  12. Compilation of climate data from heterogeneous networks across the Hawaiian Islands

    Treesearch

    Ryan J. Longman; Thomas W. Giambelluca; Michael A. Nullet; Abby G. Frazier; Kevin Kodama; Shelley D. Crausbay; Paul D. Krushelnycky; Susan Cordell; Martyn P. Clark; Andy J. Newman; Jeffrey R. Arnold

    2018-01-01

    Long-term, accurate observations of atmospheric phenomena are essential for a myriad of applications, including historic and future climate assessments, resource management, and infrastructure planning. In Hawai‘i, climate data are available from individual researchers, local, State, and Federal agencies, and from large electronic repositories such as the National...

  13. Using a Combination of UML, C2RM, XML, and Metadata Registries to Support Long-Term Development/Engineering

    DTIC Science & Technology

    2003-01-01

    [Figure residue; the recoverable content is a list of XML specifications: authentication (XCBF), authorisation (XACML, SAML), privacy (P3P), digital rights management (XrML), content management (DASL, WebDAV), content syndication, ebXML Registry/Repository, BPSS, eCommerce XML/EDI, Universal Business Language (UBL), and human resources (HR-XML).]

  14. Applying probabilistic temporal and multisite data quality control methods to a public health mortality registry in Spain: a systematic approach to quality control of repositories.

    PubMed

    Sáez, Carlos; Zurriaga, Oscar; Pérez-Panadés, Jordi; Melchor, Inma; Robles, Montserrat; García-Gómez, Juan M

    2016-11-01

    To assess the variability in data distributions among data sources and over time through a case study of a large multisite repository, as a systematic approach to data quality (DQ). Novel probabilistic DQ control methods based on information theory and geometry are applied to the Public Health Mortality Registry of the Region of Valencia, Spain, with 512 143 entries from 2000 to 2012, disaggregated into 24 health departments. The methods provide DQ metrics and exploratory visualizations for (1) assessing the variability among multiple sources and (2) monitoring and exploring changes with time. The methods are suited to big data and to multitype, multivariate, and multimodal data. The repository was partitioned into 2 probabilistically separated temporal subgroups following a change in the Spanish National Death Certificate in 2009. Isolated temporal anomalies were detected, caused by point increases in missing data, along with outlying and clustered health departments attributable to differences in populations or in practices. Changes in protocols, differences in populations, biased practices, and other systematic DQ problems affected data variability. Even if semantic and integration aspects are addressed in data-sharing infrastructures, probabilistic variability may still be present. Solutions include fixing or excluding data and analyzing different sites or time periods separately. A systematic approach to assessing temporal and multisite variability is proposed. Multisite and temporal variability in data distributions affects DQ, hindering data reuse, and an assessment of such variability should be a part of systematic DQ procedures. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
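
    The exact metrics are not given in the abstract; one standard information-theoretic choice consistent with the description is the Jensen-Shannon distance between the value distributions of two sites or periods, sketched below with invented cause-of-death counts (scipy's jensenshannon returns the square root of the divergence).

```python
# Compare the distribution of one categorical variable between two sites:
# a distance of 0 means identical distributions and 1 (base 2) maximally
# different, flagging candidates for data-quality review.
import numpy as np
from scipy.spatial.distance import jensenshannon

def distribution(values, categories):
    counts = np.array([values.count(c) for c in categories], dtype=float)
    return counts / counts.sum()

categories = ["cardiovascular", "cancer", "external", "other"]
site_a = ["cancer"] * 40 + ["cardiovascular"] * 35 + ["other"] * 25
site_b = ["cancer"] * 20 + ["cardiovascular"] * 30 + ["external"] * 25 + ["other"] * 25

p, q = distribution(site_a, categories), distribution(site_b, categories)
print(f"Jensen-Shannon distance: {jensenshannon(p, q, base=2):.3f}")
```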

  15. Utah Heavy Oil Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Bauman; S. Burian; M. Deo

    The Utah Heavy Oil Program (UHOP) was established in June 2006 to provide multidisciplinary research support to federal and state constituents for addressing the wide-ranging issues surrounding the creation of an industry for unconventional oil production in the United States. Additionally, UHOP was to serve as an ongoing source of unbiased information to the nation surrounding technical, economic, legal and environmental aspects of developing heavy oil, oil sands, and oil shale resources. UHOP fulfilled its role by completing three tasks. First, in response to the Energy Policy Act of 2005 Section 369(p), UHOP published an update report to the 1987 technical and economic assessment of domestic heavy oil resources that was prepared by the Interstate Oil and Gas Compact Commission. The UHOP report, entitled 'A Technical, Economic, and Legal Assessment of North American Heavy Oil, Oil Sands, and Oil Shale Resources', was published in electronic and hard copy form in October 2007. Second, UHOP developed a comprehensive, publicly accessible online repository of unconventional oil resources in North America based on the DSpace software platform. An interactive map was also developed as a source of geospatial information and as a means to interact with the repository from a geospatial setting. All documents uploaded to the repository are fully searchable by author, title, and keywords. Third, UHOP sponsored five research projects related to unconventional fuels development. Two projects looked at issues associated with oil shale production, including oil shale pyrolysis kinetics, resource heterogeneity, and reservoir simulation. One project evaluated in situ production from Utah oil sands. Another project focused on water availability and produced-water treatments. The last project considered commercial oil shale leasing from a policy, environmental, and economic perspective.

  16. Micromechanical processes in consolidated granular salt

    DOE PAGES

    Mills, Melissa Marie; Stormont, John C.; Bauer, Stephen J.

    2018-03-27

    Here, granular salt is likely to be used as backfill material and a seal system component within geologic salt formations serving as a repository for long-term isolation of nuclear waste. Pressure from closure of the surrounding salt formation will promote consolidation of granular salt, eventually resulting in properties comparable to native salt. Understanding the dependence of consolidation processes on stress state, moisture availability, temperature, and time is important for demonstrating sealing functions and long-term repository performance. This study characterizes laboratory-consolidated granular salt by means of microstructural observations. Granular salt material from mining operations was obtained from the bedded Salado Formation hosting the Waste Isolation Pilot Plant and from the Avery Island salt dome. Laboratory test conditions included hydrostatic consolidation of jacketed granular salt with varying conditions of confining isochoric stress to 38 MPa, temperature to 250 °C, moisture additions of 1% by weight, time duration, and vented and non-vented states. Resultant porosities ranged between 1% and 22%. Optical and scanning electron microscopic techniques were used to ascertain consolidation mechanisms. From these investigations, samples with 1% added moisture, or unvented during consolidation, exhibit clear pressure-solution processes with tightly cohered grain boundaries and occluded fluid pores. Samples with only natural moisture content consolidated by a combination of brittle, cataclastic, and crystal-plastic deformation. Recrystallization at 250 °C, irrespective of moisture conditions, was also observed. The range and variability of conditions applied in this study, combined with the techniques used to display microstructural features, are unique, and provide insight into the governing deformation mechanisms occurring within salt repository applications.

  18. Data Safe Havens in health research and healthcare.

    PubMed

    Burton, Paul R; Murtagh, Madeleine J; Boyd, Andy; Williams, James B; Dove, Edward S; Wallace, Susan E; Tassé, Anne-Marie; Little, Julian; Chisholm, Rex L; Gaye, Amadou; Hveem, Kristian; Brookes, Anthony J; Goodwin, Pat; Fistein, Jon; Bobrow, Martin; Knoppers, Bartha M

    2015-10-15

    The data that put the 'evidence' into 'evidence-based medicine' are central to developments in public health and primary and hospital care. A fundamental challenge is to site such data in repositories that can easily be accessed under appropriate technical and governance controls which are effectively audited and are viewed as trustworthy by diverse stakeholders. This demands socio-technical solutions that may easily become enmeshed in protracted debate and controversy as they encounter the norms, values, expectations and concerns of diverse stakeholders. In this context, the development of what are called 'Data Safe Havens' has been crucial. Unfortunately, the origins and evolution of the term have led to a range of different definitions being assumed by different groups. There is, however, an intuitively meaningful interpretation that is often assumed by those who have not previously encountered the term: a repository in which useful but potentially sensitive data may be kept securely under governance and informatics systems that are fit-for-purpose and appropriately tailored to the nature of the data being maintained, and may be accessed and utilized by legitimate users undertaking work and research contributing to biomedicine, health and/or the ongoing development of healthcare systems. This review explores a fundamental question: what are the specific criteria that ought reasonably to be met by a data repository if it is to be seen as consistent with this interpretation and viewed as worthy of being accorded the status of 'Data Safe Haven' by key stakeholders? We propose 12 such criteria. Contact: paul.burton@bristol.ac.uk. © The Author 2015. Published by Oxford University Press.

  20. Providing Multi-Page Data Extraction Services with XWRAPComposer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ling; Zhang, Jianjun; Han, Wei

    2008-04-30

    Dynamic Web data sources – sometimes known collectively as the Deep Web – increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DYNABOT, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DYNABOT has three unique characteristics. First, DYNABOT utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DYNABOT employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DYNABOT incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.

  1. 40 CFR 124.33 - Information repository.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Information repository. 124.33 Section... FOR DECISIONMAKING Specific Procedures Applicable to RCRA Permits § 124.33 Information repository. (a... basis, for an information repository. When assessing the need for an information repository, the...

  2. 10 CFR 60.130 - General considerations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository Operations Area § 60.130 General... for a high-level radioactive waste repository at a geologic repository operations area, and an... geologic repository operations area, must include the principal design criteria for a proposed facility...

  3. Utility of the Department of Defense Serum Repository in Assessing Deployment Exposure.

    PubMed

    Lushniak, Boris; Mallon, Col Timothy M; Gaydos, Joel C; Smith, David J

    2016-08-01

    This paper describes the rationale for the research project: demonstrating the utility of the Department of Defense Serum Repository in addressing deployment environmental exposures. The history of deployment exposure surveillance is reviewed, along with the rationale for developing validated biomarkers detectable in sera from post-deployment samples and compared with non-deployed controls. The goal was to find validated biomarkers that are associated with both exposures and health outcomes. The articles in this supplement describe novel serum biomarkers that were found to be associated with deployment exposures and weakly associated with some health outcomes. Future research must continue to validate the use of serum biomarkers when operational contingencies prevent the gold-standard collection of real-time breathing-zone samples in deployed service members.

  4. Forecasting turning trends in knowledge networks

    NASA Astrophysics Data System (ADS)

    Szántó-Várnagy, Ádám; Farkas, Illés J.

    2018-10-01

    A large portion of our collective human knowledge is held in electronic repositories. These repositories range from "hard fact" databases (e.g., scientific publications and patents) to "soft" knowledge such as news portals. The common denominator between them all is that they can be thought of in terms of topics/keywords. The interest in these topics is constantly changing over time. With the most straightforward simplification, their frequency-of-occurrence diagrams can be used for effective prediction. In this paper, we use these diagrams to produce simple and human-readable rules that are able to predict the future trends of the most important keywords in 5 data sets of different types. We also present a thorough analysis of the necessary input variables and parameters and their relation to the success rate.
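
    The paper derives its rules from data; purely to illustrate what a "simple and human-readable" rule over a frequency-of-occurrence diagram can look like, the sketch below compares recent and earlier mean frequencies against an invented threshold.

```python
# Rule-of-thumb trend forecast from a keyword's yearly frequency series:
# compare the mean of the last k points with the mean of the k before them.
def trend_rule(freqs, k=3, eps=0.05):
    recent = sum(freqs[-k:]) / k
    earlier = sum(freqs[-2 * k:-k]) / k
    change = (recent - earlier) / max(earlier, 1e-9)
    if change > eps:
        return "rising"
    if change < -eps:
        return "falling"
    return "flat"

print(trend_rule([12, 14, 15, 18, 22, 27]))  # rising
print(trend_rule([30, 28, 27, 20, 16, 11]))  # falling
```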

  5. 48 CFR 227.7108 - Contractor data repositories.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Technical Data 227.7108 Contractor data repositories. (a) Contractor data repositories may be established... procedures for protecting technical data delivered to or stored at the repository from unauthorized release... disclosure of technical data from the repository to third parties consistent with the Government's rights in...

  6. High Res at High Speed: Automated Delivery of High-Resolution Images from Digital Library Collections

    ERIC Educational Resources Information Center

    Westbrook, R. Niccole; Watkins, Sean

    2012-01-01

    As primary source materials in the library are digitized and made available online, the focus of related library services is shifting to include new and innovative methods of digital delivery via social media, digital storytelling, and community-based and consortial image repositories. Most images on the Web are not of sufficient quality for most…

  7. A method and software framework for enriching private biomedical sources with data from public online repositories.

    PubMed

    Anguita, Alberto; García-Remesal, Miguel; Graf, Norbert; Maojo, Victor

    2016-04-01

    Modern biomedical research relies on the semantic integration of heterogeneous data sources to find data correlations. Researchers access multiple datasets of disparate origin and identify elements (e.g., genes, compounds, pathways) that lead to interesting correlations. Normally, they must refer to additional public databases in order to enrich the information about the identified entities (e.g., scientific literature, published clinical trial results, etc.). While semantic integration techniques have traditionally focused on providing homogeneous access to private datasets (thus helping automate the first part of this research), and different solutions exist for browsing public data, there is still a need for tools that facilitate merging public repositories with private datasets. This paper presents a framework that automatically locates public data of interest to the researcher and semantically integrates it with existing private datasets. The framework has been designed as an extension of traditional data integration systems and has been validated with an existing data integration platform from a European research project by integrating a private biological dataset with data from the National Center for Biotechnology Information (NCBI). Copyright © 2016 Elsevier Inc. All rights reserved.
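
    The framework's internals are not given above; as a hedged sketch of the "enrich a private record from a public repository" step, here is a minimal lookup against NCBI's public E-utilities REST API (requires network access; the private record structure is invented and is not the paper's data model).

```python
# Fetch a public gene summary from NCBI E-utilities and attach it to a
# private record keyed by an NCBI Gene identifier.
import json
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi"

def ncbi_gene_summary(gene_id):
    params = urllib.parse.urlencode({"db": "gene", "id": gene_id,
                                     "retmode": "json"})
    with urllib.request.urlopen(f"{EUTILS}?{params}") as resp:
        return json.load(resp)["result"][str(gene_id)]

private_record = {"gene_id": 7157, "cohort": "trial-A"}  # 7157 = TP53
private_record["annotation"] = ncbi_gene_summary(private_record["gene_id"])["description"]
print(private_record["annotation"])  # 'tumor protein p53'
```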

  8. Accelerated Weathering of Fluidized Bed Steam Reformation Material Under Hydraulically Unsaturated Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierce, Eric M.

    2007-09-16

    To predict the long-term fate of low- and high-level waste forms in the subsurface over geologic time scales, it is important to understand the behavior of the corroding waste forms under conditions that mimic the open flow and transport properties of a subsurface repository. Fluidized bed steam reformation (FBSR), a supplemental treatment technology option, is being considered as a waste form for the immobilization of low-activity tank waste. To obtain the fundamental information needed to evaluate the behavior of the FBSR waste form under repository-relevant conditions and to monitor the long-term behavior of this material, an accelerated weathering experiment is being conducted with the pressurized unsaturated flow (PUF) apparatus. Unlike other accelerated weathering test methods (product consistency test, vapor hydration test, and drip test), PUF experiments are conducted under hydraulically unsaturated conditions. These experiments are unique because they mimic the vadose zone environment and allow the corroding waste form to achieve its final reaction state. Results from this on-going experiment suggest the volumetric water content varied as a function of time and reached steady state after 160 days of testing. Unlike the volumetric water content, periodic excursions in the solution pH and electrical conductivity have been occurring consistently during the test. Release of elements from the column illustrates a general trend of decreasing concentration with increasing reaction time. Normalized concentrations of K, Na, P, Re (a chemical analogue for 99Tc), and S are as much as 1 × 10⁴ times greater than those of Al, Cr, Si, and Ti. After more than 600 days of testing, the solution chemistry data collected to date illustrate the importance of understanding the long-term behavior of the FBSR product under conditions that mimic the open flow and transport properties of a subsurface repository.
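
    Normalized concentrations are reported above without a definition; under the usual waste-form testing convention (effluent concentration divided by the element's mass fraction in the waste form, an assumption here rather than a statement of this study's procedure), the calculation is a one-liner. All values below are invented.

```python
# Normalized concentration NC_i = C_i / f_i, where C_i is the effluent
# concentration (g/L) and f_i the element's mass fraction in the waste form,
# so major and trace elements can be compared on a common scale.
def normalized_concentration(conc_g_per_l, mass_fraction):
    return conc_g_per_l / mass_fraction

effluent = {"Na": 1.2e-2, "Re": 4.0e-7, "Al": 3.0e-5}  # g/L (illustrative)
waste_form = {"Na": 0.15, "Re": 5.0e-6, "Al": 0.08}    # g/g (illustrative)

for element in effluent:
    nc = normalized_concentration(effluent[element], waste_form[element])
    print(f"{element}: NC = {nc:.2e} g/L")
```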

  9. Multi-dimensional transport modelling of corrosive agents through a bentonite buffer in a Canadian deep geological repository.

    PubMed

    Briggs, Scott; McKelvie, Jennifer; Sleep, Brent; Krol, Magdalena

    2017-12-01

    The use of a deep geological repository (DGR) for the long-term disposal of used nuclear fuel is an approach currently being investigated by several agencies worldwide, including Canada's Nuclear Waste Management Organization (NWMO). Within the DGR, used nuclear fuel will be placed in copper-coated steel containers and surrounded by a bentonite clay buffer. While copper is generally thermodynamically stable, corrosion can occur due to the presence of sulphide under anaerobic conditions. As such, understanding transport of sulphide through the engineered barrier system to the used fuel container is an important consideration in DGR design. In this study, a three-dimensional (3D) model of sulphide transport in a DGR was developed. The numerical model is implemented in COMSOL Multiphysics, a commercial finite element software package. Previous sulphide transport models of the NWMO repository used a simplified one-dimensional system. This work illustrates the importance of 3D modelling in capturing non-uniform effects: results showed that the maximum local sulphide flux to the used fuel container is 1.7 times higher than the average flux. Copyright © 2017. Published by Elsevier B.V.

  10. Storing, Browsing, Querying, and Sharing Data: the THREDDS Data Repository (TDR)

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Lindholm, D.; Baltzer, T.

    2005-12-01

    The Unidata Internet Data Distribution (IDD) network delivers gigabytes of data per day in near real time to sites across the U.S. and beyond. The THREDDS Data Server (TDS) supports public browsing of metadata and data access via OPeNDAP-enabled URLs for datasets such as these. With such large quantities of data, sites generally employ a simple data management policy, keeping the data for a relatively short term, on the order of hours to perhaps a week or two. In order to save interesting data in longer-term storage and make it available for sharing, a user must move the data herself. In this case the user is responsible for determining where space is available, executing the data movement, generating any desired metadata, and setting access control to enable sharing. This task sequence is generally based on execution of a series of low-level, operating-system-specific commands with significant user involvement. The LEAD (Linked Environments for Atmospheric Discovery) project is building a cyberinfrastructure to support research and education in mesoscale meteorology. LEAD orchestrations require large, robust, and reliable storage with speedy access to stage data and store both intermediate and final results. These requirements suggest storage solutions that involve distributed storage, replication, and interfacing to archival storage systems such as mass storage systems and tape or removable disks. LEAD requirements also include metadata generation and access in order to support querying. In support of both THREDDS and LEAD requirements, Unidata is designing and prototyping the THREDDS Data Repository (TDR), a framework for a modular data repository to support distributed data storage and retrieval using a variety of back-end storage media and interchangeable software components. The TDR interface will provide high-level abstractions for long-term storage; controlled, fast and reliable access; and data movement capabilities via a variety of technologies such as OPeNDAP and GridFTP. The modular structure will allow substitution of software components so that both simple and complex storage media can be integrated into the repository. It will also allow integration of different varieties of supporting software. For example, if replication is desired, replica management could be handled via a simple hash table or a complex solution such as the Replica Location Service (RLS). In order to ensure that metadata are available for all the data in the repository, the TDR will also generate THREDDS metadata when necessary. Users will be able to establish levels of access control to their metadata and data. Coupled with a THREDDS Data Server, both browsing via THREDDS catalogs and querying capabilities will be supported. This presentation will describe the motivating factors, current status, and future plans of the TDR. References: IDD: http://www.unidata.ucar.edu/content/software/idd/index.html THREDDS: http://www.unidata.ucar.edu/content/projects/THREDDS/tech/server/ServerStatus.html LEAD: http://lead.ou.edu/ RLS: http://www.isi.edu/~annc/papers/chervenakRLSjournal05.pdf

  11. 17 CFR 49.12 - Swap data repository recordkeeping requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 17 Commodity and Securities Exchanges 1 2012-04-01 2012-04-01 false Swap data repository... COMMISSION SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of part...

  12. 17 CFR 49.12 - Swap data repository recordkeeping requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 17 Commodity and Securities Exchanges 1 2013-04-01 2013-04-01 false Swap data repository... COMMISSION SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of part...

  13. 17 CFR 49.12 - Swap data repository recordkeeping requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 17 Commodity and Securities Exchanges 2 2014-04-01 2014-04-01 false Swap data repository... COMMISSION (CONTINUED) SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of...

  14. 10 CFR 63.112 - Requirements for preclosure safety analysis of the geologic repository operations area.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... geologic repository operations area. 63.112 Section 63.112 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Technical... repository operations area. The preclosure safety analysis of the geologic repository operations area must...

  15. Managing and Evaluating Digital Repositories

    ERIC Educational Resources Information Center

    Zuccala, Alesia; Oppenheim, Charles; Dhiensa, Rajveen

    2008-01-01

    Introduction: We examine the role of the digital repository manager, discuss the future of repository management and evaluation and suggest that library and information science schools develop new repository management curricula. Method: Face-to-face interviews were carried out with managers of five different types of repositories and a Web-based…

  16. jPOSTrepo: an international standard data repository for proteomes

    PubMed Central

    Okuda, Shujiro; Watanabe, Yu; Moriya, Yuki; Kawano, Shin; Yamamoto, Tadashi; Matsumoto, Masaki; Takami, Tomoyo; Kobayashi, Daiki; Araki, Norie; Yoshizawa, Akiyasu C.; Tabata, Tsuyoshi; Sugiyama, Naoyuki; Goto, Susumu; Ishihama, Yasushi

    2017-01-01

    Major advancements have recently been made in mass spectrometry-based proteomics, yielding an increasing number of datasets from various proteomics projects worldwide. In order to facilitate the sharing and reuse of promising datasets, it is important to construct appropriate, high-quality public data repositories. jPOSTrepo (https://repository.jpostdb.org/) has successfully implemented several unique features, including high-speed file uploading, flexible file management and easy-to-use interfaces. This repository has been launched as a public repository containing various proteomic datasets and is available for researchers worldwide. In addition, our repository has joined the ProteomeXchange consortium, which includes the most popular public repositories such as PRIDE in Europe for MS/MS datasets and PASSEL for SRM datasets in the USA. Later MassIVE was introduced in the USA and accepted into the ProteomeXchange, as was our repository in July 2016, providing important datasets from Asia/Oceania. Accordingly, this repository thus contributes to a global alliance to share and store all datasets from a wide variety of proteomics experiments. Thus, the repository is expected to become a major repository, particularly for data collected in the Asia/Oceania region. PMID:27899654

  17. Overview of groundwater quality in the Piceance Basin, western Colorado, 1946--2009

    USGS Publications Warehouse

    Thomas, J.C.; McMahon, P.B.

    2013-01-01

    Groundwater-quality data from public and private sources for the period 1946 to 2009 were compiled and put into a common data repository for the Piceance Basin. The data repository is available on the web at http://rmgsc.cr.usgs.gov/cwqdr/Piceance/index.shtml. A subset of groundwater-quality data from the repository was compiled, reviewed, and checked for quality assurance for this report. The resulting dataset consists of the most recently collected sample from 1,545 wells, 1,007 (65 percent) of which were domestic wells. From those samples, the following constituents were selected for presentation in this report: dissolved oxygen, dissolved solids, pH, major ions (chloride, sulfate, fluoride), trace elements (arsenic, barium, iron, manganese, selenium), nitrate, benzene, toluene, ethylbenzene, xylene, methane, and the stable isotopic compositions of water and methane. Some portion of recharge to most of the wells for which data were available was derived from precipitation (most likely snowmelt), as indicated by δ2H[H2O] and δ18O[H2O] values that plot along the Global Meteoric Water Line and near the values for snow samples collected in the study area. Ninety-three percent of the samples were oxic, on the basis of concentrations of dissolved oxygen that were greater than or equal to 0.5 milligrams per liter. Concentration data were compared with primary and secondary drinking-water standards established by the U.S. Environmental Protection Agency. Constituents that exceeded the primary standards were arsenic (13 percent), selenium (9.2 percent), fluoride (8.4 percent), barium (4.1 percent), nitrate (1.6 percent), and benzene (0.6 percent). Concentrations of toluene, xylenes, and ethylbenzene did not exceed standards in any samples. Constituents that exceeded the secondary standards were dissolved solids (72 percent), sulfate (37 percent), manganese (21 percent), iron (16 percent), and chloride (10 percent). Drinking-water standards have not been established for methane, which was detected in 24 percent of samples. Methane concentrations were greater than or equal to 1 milligram per liter in 8.5 percent of samples. Methane isotopic data for samples collected primarily from domestic wells in Garfield County indicate that methane in samples with relatively high methane concentrations was derived from both biogenic and thermogenic sources. Many of the constituents that exceeded standards, such as arsenic, fluoride, iron, and manganese, were derived from rock and sediment in aquifers. Elevated nitrate concentrations were most likely derived from human sources such as fertilizer and human or animal waste. Information about the geologic unit or aquifer in which a well was completed generally was not provided by data sources. However, limited data indicate that Quaternary deposits in Garfield and Mesa Counties, the Wasatch Formation in Garfield County, and the Green River Formation in Rio Blanco County had some of the highest median concentrations of selected constituents. Variations in concentration with depth could not be evaluated because of the general lack of well-depth and water-level data. Concentrations of several important constituents, such as arsenic, manganese, methane, and nitrate, were related to concentrations of dissolved oxygen. Concentrations of arsenic, manganese, and methane were significantly higher in groundwater with low dissolved-oxygen concentrations than in groundwater with high dissolved-oxygen concentrations. In contrast, concentrations of nitrate were significantly higher in groundwater with high dissolved-oxygen concentrations than in groundwater with low dissolved-oxygen concentrations. These results indicate that measurements of dissolved oxygen may be a useful indicator of groundwater vulnerability to some human-derived contaminants and enrichment from some natural constituents. Assessing such a large and diverse dataset as the one available through the repository poses unique challenges for reporting on groundwater quality in the study area. The repository contains data from several studies that differed widely in purpose and scope. In addition to this variability in available data, gaps exist spatially, temporally, and analytically in the repository. For example, groundwater-quality data in the repository were not evenly distributed throughout the study area. Several key water-quality constituents or indicators, such as dissolved oxygen, were underrepresented in the repository. Ancillary information, such as well depth, depth to water, and the geologic unit or aquifer in which a well was completed, was missing for more than 50 percent of samples. Future monitoring could avoid several limitations of the repository by making relatively minor changes to sample-collection and data-reporting protocols. Field measurements of dissolved oxygen could be added to sampling protocols, for example. Information on well construction and the geologic unit or aquifer in which a well was completed should be part of the water-quality dataset. Such changes would increase the comparability of data from different monitoring programs and also add value to each program individually and to the regional dataset as a whole. Other changes to monitoring programs could require greater resources, such as sampling for a basic set of constituents that is relevant to major water-quality issues in the regional study area. Creation of such a dataset for the regional study area would help to provide the kinds of information needed to characterize background conditions and the spatial and temporal variability in constituent concentrations associated with those conditions. Without such information, it is difficult to identify departures from background that might be associated with human activities.
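
    An exceedance summary like the percentages above reduces to comparing each well's most recent sample against the EPA standard; the sketch below uses the familiar primary-standard values for three constituents and invented well data.

```python
# Percentage of wells whose most recent sample exceeds the EPA standard.
import pandas as pd

STANDARDS_MG_L = {"arsenic": 0.010, "fluoride": 4.0, "nitrate": 10.0}

samples = pd.DataFrame({
    "arsenic":  [0.002, 0.015, 0.008, 0.030],  # mg/L, one row per well
    "fluoride": [0.8, 5.2, 1.1, 0.9],
    "nitrate":  [1.2, 0.4, 12.5, 3.3],
})

for constituent, limit in STANDARDS_MG_L.items():
    pct = 100.0 * (samples[constituent] > limit).mean()
    print(f"{constituent}: {pct:.1f}% of wells exceed {limit} mg/L")
```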

  18. Diffusion Dominant Solute Transport Modelling In Deep Repository Under The Effect of Emplacement Media Degradation - 13285

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwong, S.; Jivkov, A.P.

    2013-07-01

    Deep geologic disposal of high-activity and long-lived radioactive waste is being actively considered and pursued in many countries, where low-permeability geological formations are used to provide long-term waste containment with minimum impact to the environment and risk to the biosphere. A multi-barrier approach that makes use of both engineered and natural barriers (i.e. geological formations) is often used to further enhance the containment performance of the repository. As the deep repository system is subject to a variety of thermo-hydro-chemo-mechanical (THCM) effects over its long 'operational' lifespan (e.g. 0.1 to 1.0 million years), the integrity of the barrier system will decrease over time (e.g. fracturing in rock or clay). This is broadly referred to as media degradation in the present study. This modelling study examines the effects of media degradation on diffusion-dominant solute transport in fractured media that are typical of deep geological environments. In particular, reactive solute transport through fractured media is studied using a 2-D model that considers advection and diffusion, to explore the coupled effects of kinetic and equilibrium chemical processes, while the effects of degradation are studied using a pore-network model that considers the media diffusivity and network changes. Model results are presented to demonstrate the use of a 3D pore-network model, using a novel architecture, to calculate macroscopic properties of the medium such as diffusivity, subject to pore-space changes as the media degrade. Results from a reactive transport model of a representative geological waste disposal package are also presented to demonstrate the effect of media property change on solute migration behaviour, illustrating the complex interplay between kinetic biogeochemical processes and diffusion-dominant transport. The initial modelling results demonstrate the feasibility of a coupled modelling approach (using a pore-network model and a reactive transport model) to examine the long-term behaviour of deep geological repositories with media property change under complex geochemical conditions. (authors)
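
    The paper's pore-network architecture is not reproduced here; as a toy analogue of how pore-space changes alter a macroscopic property, the sketch below estimates a relative effective diffusivity by random walks on a partially blocked 2D lattice, with "degradation" represented simply as a lower blocked fraction. All parameters are invented.

```python
# Estimate an effective diffusivity from the mean-squared displacement (MSD)
# of random walkers on a lattice where a fraction of sites is impermeable.
# D = MSD / (2 d t) with d = 2 dimensions; a fully open lattice gives 0.25.
import random

def effective_d(blocked_fraction, walkers=1000, steps=300, size=201, seed=1):
    rng = random.Random(seed)
    c = size // 2
    blocked = [[rng.random() < blocked_fraction for _ in range(size)]
               for _ in range(size)]
    blocked[c][c] = False  # keep the start site open
    msd = 0.0
    for _ in range(walkers):
        x = y = c
        for _ in range(steps):
            dx, dy = rng.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
            nx, ny = x + dx, y + dy
            if 0 <= nx < size and 0 <= ny < size and not blocked[nx][ny]:
                x, y = nx, ny  # moves into blocked sites are rejected
        msd += (x - c) ** 2 + (y - c) ** 2
    return msd / walkers / (4.0 * steps)

for f in (0.3, 0.1, 0.0):  # degradation ~ fewer blocked sites, higher D
    print(f"blocked fraction {f:.1f}: D = {effective_d(f):.3f}")
```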

  19. Thermodynamic Properties of Magnesium Chloride Hydroxide Hydrate (Mg3Cl(OH)5:4H2O, Phase 5), and Its importance to Nuclear Waste Isolation in Geological Repositories in Salt Formations

    NASA Astrophysics Data System (ADS)

    Xiong, Y.; Deng, H.; Nemer, M. B.; Johnsen, S.

    2009-12-01

    MgO (bulk, pure MgO corresponding to the mineral periclase) is the only engineered barrier certified by the Environmental Protection Agency for emplacement in the Waste Isolation Pilot Plant (WIPP) in the US, and an Mg(OH)2-based engineered barrier (bulk, pure Mg(OH)2 corresponding to brucite) is to be employed in the Asse repository in Germany. Both the WIPP and the Asse are located in salt formations. The WIPP is a U.S. Department of Energy geological repository being used for the permanent disposal of defense-related transuranic waste (TRU waste). The repository is 655 m below the surface, and is situated in the Salado Formation, a Permian salt bed mainly composed of halite, with lesser amounts of polyhalite, anhydrite, gypsum, magnesite, clays and quartz. The WIPP Generic Weep Brine (GWB), a Na-Mg-Cl dominated brine, is associated with the Salado Formation. The previous vendor of MgO for the WIPP was Premier Chemicals and the current vendor is Martin Marietta Materials. Experimental studies of both Premier MgO and Martin Marietta MgO with the GWB at SNL indicate the formation of magnesium chloride hydroxide hydrate, Mg3Cl(OH)5:4H2O, termed phase 5. However, this important phase is lacking in the existing thermodynamic database. In this study, the solubility constant of phase 5 is determined from a series of solubility experiments in MgCl2-NaCl solutions. The solubility constant (as log K) at 25 °C for the reaction Mg3Cl(OH)5:4H2O + 5H+ = 3Mg2+ + 9H2O(l) + Cl- is recommended as 43.21±0.33 (2σ), based on the Specific Interaction Theory (SIT) model for extrapolation to infinite dilution. The log K obtained via the Pitzer equations is identical to the above value within the quoted uncertainty. The Gibbs free energy and enthalpy of formation for phase 5 at 25 °C are derived as -3384±2 (2σ) kJ mol-1 and -3896±6 (2σ) kJ mol-1, respectively. The standard entropy and heat capacity of phase 5 at 25 °C are estimated as 393±20 J mol-1 K-1 and 374±19 J mol-1 K-1, respectively. Phase 5, and the similar phase 3 (Mg2Cl(OH)3:4H2O), could have a significant role in influencing the geochemical conditions in geological repositories for nuclear waste in salt formations where MgO or brucite is employed as an engineered barrier, when Na-Mg-Cl dominated brines react with MgO or brucite. Based on our solubility constant for phase 5 in combination with the literature value for phase 3, we predict that the composition at the invariant point of phase 5 and phase 3 would be mMg = 1.70 and pmH = 8.93 in the Mg-Cl binary system. The recent WIPP Compliance Recertification Application PA Baseline Calculations indicate that phase 5 instead of phase 3 is indeed a stable phase when GWB equilibrates with actinide-source-term phases, brucite, magnesium carbonates, halite and anhydrite. 1. This research is funded by WIPP programs administered by the U.S. Department of Energy. 2. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.
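
    The reported solubility constant ties directly to a reaction Gibbs energy through the standard relation ΔG°r = -RT ln K; the quick check below reproduces only that reaction energy (deriving the ΔG°f of phase 5 quoted above additionally requires formation values for Mg2+, H2O(l), Cl- and H+, which are not restated here).

```python
# Reaction Gibbs energy from the recommended log K at 25 °C for
# Mg3Cl(OH)5:4H2O + 5H+ = 3Mg2+ + 9H2O(l) + Cl-
import math

R = 8.314462618   # gas constant, J / (mol K)
T = 298.15        # 25 °C in kelvin
log10_k = 43.21   # recommended value (SIT extrapolation)

dg_r = -R * T * math.log(10) * log10_k  # ΔG°r = -RT ln K, in J/mol
print(f"reaction Gibbs energy = {dg_r / 1000:.1f} kJ/mol")  # about -246.6
```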

  20. Digital Rocks Portal: Preservation, Sharing, Remote Visualization and Automated Analysis of Imaged Datasets

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Ketcham, R. A.; Hanlon, M.; Pettengill, M.; Ranganath, A.; Venkatesh, A.

    2016-12-01

    Due to advances in imaging modalities such as X-ray microtomography and scanning electron microscopy, 2D and 3D imaged datasets of rock microstructure on nanometer to centimeter length scales allow investigation of nonlinear flow and mechanical phenomena using numerical approaches. This in turn produces various upscaled parameters required by subsurface flow and deformation simulators. However, a single research group typically specializes in one imaging modality and/or related modeling on a single length scale, and the lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (http://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of geosciences or engineering researchers not necessarily trained in computer science or data analysis. Our objective is to enable scientific inquiry and engineering decisions founded on a data-driven basis. We show how the data loaded into the portal can be documented, referenced in publications via digital object identifiers, visualized, and linked to other repositories. We then show preliminary results on integrating a remote parallel visualization and flow simulation workflow with the pore structures currently stored in the repository. We finally discuss the issues of collecting correct metadata, data discoverability, and repository sustainability. This is the first repository for this particular data, but it is part of the wider ecosystem of geoscience data and model cyberinfrastructure called EarthCube (http://earthcube.org/), sponsored by the National Science Foundation. For data sustainability and continuous access, the portal is implemented within the reliable, 24/7-maintained High Performance Computing infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyberinfrastructure initiative.

  1. The NSF Arctic Data Center: Leveraging the DataONE Federation to Build a Sustainable Archive for the NSF Arctic Research Community

    NASA Astrophysics Data System (ADS)

    Budden, A. E.; Arzayus, K. M.; Baker-Yeboah, S.; Casey, K. S.; Dozier, J.; Jones, C. S.; Jones, M. B.; Schildhauer, M.; Walker, L.

    2016-12-01

    The newly established NSF Arctic Data Center plays a critical support role in archiving and curating the data and software generated by Arctic researchers from diverse disciplines. The Arctic community, comprising Earth science, archaeology, geography, anthropology, and other social science researchers, is supported through data curation services and domain-agnostic tools and infrastructure, ensuring data are accessible in the most transparent and usable way possible. This interoperability across diverse disciplines within the Arctic community facilitates collaborative research and is mirrored by interoperability between the Arctic Data Center infrastructure and other large-scale cyberinfrastructure initiatives. The Arctic Data Center leverages the DataONE federation to standardize access to and replication of data and metadata to other repositories, specifically NOAA's National Centers for Environmental Information (NCEI). This approach promotes long-term preservation of the data and metadata, as well as opening the door for other data repositories to leverage this replication infrastructure with NCEI and other DataONE member repositories. The Arctic Data Center uses rich, detailed metadata following widely recognized standards. In particular, measurement-level and provenance metadata provide scientists the details necessary to integrate datasets across studies and across repositories while enabling a full understanding of the provenance of data used in the system. The Arctic Data Center gains this deep metadata and provenance support by simply adopting DataONE services, which results in significant efficiency gains by eliminating the need to develop systems de novo. Similarly, the advanced search tool developed by the Knowledge Network for Biocomplexity and extended for data submission by the Arctic Data Center can be used by other DataONE-compliant repositories without further development. By standardizing interfaces and leveraging the DataONE federation, the Arctic Data Center has advanced rapidly and can itself contribute to raising the capabilities of all members of the federation.

  2. PGP repository: a plant phenomics and genomics data publication infrastructure

    PubMed Central

    Arend, Daniel; Junker, Astrid; Scholz, Uwe; Schüler, Danuta; Wylie, Juliane; Lange, Matthias

    2016-01-01

    Plant genomics and phenomics represent the most promising tools for accelerating yield gains and overcoming emerging crop productivity bottlenecks. However, accessing this wealth of plant diversity requires the characterization of this material using state-of-the-art genomic, phenomic and molecular technologies and the release of the subsequent research data via a long-term stable, open-access portal. Although several international consortia and public resource centres offer services for plant research data management, valuable digital assets remain unpublished and thus inaccessible to the scientific community. Recently, the Leibniz Institute of Plant Genetics and Crop Plant Research and the German Plant Phenotyping Network have jointly initiated the Plant Genomics and Phenomics Research Data Repository (PGP) as an infrastructure to comprehensively publish plant research data. This covers in particular cross-domain datasets that are not published in central repositories because of their volume or unsupported data scope, such as image collections from plant phenotyping and microscopy, unfinished genomes, genotyping data, visualizations of morphological plant models, data from mass spectrometry, as well as software and documents. The repository is hosted at the Leibniz Institute of Plant Genetics and Crop Plant Research, using e!DAL as software infrastructure and a Hierarchical Storage Management system as the data archival backend. A newly developed data submission tool was made available to the consortium that features a high level of automation to lower the barriers to data publication. After an internal review process, data are published with citable digital object identifiers, and a core set of technical metadata is registered at DataCite. The e!DAL-embedded Web frontend generates a landing page for each dataset and supports interactive exploration. PGP is registered as a research data repository at BioSharing.org, re3data.org and OpenAIRE as a valid EU Horizon 2020 open data archive. The above features, the programmatic interface and the support of standard metadata formats enable PGP to fulfil the FAIR data principles—findable, accessible, interoperable, reusable. Database URL: http://edal.ipk-gatersleben.de/repos/pgp/ PMID:27087305
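
    As an illustration of the kind of core technical metadata a repository registers when minting citable DOIs at DataCite, the following minimal Python sketch assembles the mandatory DataCite fields for a dataset; the record values and the 10.5072 test prefix are placeholders for illustration, not an actual PGP record or the PGP submission API.

        import json

        # Minimal sketch of a DataCite-style metadata record for a hypothetical
        # plant-phenotyping dataset; the field names follow the mandatory
        # properties of the DataCite metadata schema, all values are invented.
        record = {
            "identifier": {"identifierType": "DOI", "identifier": "10.5072/example-0001"},
            "creators": [{"creatorName": "Doe, Jane"}],
            "titles": [{"title": "Example image collection from a phenotyping experiment"}],
            "publisher": "Example Research Data Repository",
            "publicationYear": "2016",
            "resourceType": {"resourceTypeGeneral": "Dataset", "resourceType": "Image collection"},
        }

        # Serialized form such as a repository might register during DOI minting.
        print(json.dumps(record, indent=2))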

  3. Organizing the present, looking to the future: an online knowledge repository to facilitate collaboration.

    PubMed

    Burchill, C; Roos, L L; Fergusson, P; Jebamani, L; Turner, K; Dueck, S

    2000-01-01

    Comprehensive data available in the Canadian province of Manitoba since 1970 have aided study of the interaction between population health, health care utilization, and structural features of the health care system. Given a complex linked database and many ongoing projects, better organization of available epidemiological, institutional, and technical information was needed. The Manitoba Centre for Health Policy and Evaluation wished to develop a knowledge repository to handle data, document research methods, and facilitate both internal communication and collaboration with other sites. This evolving knowledge repository consists of both public and internal (restricted access) pages on the World Wide Web (WWW). Information can be accessed using an indexed logical format or queried to allow entry at user-defined points. The main topics are: Concept Dictionary, Research Definitions, Meta-Index, and Glossary. The Concept Dictionary operationalizes concepts used in health research using administrative data, outlining the creation of complex variables. Research Definitions specify the codes for common surgical procedures, tests, and diagnoses. The Meta-Index organizes concepts and definitions according to the Medical Subject Headings (MeSH) system developed by the National Library of Medicine. The Glossary facilitates navigation through the research terms and abbreviations in the knowledge repository. An Education Resources heading presents a web-based graduate course using substantial amounts of material in the Concept Dictionary, a lecture in the Epidemiology Supercourse, and material for Manitoba's Regional Health Authorities. Confidential information (including Data Dictionaries) is available on the Centre's internal website. Use of the public pages has increased dramatically since January 1998, with almost 6,000 page hits from 250 different hosts in May 1999. More recently, the number of page hits has averaged around 4,000 per month, while the number of unique hosts has climbed to around 400. This knowledge repository promotes standardization and increases efficiency by placing concepts and associated programming in the Centre's collective memory. Collaboration and project management are facilitated.

  4. Organizing the Present, Looking to the Future: An Online Knowledge Repository to Facilitate Collaboration

    PubMed Central

    Burchill, Charles; Fergusson, Patricia; Jebamani, Laurel; Turner, Ken; Dueck, Stephen

    2000-01-01

    Background Comprehensive data available in the Canadian province of Manitoba since 1970 have aided study of the interaction between population health, health care utilization, and structural features of the health care system. Given a complex linked database and many ongoing projects, better organization of available epidemiological, institutional, and technical information was needed. Objective The Manitoba Centre for Health Policy and Evaluation wished to develop a knowledge repository to handle data, document research methods, and facilitate both internal communication and collaboration with other sites. Methods This evolving knowledge repository consists of both public and internal (restricted access) pages on the World Wide Web (WWW). Information can be accessed using an indexed logical format or queried to allow entry at user-defined points. The main topics are: Concept Dictionary, Research Definitions, Meta-Index, and Glossary. The Concept Dictionary operationalizes concepts used in health research using administrative data, outlining the creation of complex variables. Research Definitions specify the codes for common surgical procedures, tests, and diagnoses. The Meta-Index organizes concepts and definitions according to the Medical Subject Headings (MeSH) system developed by the National Library of Medicine. The Glossary facilitates navigation through the research terms and abbreviations in the knowledge repository. An Education Resources heading presents a web-based graduate course using substantial amounts of material in the Concept Dictionary, a lecture in the Epidemiology Supercourse, and material for Manitoba's Regional Health Authorities. Confidential information (including Data Dictionaries) is available on the Centre's internal website. Results Use of the public pages has increased dramatically since January 1998, with almost 6,000 page hits from 250 different hosts in May 1999. More recently, the number of page hits has averaged around 4,000 per month, while the number of unique hosts has climbed to around 400. Conclusions This knowledge repository promotes standardization and increases efficiency by placing concepts and associated programming in the Centre's collective memory. Collaboration and project management are facilitated. PMID:11720929

  5. Academic Research Library as Broker in Addressing Interoperability Challenges for the Geosciences

    NASA Astrophysics Data System (ADS)

    Smith, P., II

    2015-12-01

    Data capture is an important process in the research lifecycle. Complete descriptive and representative information about the data or database is necessary during data collection, whether in the field or in the research lab. The National Science Foundation's (NSF) Public Access Plan (2015) mandates that federally funded projects make their research data more openly available. Developing, implementing, and integrating metadata workflows into the research process of the data lifecycle facilitates improved data access while also addressing interoperability challenges for the geosciences, such as data description and representation. Lack of metadata or data curation can contribute to (1) semantic, (2) ontology, and (3) data integration issues within and across disciplinary domains and projects. Some researchers of EarthCube-funded projects have identified these issues as gaps. These gaps can contribute to interoperability, data access, discovery, and integration issues between domain-specific and general data repositories. Academic research libraries have expertise in providing long-term discovery and access through the use of metadata standards and the provision of access to research data, datasets, and publications via institutional repositories. Metadata crosswalks, open archival information systems (OAIS), trusted repositories, the Data Seal of Approval, persistent URLs, and the linking of data, objects, resources, and publications in institutional repositories and digital content management systems are common components in the library discipline. These components contribute to a library perspective on data access and discovery that can benefit the geosciences. The USGS Community for Data Integration (CDI) has developed the Science Support Framework (SSF) for data management and integration within its community of practice, contributing to improved understanding of the Earth's physical and biological systems. The USGS CDI SSF can be used as a reference model to map to EarthCube-funded projects, with academic research libraries facilitating the data and information assets components of the USGS CDI SSF via institutional repositories and/or digital content management. This session will explore the USGS CDI SSF for cross-discipline collaboration considerations from a library perspective.

  6. Virtual patient repositories--a comparative analysis.

    PubMed

    Küfner, Julia; Kononowicz, Andrzej A; Hege, Inga

    2014-01-01

    Virtual Patients (VPs) are an important component of medical education. One way to reduce the costs of creating VPs is sharing through repositories. We conducted a literature review to identify existing repositories and analyzed the 17 included repositories with regard to the search functions and metadata they provide. Most repositories provided some metadata, such as title or description, whereas other data, such as educational objectives, were less frequent. Future research could, in cooperation with the repository providers, investigate user expectations and usage patterns.

  7. The Function Biomedical Informatics Research Network Data Repository.

    PubMed

    Keator, David B; van Erp, Theo G M; Turner, Jessica A; Glover, Gary H; Mueller, Bryon A; Liu, Thomas T; Voyvodic, James T; Rasmussen, Jerod; Calhoun, Vince D; Lee, Hyo Jong; Toga, Arthur W; McEwen, Sarah; Ford, Judith M; Mathalon, Daniel H; Diaz, Michele; O'Leary, Daniel S; Jeremy Bockholt, H; Gadde, Syam; Preda, Adrian; Wible, Cynthia G; Stern, Hal S; Belger, Aysenil; McCarthy, Gregory; Ozyurt, Burak; Potkin, Steven G

    2016-01-01

    The Function Biomedical Informatics Research Network (FBIRN) developed methods and tools for conducting multi-scanner functional magnetic resonance imaging (fMRI) studies. Method and tool development were based on two major goals: 1) to assess the major sources of variation in fMRI studies conducted across scanners, including instrumentation, acquisition protocols, challenge tasks, and analysis methods, and 2) to provide a distributed network infrastructure and an associated federated database to host and query large, multi-site, fMRI and clinical data sets. In the process of achieving these goals the FBIRN test bed generated several multi-scanner brain imaging data sets to be shared with the wider scientific community via the BIRN Data Repository (BDR). The FBIRN Phase 1 data set consists of a traveling subject study of 5 healthy subjects, each scanned on 10 different 1.5 to 4 T scanners. The FBIRN Phase 2 and Phase 3 data sets consist of subjects with schizophrenia or schizoaffective disorder along with healthy comparison subjects scanned at multiple sites. In this paper, we provide concise descriptions of FBIRN's multi-scanner brain imaging data sets and details about the BIRN Data Repository instance of the Human Imaging Database (HID) used to publicly share the data. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    PubMed

    Springate, David A; Kontopantelis, Evangelos; Ashcroft, Darren M; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David

    2014-01-01

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

  9. Semantic Web repositories for genomics data using the eXframe platform.

    PubMed

    Merrill, Emily; Corlosquet, Stéphane; Ciccarese, Paolo; Clark, Tim; Das, Sudeshna

    2014-01-01

    With the advent of inexpensive assay technologies, there has been an unprecedented growth in genomics data as well as in the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse or integration with other knowledge bases very difficult. To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second-generation model now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in SPARQL Protocol and RDF Query Language (SPARQL) endpoint. Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate them with heterogeneous resources and make them interoperable with the vast Semantic Web of biomedical knowledge.
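
    To make the mapping idea concrete, here is a minimal sketch, using the Python rdflib library, of how experiment annotations can be published as RDF triples; the namespace and property names are hypothetical placeholders, not the actual eXframe experiment model.

        from rdflib import Graph, Literal, Namespace, URIRef
        from rdflib.namespace import DC, RDF

        # Hypothetical namespace standing in for the biomedical ontologies
        # to which an experiment model could be mapped.
        EX = Namespace("http://example.org/genomics#")

        g = Graph()
        experiment = URIRef("http://example.org/experiments/42")

        # Describe one experiment: its type, title, and an assay it used.
        g.add((experiment, RDF.type, EX.GenomicsExperiment))
        g.add((experiment, DC.title, Literal("Example expression profiling study")))
        g.add((experiment, EX.usesAssay, EX.MicroarrayAssay))

        # Serialize as Turtle; in a deployment such triples would be indexed
        # in an RDF store exposing a SPARQL endpoint.
        print(g.serialize(format="turtle"))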

  10. ClinicalCodes: An Online Clinical Codes Repository to Improve the Validity and Reproducibility of Research Using Electronic Medical Records

    PubMed Central

    Springate, David A.; Kontopantelis, Evangelos; Ashcroft, Darren M.; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David

    2014-01-01

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects. PMID:24941260

  11. Sequence Resources at MaizeGDB with Emphasis on POPcorn: A Project Portal for Corn

    USDA-ARS?s Scientific Manuscript database

    MaizeGDB is the maize research community’s centralized, long-term repository for genetic and genomic information about the crop plant and model organism Zea mays ssp. mays. The MaizeGDB team endeavors to meet the needs of the maize research community based on feedback and guidance. Recent work has f...

  12. Natural analog studies: Licensing perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradbury, J.W.

    1995-09-01

    This report describes the licensing perspective on the term "natural analog studies" as used in 10 CFR Part 60. It describes the misunderstandings related to its definition that have become evident during discussions at U.S. Nuclear Regulatory Commission meetings and tries to clarify the appropriate applications of natural analog studies to aspects of repository site characterization.

  13. Patterns of Learning Object Reuse in the Connexions Repository

    ERIC Educational Resources Information Center

    Duncan, S. M.

    2009-01-01

    Since the term "learning object" was first published, there has been either an explicit or implicit expectation of reuse. There has also been a lot of speculation about why learning objects are, or are not, reused. This study quantitatively examined the actual amount and type of learning object use, to include reuse, modification, and translation,…

  14. The Open Spectral Database: an open platform for sharing and searching spectral data.

    PubMed

    Chalk, Stuart J

    2016-01-01

    A number of websites make spectral data available for download (typically as JCAMP-DX text files), and one (ChemSpider) also allows users to contribute spectral files. As a result, searching and retrieving such spectral data can be time consuming, and the data can be difficult to reuse if they are compressed in the JCAMP-DX file. What is needed is a single resource that allows submission of JCAMP-DX files, export of the raw data in multiple formats, and searching based on multiple chemical identifiers, and that is open in terms of license and access. To address these issues a new online resource called the Open Spectral Database (OSDB) http://osdb.info/ has been developed and is now available. Built using open source tools, using open code (hosted on GitHub), providing open data, and open to community input about design and functionality, the OSDB is available for anyone to submit spectral data, making it searchable and available to the scientific community. This paper details the concept and coding, internal architecture, export formats, Representational State Transfer (REST) Application Programming Interface (API) and options for submission of data. The OSDB website went live in November 2015. Concurrently, the GitHub repository was made available at https://github.com/stuchalk/OSDB/, and is open for collaborators to join the project, submit issues, and contribute code. The combination of a scripting environment (PhpStorm), a PHP framework (CakePHP), a relational database (MySQL) and a code repository (GitHub) provides all the capabilities needed to easily develop REST-based websites for the ingestion, curation and exposure of open chemical data to the community at all levels. It is hoped this software stack (or equivalent ones in other scripting languages) will be leveraged to make more chemical data available for both humans and computers.
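
    As a sketch of how a REST-style spectral database can be queried programmatically, the Python snippet below issues an HTTP GET for one spectrum in a chosen export format; the endpoint path and parameters are hypothetical, not the documented OSDB API.

        import requests

        # Hypothetical REST endpoint of an OSDB-like service; the real API
        # routes are documented on the project site and may differ.
        BASE_URL = "http://osdb.info/api"

        def fetch_spectrum(spectrum_id, fmt="jcamp"):
            """Retrieve one spectrum in the requested export format."""
            response = requests.get(
                f"{BASE_URL}/spectra/{spectrum_id}",
                params={"format": fmt},
                timeout=30,
            )
            response.raise_for_status()
            return response.text

        # Print the first few lines of the (hypothetical) JCAMP-DX file.
        print("\n".join(fetch_spectrum(1).splitlines()[:5]))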

  15. nmrML: A Community Supported Open Data Standard for the Description, Storage, and Exchange of NMR Data.

    PubMed

    Schober, Daniel; Jacob, Daniel; Wilson, Michael; Cruz, Joseph A; Marcu, Ana; Grant, Jason R; Moing, Annick; Deborde, Catherine; de Figueiredo, Luis F; Haug, Kenneth; Rocca-Serra, Philippe; Easton, John; Ebbels, Timothy M D; Hao, Jie; Ludwig, Christian; Günther, Ulrich L; Rosato, Antonio; Klein, Matthias S; Lewis, Ian A; Luchinat, Claudio; Jones, Andrew R; Grauslys, Arturas; Larralde, Martin; Yokochi, Masashi; Kobayashi, Naohiro; Porzel, Andrea; Griffin, Julian L; Viant, Mark R; Wishart, David S; Steinbeck, Christoph; Salek, Reza M; Neumann, Steffen

    2018-01-02

    NMR is a widely used analytical technique with a growing number of repositories available. As a result, demands for a vendor-agnostic, open data format for long-term archiving of NMR data have emerged with the aim to ease and encourage sharing, comparison, and reuse of NMR data. Here we present nmrML, an open XML-based exchange and storage format for NMR spectral data. The nmrML format is intended to be fully compatible with existing NMR data for chemical, biochemical, and metabolomics experiments. nmrML can capture raw NMR data, spectral data acquisition parameters, and where available spectral metadata, such as chemical structures associated with spectral assignments. The nmrML format is compatible with pure-compound NMR data for reference spectral libraries as well as NMR data from complex biomixtures, i.e., metabolomics experiments. To facilitate format conversions, we provide nmrML converters for Bruker, JEOL and Agilent/Varian vendor formats. In addition, easy-to-use Web-based spectral viewing, processing, and spectral assignment tools that read and write nmrML have been developed. Software libraries and Web services for data validation are available for tool developers and end-users. The nmrML format has already been adopted for capturing and disseminating NMR data for small molecules by several open source data processing tools and metabolomics reference spectral libraries, e.g., serving as storage format for the MetaboLights data repository. The nmrML open access data standard has been endorsed by the Metabolomics Standards Initiative (MSI), and we here encourage user participation and feedback to increase usability and make it a successful standard.
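
    To give a flavor of how an XML-based exchange format such as nmrML can be consumed by tools, here is a minimal sketch that parses a toy XML fragment with Python's standard library; the element and attribute names are simplified placeholders, not the actual nmrML schema.

        import xml.etree.ElementTree as ET

        # Toy fragment loosely inspired by an XML-based NMR exchange format;
        # the tag names are placeholders, not the nmrML schema.
        DOC = """
        <nmrExperiment>
          <acquisitionParameters fieldStrengthTesla="14.1" solvent="D2O"/>
          <assignment shiftPpm="3.71" label="alanine CH"/>
          <assignment shiftPpm="1.47" label="alanine CH3"/>
        </nmrExperiment>
        """

        root = ET.fromstring(DOC)
        params = root.find("acquisitionParameters")
        print("Field strength (T):", params.get("fieldStrengthTesla"))
        for a in root.iter("assignment"):
            print(a.get("label"), "at", a.get("shiftPpm"), "ppm")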

  16. Intrinsic and Carrier Colloid-facilitated transport of lanthanides through discrete fractures in chalk

    NASA Astrophysics Data System (ADS)

    Weisbrod, N.; Tran, E. L.; Klein-BenDavid, O.; Teutsch, N.

    2015-12-01

    Geological disposal of high-level radioactive waste is the long-term solution for the disposal of long-lived radionuclides and spent fuel. However, some radionuclides might be released from these repositories into the subsurface as a result of leakage and ultimately make their way into groundwater. Engineered bentonite barriers around nuclear waste repositories are generally considered sufficient to impede the transport of radionuclides from their source to the groundwater. However, colloidal-sized mobile bentonite particles ("carrier" colloids) originating from these barriers have come under investigation as a potential transport vector for radionuclides sorbed to them. As lanthanides are generally accepted to have the same chemical behavior as their more toxic actinide counterparts, they are considered an acceptable substitute for research on radionuclide transport. This study aims to evaluate the transport behavior of lanthanides in colloid-facilitated transport through a fractured chalk matrix under geochemical conditions representative of the Negev desert, Israel. The migration of Ce both with and without colloidal particles was explored and compared to the migration of a conservative tracer (bromide) using a flow system constructed around a naturally fractured chalk core. Results suggest that the mobility of Ce as a solute is negligible. In experiments conducted without bentonite colloids, the 1% of the Ce that was recovered migrated as "intrinsic" colloids in the form of carbonate precipitates. However, the total recovery of the Ce increased to 9% when it was injected into the core in the presence of bentonite colloids and to 13% when both bentonite and precipitate colloids were injected. This indicates that lanthanides are essentially immobile in chalk as a solute but may be mobile as carbonate precipitates. Bentonite colloids, however, markedly increase the mobility of lanthanides through fractured chalk matrices.

  17. Learning Object Repositories

    ERIC Educational Resources Information Center

    Lehman, Rosemary

    2007-01-01

    This chapter looks at the development and nature of learning objects, meta-tagging standards and taxonomies, learning object repositories, learning object repository characteristics, and types of learning object repositories, with type examples. (Contains 1 table.)

  18. Evaluated teletherapy source library

    DOEpatents

    Cox, Lawrence J.; Schach Von Wittenau, Alexis E.

    2000-01-01

    The Evaluated Teletherapy Source Library (ETSL) is a system of hardware and software that provides for maintenance of a library of useful phase space descriptions (PSDs) of teletherapy sources used in radiation therapy for cancer treatment. The PSDs are designed to be used by PEREGRINE, the all-particle Monte Carlo dose calculation system. ETSL also stores other relevant information such as monitor unit factors (MUFs) for use with the PSDs, results of PEREGRINE calculations using the PSDs, clinical calibration measurements, and geometry descriptions sufficient for calculational purposes. Not all of this information is directly needed by PEREGRINE. ETSL is also capable of acting as a repository for the Monte Carlo simulation history files from which the generic PSDs are derived.

  19. Transportation plan repository and archive.

    DOT National Transportation Integrated Search

    2011-04-01

    This project created a repository and archive for transportation planning documents in Texas within the established Texas A&M Repository (http://digital.library.tamu.edu). This transportation planning archive and repository provides ready access ...

  20. Co-production of Health enabled by next generation personal health systems.

    PubMed

    Boye, Niels

    2012-01-01

    This paper describes the theoretical principles for the establishment of a parallel and complementary modality of healthcare delivery, named Co-production of Health (CpH). This service model draws on digital data, information, and knowledge about health, healthy choices, and the individual's health state, and computes context-aware communication and advice through personalized models. "Lightweight technologies" (smartphones, tablets, application stores) would serve as the technology close to the end-users (citizens, patients, clients, customers), connecting them with "big data" in conventionally and non-conventionally organized data repositories. The CpH modality aims at providing synergies between professional healthcare, self-care and informal care, and provides data fusion from several sources, such as health characteristics of consumer goods, sensors, actuators, and health-related data repositories, turning this into "health added value" for the individual. A theoretical business model respecting healthcare values, ethics, and legal foundations is also sketched out.

  1. Introducing the Brassica Information Portal: Towards integrating genotypic and phenotypic Brassica crop data

    PubMed Central

    Eckes, Annemarie H.; Gubała, Tomasz; Nowakowski, Piotr; Szymczyszyn, Tomasz; Wells, Rachel; Irwin, Judith A.; Horro, Carlos; Hancock, John M.; King, Graham; Dyer, Sarah C.; Jurkowski, Wiktor

    2017-01-01

    The Brassica Information Portal (BIP) is a centralised repository for brassica phenotypic data. The site hosts trait data associated with brassica research and breeding experiments conducted on brassica crops, which are used as oilseeds, vegetables, livestock forage and fodder, and for biofuels. A key feature is the explicit management of metadata describing the provenance of, and relationships between, experimental plant materials, as well as trial design and trait descriptors. BIP is an open access and open source project, built on the schema of CropStoreDB, and as such can provide trait data management strategies for any crop data. A new user interface and programmatic submission/retrieval system help to simplify data access for researchers, breeders and other end-users. BIP opens up the opportunity to apply integrative, cross-project analyses to data generated by the Brassica Research Community. Here, we present a short description of the current status of the repository. PMID:28529710

  2. The NIH BD2K center for big data in translational genomics

    PubMed Central

    Paten, Benedict; Diekhans, Mark; Druker, Brian J; Friend, Stephen; Guinney, Justin; Gassner, Nadine; Guttman, Mitchell; James Kent, W; Mantey, Patrick; Margolin, Adam A; Massie, Matt; Novak, Adam M; Nothaft, Frank; Pachter, Lior; Patterson, David; Smuga-Otto, Maciej; Stuart, Joshua M; Van’t Veer, Laura; Haussler, David

    2015-01-01

    The world’s genomics data will never be stored in a single repository – rather, it will be distributed among many sites in many countries. No one site will have enough data to explain genotype to phenotype relationships in rare diseases; therefore, sites must share data. To accomplish this, the genetics community must forge common standards and protocols to make sharing and computing data among many sites a seamless activity. Through the Global Alliance for Genomics and Health, we are pioneering the development of shared application programming interfaces (APIs) to connect the world’s genome repositories. In parallel, we are developing an open source software stack (ADAM) that uses these APIs. This combination will create a cohesive genome informatics ecosystem. Using containers, we are facilitating the deployment of this software in a diverse array of environments. Through benchmarking efforts and big data driver projects, we are ensuring ADAM’s performance and utility. PMID:26174866

  3. Use of strategic environmental assessment in the site selection process for a radioactive waste disposal facility in Slovenia.

    PubMed

    Dermol, Urška; Kontić, Branko

    2011-01-01

    The benefits of strategic environmental considerations in the process of siting a repository for low- and intermediate-level radioactive waste (LILW) are presented. The benefits have been explored by analyzing differences between the two site selection processes. One is a so-called official site selection process, which is implemented by the Agency for radwaste management (ARAO); the other is an optimization process suggested by experts working in the area of environmental impact assessment (EIA) and land-use (spatial) planning. The criteria on which the comparison of the results of the two site selection processes has been based are spatial organization, environmental impact, safety in terms of potential exposure of the population to radioactivity released from the repository, and feasibility of the repository from the technical, financial/economic and social point of view (the latter relates to consent by the local community for siting the repository). The site selection processes have been compared with the support of the decision expert system named DEX. The results of the comparison indicate that the sites selected by ARAO meet fewer suitability criteria than those identified by applying strategic environmental considerations in the framework of the optimization process. This result stands when taking into account spatial, environmental, safety and technical feasibility points of view. Acceptability of a site by a local community could not have been tested, since the formal site selection process has not yet been concluded; this remains as an uncertain and open point of the comparison. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. SeaView: bringing EarthCube to the Oceanographer

    NASA Astrophysics Data System (ADS)

    Stocks, K. I.; Diggs, S. C.; Arko, R. A.; Kinkade, D.; Shepherd, A.

    2016-12-01

    As new instrument types are developed and new observational programs start that support a growing community of "dry" oceanographers, the ability to find, access, and visualize existing data of interest becomes increasingly critical. Yet ocean data, when available, are held in multiple data facilities, in different formats, and accessible through different pathways. This creates practical problems with integrating and working across different data sets. The SeaView project is building connections between the rich data resources in five major oceanographic data facilities - BCO-DMO, CCHDO, OBIS, OOI, and R2R* - creating a federated set of thematic data collections that are organized around common characteristics (geographic location, time, expedition, program, data type, etc.) and published online in Web Accessible Folders using standard file formats such as ODV and NetCDF. The work includes not simply reformatting data, but identifying and, where possible, addressing interoperability challenges: which common identifiers for core concepts can connect data across repositories; which terms a scientist may want to search that, if added to the data repositories, will increase discoverability; the presence of duplicate data across repositories; etc. We will present the data collections available to date, including data from the OOI Pioneer Array region, and seek scientists' input on the data types and formats they prefer, the tools they use to analyze and visualize data, and their specific recommendations for future data collections to support oceanographic science. * Biological and Chemical Oceanography Data Management Office (BCO-DMO), CLIVAR and Carbon Hydrographic Data Office (CCHDO), International Ocean Biogeographic Information System (iOBIS), Ocean Observatories Initiative (OOI), and Rolling Deck to Repository (R2R) Program.

  5. Enthalpies of formation of polyhalite: A mineral relevant to salt repository

    DOE PAGES

    Guo, Xiaofeng; Xu, Hongwu

    2017-06-02

    Polyhalite is an important mineral coexisting with halite in salt repositories for nuclear waste disposal, such as the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. The thermal stability of this mineral is key knowledge for evaluating the long-term integrity of a salt repository, as water may be released by the thermal decomposition of polyhalite. Previous studies on the structural evolution of polyhalite at elevated temperatures laid the basis for detailed calorimetric measurements. Using high-temperature oxide-melt drop-solution calorimetry at 975 K with sodium molybdate as the solvent, we have determined the standard enthalpies of formation from constituent sulfates (ΔH°f,sul), oxides (ΔH°f,ox) and elements (ΔH°f,ele) of a polyhalite sample with the composition K2Ca2Mg(SO4)4·1.95H2O from the Salado formation at the WIPP site. The obtained results are: ΔH°f,sul = -152.5 ± 5.3 kJ/mol, ΔH°f,ox = -1926.1 ± 10.5 kJ/mol, and ΔH°f,ele = -6301.2 ± 9.9 kJ/mol. Furthermore, based on the estimated formation entropies of polyhalite, its standard Gibbs free energy of formation has been derived to be in the range of -5715.3 ± 9.9 kJ/mol to -5739.3 ± 9.9 kJ/mol. In conclusion, these determined thermodynamic properties provide fundamental parameters for modeling the stability behavior of polyhalite in salt repositories.
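
    The quoted Gibbs free energy range follows directly from the standard relation between enthalpy and entropy of formation. As a back-of-the-envelope check, assuming a reference temperature of T = 298.15 K (the abstract does not state the temperature explicitly):

        \Delta G^{\circ}_{f,ele} = \Delta H^{\circ}_{f,ele} - T\,\Delta S^{\circ}_{f,ele},
        T\,\Delta S^{\circ}_{f,ele} = \Delta H^{\circ}_{f,ele} - \Delta G^{\circ}_{f,ele}
                                    = -6301.2 - (-5739.3 \text{ to } -5715.3)
                                    = -561.9 \text{ to } -585.9 \ \mathrm{kJ/mol},

    which implies a formation entropy from the elements of roughly -1885 to -1965 J/(mol K), a plausible magnitude for a hydrated sulfate formed from elements that include gaseous oxygen.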

  6. Soldier System Power Sources

    DTIC Science & Technology

    2006-12-31

    dependence, and estimated mass of the stack. The model equations were derived from peer-reviewed academic journals, internal studies, and texts on the subject... Liu, R. Dougal, E. Solodovnik, "VTB-Based Design of a Standalone Photovoltaic Power System", International Journal of Green Energy, Vol. 1, No. 3... Powered battery chargers; exergy minimization; use of secondary cells as temporary energy repositories; design an automatic energy optimization

  7. Dynamics of Chemical Degradation in Water Using Photocatalytic Reactions in an Ultraviolet Light Emitting Diode Reactor

    DTIC Science & Technology

    2017-09-14

    one such study, AOPs were investigated for the removal of organophosphorus pesticides in wastewater by selecting and optimizing oxidation processes... micropollutants (primarily pharmaceuticals, personal care products, and pesticides) in four different river water sources (Colorado River, Passaic... the National Institutes of Health PubChem data repository (National Institutes of Health 2016). Additional chemical properties were also selected for

  8. A Republican Literature: A Study of Magazine Reading and Readers in Late-Eighteenth-Century New York.

    ERIC Educational Resources Information Center

    Nord, David Paul

    A study focusing on the history of reading, or the uses of literacy, in the first years of the American republic examined the subscription list and content of "The New York Magazine; or, Literary Repository" for 1790. Data for the study were taken from the magazine's subscription list and from various biographical sources, such as the…

  9. "The Glory and Romance of Our History Are Here Preserved." An Introduction to the Records of the National Archives.

    ERIC Educational Resources Information Center

    National Archives and Records Administration, Washington, DC. Office of Public Programs.

    This publication is intended for teachers bringing a class to visit the National Archives in Washington, D.C., for a workshop on primary documents. The National Archives serves as the repository for all federal records of enduring value. Primary sources are vital teaching tools because they actively engage the student's imagination so that he or…

  10. Repository of not readily available documents for project W-320

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conner, J.C.

    1997-04-18

    The purpose of this document is to provide a readily available source of the technical reports needed for the development of the safety documentation provided for the waste retrieval sluicing system (WRSS), designed to remove the radioactive and chemical sludge from tank 241-C-106, and transport that material to double-shell tank 241-AY-102 via a new, temporary, shielded, encased transfer line.

  11. NDE of copper canisters for long-term storage of spent nuclear fuel from the Swedish nuclear power plants

    NASA Astrophysics Data System (ADS)

    Stepinski, Tadeusz

    2003-07-01

    Sweden has been intensively developing methods for the long-term storage of spent fuel from nuclear power plants for twenty-five years. A dedicated research program has been initiated and conducted by the Swedish company SKB (Swedish Nuclear Fuel and Waste Management Co.). After the interim storage, SKB plans to encapsulate spent nuclear fuel in copper canisters that will be placed in a deep repository located in bedrock. The canisters filled with fuel rods will be sealed by an electron beam weld. This paper presents three complementary NDE techniques used for assessing the sealing weld in copper canisters: radiography, ultrasound, and eddy current. A powerful X-ray source and a digital detector are used for the radiography. An ultrasonic array system consisting of a phased ultrasonic array and multi-channel electronics is used for the ultrasonic examination. The array system enables electronic focusing and rapid electronic scanning, eliminating the use of a complicated mechanical scanner. A specially designed eddy current probe capable of detecting small voids at depths up to 4 mm in copper is used for the eddy current inspection. Presently, all the NDE techniques are being verified in SKB's Canister Laboratory, where full-scale canisters are welded and examined.

  12. Supporting multiple domains in a single reuse repository

    NASA Technical Reports Server (NTRS)

    Eichmann, David A.

    1992-01-01

    Domain analysis typically results in the construction of a domain-specific repository. Such a repository imposes artificial boundaries on the sharing of similar assets between related domains. A lattice-based approach to repository modeling can preserve a reuser's domain specific view of the repository, while avoiding replication of commonly used assets and supporting a more general perspective on domain interrelationships.

  13. Geologic and geophysical characterization studies of Yucca Mountain, Nevada, a potential high-level radioactive-waste repository

    USGS Publications Warehouse

    Whitney, J.W.; Keefer, W.R.

    2000-01-01

    In recognition of a critical national need for permanent radioactive-waste storage, Yucca Mountain in southwestern Nevada has been investigated by Federal agencies since the 1970's, as a potential geologic disposal site. In 1987, Congress selected Yucca Mountain for an expanded and more detailed site characterization effort. As an integral part of this program, the U.S. Geological Survey began a series of detailed geologic, geophysical, and related investigations designed to characterize the tectonic setting, fault behavior, and seismicity of the Yucca Mountain area. This document presents the results of 13 studies of the tectonic environment of Yucca Mountain, in support of a broad goal to assess the effects of future seismic and fault activity in the area on design, long-term performance, and safe operation of the potential surface and subsurface repository facilities.

  14. NCI Mouse Repository

    Cancer.gov

    The NCI Mouse Repository is an NCI-funded resource for mouse cancer models and associated strains. The repository makes strains available to all members of the scientific community (academic, non-profit, and commercial). NCI Mouse Repository strains

  15. Dissemination of metabolomics results: role of MetaboLights and COSMOS.

    PubMed

    Salek, Reza M; Haug, Kenneth; Steinbeck, Christoph

    2013-05-17

    With ever-increasing amounts of metabolomics data produced each year, there is an even greater need to disseminate data and knowledge produced in a standard and reproducible way. To assist with this, a general-purpose, open source metabolomics repository, MetaboLights, was launched in 2012. To promote a community standard, building on earlier efforts that culminated in the Metabolomics Standards Initiative (MSI), COordination of Standards in MetabOlomicS (COSMOS) was introduced. COSMOS aims to link life science e-infrastructures within the worldwide metabolomics community as well as to develop and maintain open source exchange formats for raw and processed data, ensuring a better flow of metabolomics information.

  16. Long-term non-isothermal reactive transport model of compacted bentonite, concrete and corrosion products in a HLW repository in clay

    NASA Astrophysics Data System (ADS)

    Mon, Alba; Samper, Javier; Montenegro, Luis; Naves, Acacia; Fernández, Jesús

    2017-02-01

    Radioactive waste disposal in deep geological repositories envisages engineered barriers such as carbon-steel canisters, compacted bentonite and concrete liners. The stability and performance of the bentonite barrier could be affected by the corrosion products at the canister-bentonite interface and the hyper-alkaline conditions caused by the degradation of concrete at the bentonite-concrete interface. Additionally, the host clay formation could also be affected by the hyper-alkaline plume at the concrete-clay interface. Here we present a non-isothermal multicomponent reactive transport model of the long-term (1 Ma) interactions of the compacted bentonite with the corrosion products of a carbon-steel canister and the concrete liner of the engineered barrier of a high-level radioactive waste repository in clay. Model results show that magnetite is the main corrosion product. Its precipitation reduces significantly the porosity of the bentonite near the canister. The degradation of the concrete liner leads to the precipitation of secondary minerals and the reduction of the porosity of the bentonite and the clay formation at their interfaces with the concrete liner. The reduction of the porosity becomes especially relevant at t = 10^4 years. The zones affected by pore clogging at the canister-bentonite and concrete-clay interfaces at 1 Ma are approximately equal to 1 and 3.3 cm thick, respectively. The hyper-alkaline front (pH > 8.5) spreads 2.5 cm into the clay formation after 1 Ma. Our simulation results share the key features of the models reported by others for engineered barrier systems at similar chemical conditions, including: 1) Pore clogging at the canister-bentonite and concrete-clay interfaces; 2) Narrow alteration zones; and 3) Limited smectite dissolution after 1 Ma.
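
    The pore clogging reported above follows from the standard reactive-transport bookkeeping in which porosity evolves with the molar volumes of precipitating and dissolving minerals. A minimal sketch of this relation (a generic formulation, not necessarily the authors' exact equations):

        \phi(t) = \phi_0 - \sum_m \bar{V}_m \left[ c_m(t) - c_m(0) \right],

    where \phi_0 is the initial porosity, \bar{V}_m the molar volume of mineral m and c_m its concentration in moles per unit bulk volume; net precipitation (c_m increasing), e.g. of magnetite at the canister-bentonite interface, drives \phi toward zero and hence toward clogging.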

  17. A comparative study of discrete fracture network and equivalent continuum models for simulating flow and transport in the far field of a hypothetical nuclear waste repository in crystalline host rock

    NASA Astrophysics Data System (ADS)

    Hadgu, Teklu; Karra, Satish; Kalinina, Elena; Makedonska, Nataliia; Hyman, Jeffrey D.; Klise, Katherine; Viswanathan, Hari S.; Wang, Yifeng

    2017-10-01

    One of the major challenges of simulating flow and transport in the far field of a geologic repository in crystalline host rock is related to reproducing the properties of the fracture network over the large volume of rock with sparse fracture characterization data. Various approaches have been developed to simulate flow and transport through the fractured rock. The approaches can be broadly divided into Discrete Fracture Network (DFN) and Equivalent Continuum Model (ECM) approaches. The DFN explicitly represents individual fractures, while the ECM uses fracture properties to determine equivalent continuum parameters. We compare DFN and ECM in terms of upscaled observed transport properties through generic fracture networks. The major effort was directed at making the DFN and ECM approaches similar in their conceptual representations. This allows for separating differences related to the interpretation of the test conditions and parameters from the differences between the DFN and ECM approaches. The two models are compared using a benchmark test problem that is constructed to represent the far field (1 × 1 × 1 km3) of a hypothetical repository in fractured crystalline rock. The test problem setting uses generic fracture properties that can be expected in crystalline rocks. The models are compared in terms of: 1) the effective permeability of the domain, and 2) nonreactive solute breakthrough curves through the domain. The principal differences between the models are mesh size, network connectivity, matrix diffusion and anisotropy. We demonstrate how these differences affect the flow and transport. We identify the factors that should be taken into consideration when selecting the approach most suitable for the site-specific conditions.
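
    As an illustration of the kind of upscaling an ECM performs, the Python sketch below converts fracture apertures into an equivalent continuum permeability using the standard parallel-plate (cubic-law) approximation; this is a generic textbook relation, not the specific procedure used in the study.

        # Equivalent continuum permeability of a set of parallel fractures,
        # using the parallel-plate (cubic law) approximation: a fracture of
        # aperture b contributes transmissivity proportional to b**3, and the
        # contributions are smeared over the width of the sampled rock block.

        def equivalent_permeability(apertures_m, block_width_m):
            """Upscaled permeability (m^2) of parallel fractures crossing a
            block of the given width (measured normal to the fractures);
            matrix permeability is neglected."""
            return sum(b**3 for b in apertures_m) / (12.0 * block_width_m)

        # Example: three fractures with 0.1-0.3 mm apertures in a 10 m block.
        print(equivalent_permeability([1e-4, 2e-4, 3e-4], 10.0))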

  18. Enhancing Ocean Research Data Access

    NASA Astrophysics Data System (ADS)

    Chandler, Cynthia; Groman, Robert; Shepherd, Adam; Allison, Molly; Arko, Robert; Chen, Yu; Fox, Peter; Glover, David; Hitzler, Pascal; Leadbetter, Adam; Narock, Thomas; West, Patrick; Wiebe, Peter

    2014-05-01

    The Biological and Chemical Oceanography Data Management Office (BCO-DMO) works in partnership with ocean science investigators to publish data from research projects funded by the Biological and Chemical Oceanography Sections and the Office of Polar Programs Antarctic Organisms & Ecosystems Program at the U.S. National Science Foundation. Since 2006, researchers have been contributing data to the BCO-DMO data system, and it has developed into a rich repository of data from ocean, coastal and Great Lakes research programs. While the ultimate goal of the BCO-DMO is to ensure preservation of NSF funded project data and to provide open access to those data, achievement of those goals is attained through a series of related phases that benefits from active collaboration and cooperation with a large community of research scientists as well as curators of data and information at complementary data repositories. The BCO-DMO is just one of many intermediate data management centers created to facilitate long-term preservation of data and improve access to ocean research data. Through partnerships with other data management professionals and active involvement in local and global initiatives, BCO-DMO staff members are working to enhance access to ocean research data available from the online BCO-DMO data system. Continuing efforts in use of controlled vocabulary terms, development of ontology design patterns and publication of content as Linked Open Data are contributing to improved discovery and availability of BCO-DMO curated data and increased interoperability of related content available from distributed repositories. We will demonstrate how Semantic Web technologies (e.g. RDF/XML, SKOS, OWL and SPARQL) have been integrated into BCO-DMO data access and delivery systems to better serve the ocean research community and to contribute to an expanding global knowledge network.
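
    As an example of the kind of programmatic access that Semantic Web publication enables, the Python snippet below runs a SPARQL query with the SPARQLWrapper library against a hypothetical endpoint; the endpoint URL and the query pattern are placeholders, not BCO-DMO's actual service.

        from SPARQLWrapper import SPARQLWrapper, JSON

        # Hypothetical SPARQL endpoint for an oceanographic data catalog.
        ENDPOINT = "https://example.org/sparql"

        QUERY = """
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?concept ?label
        WHERE {
          ?concept skos:prefLabel ?label .
          FILTER(CONTAINS(LCASE(STR(?label)), "chlorophyll"))
        }
        LIMIT 10
        """

        sparql = SPARQLWrapper(ENDPOINT)
        sparql.setQuery(QUERY)
        sparql.setReturnFormat(JSON)

        # Each binding holds one matching controlled-vocabulary term.
        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["concept"]["value"], "-", row["label"]["value"])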

  19. Report of the wwPDB Small-Angle Scattering Task Force: data requirements for biomolecular modeling and the PDB.

    PubMed

    Trewhella, Jill; Hendrickson, Wayne A; Kleywegt, Gerard J; Sali, Andrej; Sato, Mamoru; Schwede, Torsten; Svergun, Dmitri I; Tainer, John A; Westbrook, John; Berman, Helen M

    2013-06-04

    This report presents the conclusions of the July 12-13, 2012 meeting of the Small-Angle Scattering Task Force of the worldwide Protein Data Bank (wwPDB; Berman et al., 2003) at Rutgers University in New Brunswick, New Jersey. The task force includes experts in small-angle scattering (SAS), crystallography, data archiving, and molecular modeling who met to consider questions regarding the contributions of SAS to modern structural biology. Recognizing there is a rapidly growing community of structural biology researchers acquiring and interpreting SAS data in terms of increasingly sophisticated molecular models, the task force recommends that (1) a global repository is needed that holds standard format X-ray and neutron SAS data that is searchable and freely accessible for download; (2) a standard dictionary is required for definitions of terms for data collection and for managing the SAS data repository; (3) options should be provided for including in the repository SAS-derived shape and atomistic models based on rigid-body refinement against SAS data along with specific information regarding the uniqueness and uncertainty of the model, and the protocol used to obtain it; (4) criteria need to be agreed upon for assessment of the quality of deposited SAS data and the accuracy of SAS-derived models, and the extent to which a given model fits the SAS data; (5) with the increasing diversity of structural biology data and models being generated, archiving options for models derived from diverse data will be required; and (6) thought leaders from the various structural biology disciplines should jointly define what to archive in the PDB and what complementary archives might be needed, taking into account both scientific needs and funding. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Long-term non-isothermal reactive transport model of compacted bentonite, concrete and corrosion products in a HLW repository in clay.

    PubMed

    Mon, Alba; Samper, Javier; Montenegro, Luis; Naves, Acacia; Fernández, Jesús

    2017-02-01

    Radioactive waste disposal in deep geological repositories envisages engineered barriers such as carbon-steel canisters, compacted bentonite and concrete liners. The stability and performance of the bentonite barrier could be affected by the corrosion products at the canister-bentonite interface and the hyper-alkaline conditions caused by the degradation of concrete at the bentonite-concrete interface. Additionally, the host clay formation could also be affected by the hyper-alkaline plume at the concrete-clay interface. Here we present a non-isothermal multicomponent reactive transport model of the long-term (1 Ma) interactions of the compacted bentonite with the corrosion products of a carbon-steel canister and the concrete liner of the engineered barrier of a high-level radioactive waste repository in clay. Model results show that magnetite is the main corrosion product. Its precipitation reduces significantly the porosity of the bentonite near the canister. The degradation of the concrete liner leads to the precipitation of secondary minerals and the reduction of the porosity of the bentonite and the clay formation at their interfaces with the concrete liner. The reduction of the porosity becomes especially relevant at t = 10^4 years. The zones affected by pore clogging at the canister-bentonite and concrete-clay interfaces at 1 Ma are approximately equal to 1 and 3.3 cm thick, respectively. The hyper-alkaline front (pH > 8.5) spreads 2.5 cm into the clay formation after 1 Ma. Our simulation results share the key features of the models reported by others for engineered barrier systems at similar chemical conditions, including: 1) Pore clogging at the canister-bentonite and concrete-clay interfaces; 2) Narrow alteration zones; and 3) Limited smectite dissolution after 1 Ma. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Chemical Transport in a Fissured Rock: Verification of a Numerical Model

    NASA Astrophysics Data System (ADS)

    Rasmuson, A.; Narasimhan, T. N.; Neretnieks, I.

    1982-10-01

    Numerical models for simulating chemical transport in fissured rocks constitute powerful tools for evaluating the acceptability of geological nuclear waste repositories. Due to the very long-term, high toxicity of some nuclear waste products, the models are required to predict, in certain cases, spatial and temporal distributions of chemical concentration smaller than 0.001% of the concentration released from the repository. Whether numerical models can provide such accuracies is a major question addressed in the present work. To this end we have verified a numerical model, TRUMP, which solves the advection-diffusion equation in general three dimensions, with or without decay and source terms. The method is based on an integrated finite difference approach. The model was verified against the known analytic solution of the one-dimensional advection-diffusion problem, as well as the problem of advection-diffusion in a system of parallel fractures separated by spherical particles. The studies show that as long as the magnitude of advectance is equal to or less than that of conductance for the closed surface bounding any volume element in the region (that is, a numerical Peclet number < 2), the numerical method can indeed match the analytic solution within errors of ±10^-3 % or less. The realistic input parameters used in the sample calculations suggest that such a range of Peclet numbers is indeed likely to characterize deep groundwater systems in granitic and ancient argillaceous settings. Thus TRUMP in its present form does provide a viable tool for use in nuclear waste evaluation studies. A sensitivity analysis based on the analytic solution suggests that the errors in prediction introduced by uncertainties in input parameters are likely to be larger than the computational inaccuracies introduced by the numerical model. Currently, a disadvantage of the TRUMP model is that the iterative method of solving the set of simultaneous equations is rather slow when time constants vary widely over the flow region. Although the iterative solution may be very desirable for large three-dimensional problems in order to minimize computer storage, it seems desirable to use a direct solver technique in conjunction with the mixed explicit-implicit approach whenever possible. Work in this direction is in progress.
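
    The stability criterion quoted above (advectance no greater than conductance across the closed surface bounding any volume element) is commonly written as a grid Peclet number condition. In generic textbook notation, for a one-dimensional element of size \Delta x (this is the standard form, not a formula quoted from the paper):

        Pe = \frac{v \, \Delta x}{D} \le 2,

    where v is the pore-water velocity and D the diffusion/dispersion coefficient; advectance scales as v while conductance scales as D / \Delta x, so their ratio is exactly the grid Peclet number.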

  2. Use of groundwater lifetime expectancy for the performance assessment of a deep geologic radioactive waste repository: 2. Application to a Canadian Shield environment

    NASA Astrophysics Data System (ADS)

    Park, Y.-J.; Cornaton, F. J.; Normani, S. D.; Sykes, J. F.; Sudicky, E. A.

    2008-04-01

    F. J. Cornaton et al. (2008) introduced the concept of lifetime expectancy as a performance measure of the safety of subsurface repositories, on the basis of the travel time for contaminants released at a certain point in the subsurface to reach the biosphere or compliance area. The methodologies are applied to a hypothetical but realistic Canadian Shield crystalline rock environment, which is considered to be one of the most geologically stable areas on Earth. In an approximately 10 × 10 × 1.5 km³ hypothetical study area, up to 1000 major and intermediate fracture zones are generated from surface lineament analyses and subsurface surveys. In the study area, mean and probability density of lifetime expectancy are analyzed with realistic geologic and hydrologic shield settings in order to demonstrate the applicability of the theory and the numerical model for optimally locating a deep subsurface repository for the safe storage of spent nuclear fuel. The results demonstrate that, in general, groundwater lifetime expectancy increases with depth and it is greatest inside major matrix blocks. Various sources and aspects of uncertainty are considered, specifically geometric and hydraulic parameters of permeable fracture zones. Sensitivity analyses indicate that the existence and location of permeable fracture zones and the relationship between fracture zone permeability and depth from ground surface are the most significant factors for lifetime expectancy distribution in such a crystalline rock environment. As a consequence, it is successfully demonstrated that the concept of lifetime expectancy can be applied to siting and performance assessment studies for deep geologic repositories in crystalline fractured rock settings.
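
    The abstract does not reproduce the governing equation. In the standard mean-first-passage-time formulation for steady flow, mean lifetime expectancy E(x) solves a backward (adjoint) advection-dispersion equation of roughly the following form; this is a hedged sketch, and porosity weighting and boundary details in Cornaton et al.'s formulation may differ:

    ```latex
    % Backward equation for mean lifetime expectancy E(x) with steady velocity
    % v and dispersion tensor D (mean-first-passage-time form); E = 0 where
    % groundwater discharges to the biosphere or compliance area.
    \mathbf{v}\cdot\nabla E \;+\; \nabla\!\cdot\!\left(\mathbf{D}\,\nabla E\right) \;=\; -1,
    \qquad E = 0 \ \text{on outflow boundaries}
    ```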

  3. Use of a Knowledge Management System in Waste Management Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruendler, D.; Boetsch, W.U.; Holzhauer, U.

    2006-07-01

    In Germany the knowledge management system 'WasteInfo' about waste management and disposal issues has been developed and implemented. Beneficiaries of 'WasteInfo' are official decision makers, who have access to a large information pool. The information pool is fed by experts, so-called authors. This means compiling information, evaluating it and assigning appropriate properties (metadata) to it. The knowledge management system 'WasteInfo' was introduced at WM04, and its operation at WM05. The present contribution describes the additional advantage of the KMS as a tool for dealing with waste management projects. This specific aspect will be demonstrated using a project concerning a comparative analysis of the implementation of repositories in six countries using nuclear power as examples: The information of 'WasteInfo' is assigned to categories and structured according to its origin and type of publication. To use 'WasteInfo' as a tool for processing the projects, a suitable set of categories has to be developed for each project. Apart from technical and scientific aspects, the selected project deals with repository strategies and policies in various countries, with the roles of applicants and authorities in licensing procedures, with safety philosophy and with socio-economic concerns. This new point of view has to be modelled in the categories. Similarly, new sources of information such as local and regional dailies or particular web-sites have to be taken into consideration. In this way 'WasteInfo' represents an open document which reflects the current status of the respective repository policy in several countries. Information with particular meaning for the German repository planning is marked and may thereby influence the German strategy. (authors)

  4. Coupling fuel cycles with repositories: how repository institutional choices may impact fuel cycle design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forsberg, C.; Miller, W.F.

    2013-07-01

    The historical repository siting strategy in the United States has been a top-down approach driven by federal government decision making, but it has been a failure. This policy has led to dispersing fuel cycle facilities across different states. The U.S. government is now considering an alternative repository siting strategy based on voluntary agreements with state governments. If that occurs, state governments become key decision makers. They have different priorities, and those priorities may change the characteristics of the repository and the fuel cycle. State government priorities, when considering hosting a repository, are safety, financial incentives and jobs. It follows that states will demand that a repository be the center of the back end of the fuel cycle as a condition of hosting it. For example, states will push for collocation of transportation services, safeguards training, and navy/private SNF (Spent Nuclear Fuel) inspection at the repository site. Such activities would more than double local employment relative to what was planned for the Yucca Mountain-type repository. States may demand (1) the right to take future title of the SNF, so that if recycle became economic the reprocessing plant would be built at the repository site, and (2) the right to a certain fraction of the repository capacity for foreign SNF. That would open the future option of leasing fuel to foreign utilities with disposal of the SNF in the repository, but with the state-government condition that the front-end fuel-cycle enrichment and fuel fabrication facilities be located in that state.

  5. Focused Crawling of the Deep Web Using Service Class Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rocco, D; Liu, L; Critchlow, T

    2004-06-21

    Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed those of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.

  6. NCI Mouse Repository | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The NCI Mouse Repository is an NCI-funded resource for mouse cancer models and associated strains. The repository makes strains available to all members of the scientific community (academic, non-profit, and commercial). NCI Mouse Repository strains

  7. The Environmental Data Initiative: A broad-use data repository for environmental and ecological data that strives to balance data quality and ease of submission

    NASA Astrophysics Data System (ADS)

    Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.

    2017-12-01

    In the world of data repositories, there seems to be a never-ending struggle between the generation of high-quality data documentation and the ease of archiving a data product in a repository - the higher the documentation standards, the greater the effort required by the scientist, and the less likely the data will be archived. The Environmental Data Initiative (EDI) attempts to balance the rigor of data documentation against the amount of effort required by a scientist to upload and archive data. As an outgrowth of the LTER Network Information System, the EDI is funded by the US NSF Division of Environmental Biology to support the LTER, LTREB, OBFS, and MSB programs, in addition to providing an open data archive for environmental scientists without a viable archive. EDI uses the PASTA repository software, developed originally by the LTER. PASTA is metadata driven and documents data with the Ecological Metadata Language (EML), a high-fidelity standard that can describe all types of data in great detail. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML, in a process that is termed "metadata and data congruence"; incongruent data packages are not admitted to the repository. EDI reduces the burden of data documentation on scientists in two ways. First, EDI provides hands-on assistance in data documentation best practices, along with tools for generating EML written in R and now being developed in Python. These tools hide the details of EML generation and syntax by providing a more natural and contextual setting for describing data. Second, EDI works closely with community information managers in defining the rules used in PASTA quality tests. Rules deemed too strict can be turned off completely or set to only issue a warning while the community learns how best to handle the situation and improve its documentation practices. Rules can also be added or refined over time to improve the overall quality of archived data. The outcomes of the quality tests are stored as part of the data archive in PASTA and are accessible to all users of the EDI data repository. In summary, EDI's metadata support to scientists and the comprehensive set of data quality tests for metadata and data congruency provide an ideal archive for environmental and ecological data.
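
    To make the "metadata and data congruence" idea concrete, here is a minimal sketch of one such quality test in Python; the rule, function name, and record layout are illustrative assumptions, not PASTA's actual API or rule set:

    ```python
    # A minimal sketch of a metadata/data congruence test in the spirit of
    # PASTA's quality checks; names and the rule itself are illustrative
    # assumptions, not PASTA's actual implementation.
    import csv
    import io

    def check_column_congruence(eml_attribute_names, data_file):
        """Compare attribute names declared in EML with a CSV header row."""
        header = next(csv.reader(data_file))
        declared = list(eml_attribute_names)
        if len(header) != len(declared):
            return ("error", f"EML declares {len(declared)} columns, "
                             f"data file has {len(header)}")
        mismatches = [(d, h) for d, h in zip(declared, header) if d != h]
        if mismatches:
            return ("warn", f"column name mismatches: {mismatches}")
        return ("ok", "metadata and data are congruent")

    # A package whose EML declares three attributes, checked against its data:
    demo_csv = io.StringIO("site,date,temp_c\nA,2017-06-01,3.2\n")
    print(check_column_congruence(["site", "date", "temp_c"], demo_csv))
    # -> ('ok', 'metadata and data are congruent')
    ```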

  8. Durability and degradation of HT9 based alloy waste forms with variable Ni and Cr content

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olson, L.

    2016-12-31

    Short-term electrochemical and long-term hybrid electrochemical corrosion tests were performed on alloy waste forms in reference aqueous solutions that bound postulated repository conditions. The alloy waste forms investigated represent candidate formulations that can be produced with advanced electrochemical treatment of used nuclear fuel. The studies helped to better understand the alloy waste form durability with differing concentrations of nickel and chromium, species that can be added to alloy waste forms to potentially increase their durability and decrease radionuclide release into the environment.

  9. Review of CBRN Medical and Operational Terminologies in NATO CBRN Publications

    DTIC Science & Technology

    2016-08-01

    repository for NATO terminology and is used to search terms, abbreviations, and definitions found in NATO documents, communications, and activities of all...21 is a compilation of terminology used in NATO chemical, biological, radiological, and nuclear defense activities , documentation, and communications...informational constructs, activities , and functionality necessary for: 1. Reporting of all chemical, biological or radiological incidents and nuclear

  10. The importance of living botanical collections for plant biology and the “next generation” of evo-devo research

    Treesearch

    Michael Dosmann; Andrew Groover

    2012-01-01

    Living botanical collections include germplasm repositories, long-term experimental plantings, and botanical gardens. We present here a series of vignettes to illustrate the central role that living collections have played in plant biology research, including evo-devo research. Looking towards the future, living collections will become increasingly important in support...

  11. Experiences with Text Mining Large Collections of Unstructured Systems Development Artifacts at JPL

    NASA Technical Reports Server (NTRS)

    Port, Dan; Nikora, Allen; Hihn, Jairus; Huang, LiGuo

    2011-01-01

    Often repositories of systems engineering artifacts at NASA's Jet Propulsion Laboratory (JPL) are so large and poorly structured that they have outgrown our capability to effectively manually process their contents to extract useful information. Sophisticated text mining methods and tools seem a quick, low-effort approach to automating our limited manual efforts. Our experiences of exploring such methods in three areas, namely historical risk analysis, defect identification based on requirements analysis, and over-time analysis of system anomalies at JPL, have shown that obtaining useful results requires substantial unanticipated effort, from preprocessing the data to transforming the output for practical applications. We have not observed any quick 'wins' or realized benefit from short-term effort avoidance through automation in this area. Surprisingly, we have realized a number of unexpected long-term benefits from the process of applying text mining to our repositories. This paper elaborates some of these benefits and our important lessons learned from preparing and applying text mining to large unstructured system artifacts at JPL, aiming to benefit future text mining applications in similar problem domains and, we hope, in broader areas of application.

  12. Data repositories for medical education research: issues and recommendations.

    PubMed

    Schwartz, Alan; Pappas, Cleo; Sandlow, Leslie J

    2010-05-01

    The authors explore issues surrounding digital repositories with the twofold intention of clarifying their creation, structure, content, and use, and considering the implementation of a global digital repository for medical education research data sets: an online site where medical education researchers would be encouraged to deposit their data in order to facilitate the reuse and reanalysis of the data by other researchers. By motivating data sharing and reuse, investigators, medical schools, and other stakeholders might see substantial benefits to their own endeavors and to the progress of the field of medical education. The authors review digital repositories in medicine, social sciences, and education, describe the contents and scope of repositories, and present extant examples. The authors describe the potential benefits of a medical education data repository and report results of a survey of the Society of Directors of Research in Medical Education, in which participants responded to questions about data sharing and a potential data repository. Respondents strongly endorsed data sharing, with the caveat that principal investigators should choose whether or not to share data they collect. A large majority believed that a repository would benefit their unit and the field of medical education. Few reported using existing repositories. Finally, the authors consider challenges to the establishment of such a repository, including taxonomic organization, intellectual property concerns, human subjects protection, technological infrastructure, and evaluation standards. The authors conclude with recommendations for how a medical education data repository could be successfully developed.

  13. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... repositories. 227.7207 Section 227.7207 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to...

  14. 75 FR 70310 - Sunshine Act Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-17

    ... Consumer Protection Act governing the security-based swap data repository registration process, the duties of such repositories, and the core principles applicable to such repositories. 4. The Commission will... security-based swap data repositories or the Commission and the public dissemination of security-based swap...

  15. International Approaches for Nuclear Waste Disposal in Geological Formations: Geological Challenges in Radioactive Waste Isolation—Fifth Worldwide Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faybishenko, Boris; Birkholzer, Jens; Sassani, David

    The overall objective of the Fifth Worldwide Review (WWR-5) is to document the current state-of-the-art of major developments in a number of nations throughout the world pursuing geological disposal programs, and to summarize challenging problems and experience that have been obtained in siting, preparing and reviewing cases for the operational and long-term safety of proposed and operating nuclear waste repositories. The scope of the Review is to address current specific technical issues and challenges in safety case development along with the interplay of technical feasibility, siting, engineering design issues, and operational and post-closure safety. In particular, the chapters included in the report present the following types of information: the current status of the deep geological repository programs for high-level nuclear waste and low- and intermediate-level nuclear waste in each country, concepts of siting and radioactive waste and spent nuclear fuel management in different countries (with the emphasis on nuclear waste disposal under different climatic conditions and different geological formations), progress in repository site selection and site characterization, technology development, buffer/backfill materials studies and testing, support activities, programs, and projects, international cooperation, and future plans, as well as regulatory issues and transboundary problems.

  16. Gamma radiation induces hydrogen absorption by copper in water

    NASA Astrophysics Data System (ADS)

    Lousada, Cláudio M.; Soroka, Inna L.; Yagodzinskyy, Yuriy; Tarakina, Nadezda V.; Todoshchenko, Olga; Hänninen, Hannu; Korzhavyi, Pavel A.; Jonsson, Mats

    2016-04-01

    One of the most intricate issues of nuclear power is the long-term safety of repositories for radioactive waste. These repositories can have an impact on future generations for a period of time orders of magnitude longer than any known civilization. Several countries have considered copper as an outer corrosion barrier for canisters containing spent nuclear fuel. Among the many processes that must be considered in the safety assessments, radiation-induced processes constitute a key component. Here we show that copper metal immersed in water takes up considerable amounts of hydrogen when exposed to γ-radiation. Additionally, we show that the amount of hydrogen absorbed by copper depends on the total dose of radiation. At a dose of 69 kGy the uptake of hydrogen by metallic copper is 7 orders of magnitude higher than when the absorption is driven by H2(g) at a pressure of 1 atm in a non-irradiated dry system. Moreover, irradiation of copper in water causes corrosion of the metal and the formation of a variety of surface cavities, nanoparticle deposits, and islands of needle-shaped crystals. Hence, radiation-enhanced uptake of hydrogen by spent nuclear fuel encapsulating materials should be taken into account in the safety assessments of nuclear waste repositories.

  17. Protein Structure Initiative Material Repository: an open shared public resource of structural genomics plasmids for the biological community

    PubMed Central

    Cormier, Catherine Y.; Mohr, Stephanie E.; Zuo, Dongmei; Hu, Yanhui; Rolfs, Andreas; Kramer, Jason; Taycher, Elena; Kelley, Fontina; Fiacco, Michael; Turnbull, Greggory; LaBaer, Joshua

    2010-01-01

    The Protein Structure Initiative Material Repository (PSI-MR; http://psimr.asu.edu) provides centralized storage and distribution for the protein expression plasmids created by PSI researchers. These plasmids are a resource that allows the research community to dissect the biological function of proteins whose structures have been identified by the PSI. The plasmid annotation, which includes the full-length sequence, vector information and associated publications, is stored in a freely available, searchable database called DNASU (http://dnasu.asu.edu). Each PSI plasmid is also linked to a variety of additional resources, which facilitates cross-referencing of a particular plasmid to protein annotations and experimental data. Plasmid samples can be requested directly through the website. We have also developed a novel strategy to avoid the most common concern encountered when distributing plasmids, namely the complexity of material transfer agreement (MTA) processing and the resulting delays this causes. The Expedited Process MTA, in which we created a network of institutions that agree to the terms of transfer in advance of a material request, eliminates these delays. Our hope is that by creating a repository of expression-ready plasmids and expediting the process for receiving these plasmids, we will help accelerate the accessibility and pace of scientific discovery. PMID:19906724

  18. Designing for Change: Interoperability in a scaling and adapting environment

    NASA Astrophysics Data System (ADS)

    Yarmey, L.

    2015-12-01

    The Earth Science cyberinfrastructure landscape is constantly changing. Technologies advance and technical implementations are refined or replaced. Data types, volumes, packaging, and use cases evolve. Scientific requirements emerge and mature. Standards shift while systems scale and adapt. In this complex and dynamic environment, interoperability remains a critical component of successful cyberinfrastructure. Through the resource- and priority-driven iterations on systems, interfaces, and content, questions fundamental to stable and useful Earth Science cyberinfrastructure arise. For instance, how are sociotechnical changes planned, tracked, and communicated? How should operational stability balance against 'new and shiny'? How can ongoing maintenance and mitigation of technical debt be managed in an often short-term resource environment? The Arctic Data Explorer is a metadata brokering application developed to enable discovery of international, interdisciplinary Arctic data across distributed repositories. Completely dependent on interoperable third-party systems, the Arctic Data Explorer publicly launched in 2013 with an original 3,000+ data records from four Arctic repositories. Since then the search has scaled to 25,000+ data records from thirteen repositories at the time of writing. In the final months of the original project funding, priorities shift to lean operations with a strategic eye on the future. Here we present lessons learned from four years of Arctic Data Explorer design, development, communication, and maintenance work, along with remaining questions and potential directions.

  19. Modeling the impact of climate change in Germany with biosphere models for long-term safety assessment of nuclear waste repositories.

    PubMed

    Staudt, C; Semiochkina, N; Kaiser, J C; Pröhl, G

    2013-01-01

    Biosphere models are used to evaluate the exposure of populations to radionuclides from a deep geological repository. Since the time frame for assessments of long-term disposal safety is 1 million years, potential future climate changes need to be accounted for. Potential future climate conditions were defined for northern Germany according to model results from the BIOCLIM project. Nine present-day reference climate regions were defined to cover those future climate conditions. A biosphere model was developed according to the BIOMASS methodology of the IAEA and model parameters were adjusted to the conditions at the reference climate regions. The model includes exposure pathways common to those reference climate regions in a stylized biosphere and relevant to the exposure of a hypothetical self-sustaining population at the site of potential radionuclide contamination from a deep geological repository. The end points of the model are Biosphere Dose Conversion Factors (BDCF) for a range of radionuclides and scenarios, normalized for a constant radionuclide concentration in near-surface groundwater. Model results suggest an increased exposure in dry climate regions, with a high impact of drinking water consumption rates and the amount of irrigation water used for agriculture. Copyright © 2012 Elsevier Ltd. All rights reserved.
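
    Because the BDCFs are normalized to a constant unit radionuclide concentration in near-surface groundwater, downstream dose estimates reduce to a simple scaling. A sketch, with units as they are typically defined in BIOMASS-style assessments (the paper's exact units are not given in the abstract):

    ```latex
    % Annual effective dose from radionuclide r for a predicted groundwater
    % concentration C_{gw,r} (e.g., Bq/L) and the model's BDCF_r ((Sv/y)/(Bq/L)).
    \dot{D}_r \;=\; C_{\mathrm{gw},r} \times \mathrm{BDCF}_r
    ```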

  20. Information prescriptions, 1930-2013: an international history and comprehensive review.

    PubMed

    McKnight, Michelynn

    2014-10-01

    Recently, government agencies in several countries have promoted information prescription programs to increase patients' understanding of their conditions. The practice has a long history and many publications, but no comprehensive literature review such as this one. Using a variety of high-precision and high-recall strategies, the researcher searched two dozen online bibliographic databases, citation databases, and repositories, as well as many print sources, to identify and retrieve documents for review. Of these documents, ninety relevant English-language case reports, research reports, and reviews published from 1930-2013 met the study criteria. Early to mid-twentieth-century reports covered long-standing practices and used no rigorous research methods. The literature since the mid-1990s reports on short-term trial projects, especially of government-sponsored programs in the United States and United Kingdom. Although the concept of information prescription has been in the literature and practiced for decades, no long-term research studies were found. Most of the literature is anecdotal, concerning small pilot projects. The reports investigate physician, patient, and librarian satisfaction but not changes in patient knowledge or behavior. Many twenty-first-century projects emphasize materials and projects from specific government agencies and commercial enterprises. While the practice is commonly believed to be a good idea and there are many publications on the subject, few studies provide any evidence of the efficacy of information prescriptions for increased patient knowledge. Well-designed and executed large or long-term studies might produce needed evidence for professional practice.

  1. Software Hardware Asset Reuse Enterprise (SHARE) Repository Framework: Related Work and Development Plan

    DTIC Science & Technology

    2009-08-19

    designed to collect the data and assist the analyst in drawing relationships between the data. Palantir Technologies has created one such software...application to support the DoD intelligence community by providing robust capabilities for managing data from various sources. The Palantir tool...www.palantirtech.com/ Figure 17. Palantir Graphical Interface (Gordon-Schlosberg, 2008). Similar examples of the use of ontologies to support data

  2. Extensible Probabilistic Repository Technology (XPRT)

    DTIC Science & Technology

    2004-10-01

    projects, such as Centaurus, Evidence Data Base (EDB), etc.; others were fabricated, such as INS and FED, while others contain data from the open...Google Web Report Unlimited SOAP API News BBC News Unlimited WEB RSS 1.0 Centaurus Person Demographics 204,402 people from 240 countries...objects of the domain ontology map to the various simulated data-sources. For example, the PersonDemographics are stored in the Centaurus database, while

  3. The Morningside Initiative: Collaborative Development of a Knowledge Repository to Accelerate Adoption of Clinical Decision Support

    DTIC Science & Technology

    2010-01-01

    Comparative Effectiveness Research, or other efforts to determine best practices and to develop guidelines based on meta-analysis and evidence-based medicine. An...authoritative reviews or other evidence-based medicine sources, but they have been made unambiguous and computable – a process which sounds...best practice recommendation created through an evidence-based medicine (EBM) development process. The lifecycle envisions four stages of refinement

  4. Reproducible Research in the Geosciences at Scale: Achievable Goal or Elusive Dream?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Evans, B. J. K.

    2016-12-01

    Reproducibility is a fundamental tenet of the scientific method: it implies that any researcher, or a third party working independently, can duplicate any experiment or investigation and produce the same results. Historically, computationally based research involved an individual using their own data and processing it in their own private area, often using software they wrote or inherited from close collaborators. Today, a researcher is likely to be part of a large team that will use a subset of data from an external repository and then process the data on a public or private cloud or on a large centralised supercomputer, using a mixture of their own code, third-party software and libraries, or global community codes. In 'Big Geoscience' research it is common for data inputs to be extracts from externally managed dynamic data collections, where new data is regularly appended, or existing data is revised when errors are detected and/or as processing methods are improved. New workflows increasingly use services to access data dynamically to create subsets on-the-fly from distributed sources, each of which can have a complex history. At major computational facilities, underlying systems, libraries, software and services are constantly tuned and optimised, and new or replacement infrastructure is installed. Likewise, code used from a community repository is continually refined, re-packaged and ported to the target platform. To achieve reproducibility, today's researcher increasingly needs to track their workflow, including querying information on the current or historical state of the facilities used. Versioning methods are standard practice for software repositories or packages, but it is not common for either data repositories or data services to provide information about their state, or for systems to provide query-able access to changes in the underlying software. While a researcher can achieve transparency and describe steps in their workflow so that others can repeat them and replicate the processes undertaken, they cannot achieve exact reproducibility or even transparency of the results generated. In Big Geoscience, full reproducibility will be an elusive dream until data repositories and compute facilities can provide provenance information in a standards-compliant, machine query-able way.

  5. Plutonium in the WIPP environment: its detection, distribution and behavior.

    PubMed

    Thakur, P; Ballard, S; Nelson, R

    2012-05-01

    The Waste Isolation Pilot Plant (WIPP) is the only operating deep underground geologic nuclear repository in the United States. It is located in southeastern New Mexico, approximately 655 m (2150 ft) below the surface of the Earth in a bedded Permian evaporite salt formation. This mined geologic repository is designed for the safe disposal of transuranic (TRU) wastes generated from the US defense program. Aerosol and soil samples have been collected near the WIPP site to investigate the sources of plutonium in the WIPP environment since the late 1990s, well before WIPP received its first shipment. Activities of (238)Pu, (239+240)Pu and (241)Am were determined by alpha spectrometry following a series of chemical separations. The concentrations of Al and U were determined in a separate set of samples by inductively coupled plasma mass spectrometry. The annual airborne concentrations of (239+240)Pu during the period from 1998 to 2010 show no systematic interannual variations. However, monthly (239+240)Pu particulate concentrations show a typical seasonal variation with a maximum in spring, the time when strong and gusty winds frequently give rise to blowing dust. Resuspension of soil particles containing weapons fallout is considered to be the predominant source of plutonium in the WIPP area. Further, this work characterizes the source, temporal variation and its distribution with depth in a soil profile to evaluate the importance of transport mechanisms affecting the fate of these radionuclides in the WIPP environment. The mean (137)Cs/(239+240)Pu, (241)Am/(239+240)Pu activity ratio and (240)Pu/(239)Pu atom ratio observed in the WIPP samples are consistent with the source being largely global fallout. There is no evidence of any release from the WIPP contributing to radionuclide concentrations in the environment.

  6. 17 CFR 49.26 - Disclosure requirements of swap data repositories.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... data repository's policies and procedures reasonably designed to protect the privacy of any and all... swap data repositories. 49.26 Section 49.26 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION (CONTINUED) SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data...

  7. GeoSciGraph: An Ontological Framework for EarthCube Semantic Infrastructure

    NASA Astrophysics Data System (ADS)

    Gupta, A.; Schachne, A.; Condit, C.; Valentine, D.; Richard, S.; Zaslavsky, I.

    2015-12-01

    The CINERGI (Community Inventory of EarthCube Resources for Geosciences Interoperability) project compiles an inventory of a wide variety of earth science resources including documents, catalogs, vocabularies, data models, data services, process models, information repositories, domain-specific ontologies, etc., developed by research groups and data practitioners. We have developed a multidisciplinary semantic framework called GeoSciGraph for semantic integration of earth science resources. An integrated ontology is constructed with Basic Formal Ontology (BFO) as its upper ontology and currently ingests multiple component ontologies including the SWEET ontology, GeoSciML's lithology ontology, the Tematres controlled vocabulary server, GeoNames, GCMD vocabularies on equipment, platforms and institutions, a software ontology, the CUAHSI hydrology vocabulary, the environmental ontology (ENVO) and several more. These ontologies are connected through bridging axioms; GeoSciGraph identifies lexically close terms and creates equivalence-class or subclass relationships between them after human verification. GeoSciGraph allows a community to create community-specific customizations of the integrated ontology. GeoSciGraph uses Neo4j, a graph database that can hold several billion concepts and relationships. GeoSciGraph provides a number of REST services that can be called by other software modules like the CINERGI information augmentation pipeline. 1) Vocabulary services are used to find exact and approximate terms, term categories (community-provided clusters of terms, e.g., measurement-related terms or environmental-material-related terms), synonyms, term definitions and annotations. 2) Lexical services are used for text parsing to find entities, which can then be included in the ontology by a domain expert. 3) Graph services provide the ability to perform traversal-centric operations, e.g., finding paths and neighborhoods, which can be used to perform ontological operations like computing transitive closure (e.g., finding all subclasses of rocks). 4) Annotation services are used to adorn an arbitrary block of text (e.g., from a NOAA catalog record) with ontology terms. The system has been used to ontologically integrate diverse sources like ScienceBase, NOAA records and PETDB.
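
    As an illustration of the graph services described in item 3, the sketch below computes the transitive closure of subclass relations with a plain breadth-first traversal. The tiny ontology is hypothetical, and the real system issues such queries against Neo4j rather than Python dictionaries:

    ```python
    # Illustrative sketch of the traversal a graph service performs to compute
    # the transitive closure of subclass relations (e.g., all subclasses of
    # "rock"); the toy ontology below is hypothetical, not GeoSciGraph data.
    from collections import deque

    subclass_of = {  # child -> parents
        "basalt": ["igneous rock"],
        "granite": ["igneous rock"],
        "igneous rock": ["rock"],
        "sandstone": ["sedimentary rock"],
        "sedimentary rock": ["rock"],
    }

    def all_subclasses(root):
        """Return every direct or transitive subclass of `root`."""
        children = {}  # parent -> [children], inverted index
        for child, parents in subclass_of.items():
            for p in parents:
                children.setdefault(p, []).append(child)
        found, queue = set(), deque([root])
        while queue:
            term = queue.popleft()
            for c in children.get(term, []):
                if c not in found:
                    found.add(c)
                    queue.append(c)
        return found

    print(all_subclasses("rock"))
    # -> {'igneous rock', 'basalt', 'granite', 'sedimentary rock', 'sandstone'}
    ```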

  8. Controlled Vocabularies and Ontologies for Oceanographic Data: The R2R Eventlogger Project

    NASA Astrophysics Data System (ADS)

    Coburn, E.; Maffei, A. R.; Chandler, C. L.; Raymond, L. M.

    2012-12-01

    Research vessels coordinated by the United States University-National Oceanographic Laboratory System (US-UNOLS) collect data which are considered an important oceanographic resource. The NSF-funded Rolling Deck to Repository (R2R) project aims to improve access to these data and diminish the barriers to their use. One aspect of the R2R project has been to develop a shipboard scientific event logging system, Eventlogger, that incorporates best practice guidelines, controlled vocabularies, a cruise metadata schema, and a scientific event log. This will facilitate the eventual ingestion of datasets into oceanographic data repositories for subsequent integration and synthesis by investigators. One important aspect of this system is the careful use of controlled vocabularies and ontologies. Existing ontologies, where available, will be used and others will be developed. The use of internationally informed, consensus-driven controlled vocabularies will make datasets more interoperable and discoverable. The R2R Eventlogger project is led by Woods Hole Oceanographic Institution (WHOI), and the management of the controlled vocabularies and the mapping of these vocabularies to authoritative community vocabularies are led by the Data Librarian in the Marine Biological Laboratory/Woods Hole Oceanographic Institution (MBLWHOI) Library. The first target vocabulary is oceanographic instruments. Management of this vocabulary has thus far consisted of reconciling local community terms with the more widely used SeaDataNet Device Vocabulary terms. Rather than simply adopting the existing terms, data managers at the NSF-funded Biological and Chemical Oceanographic Data Management Office (BCO-DMO) often map the local terms, as they are given by investigators, to the existing terms, since the local terms often provide important information and meaning. New terms (often custom or modified instruments) are submitted for review to the SeaDataNet community listserv for discussion and eventual incorporation into the Device Vocabulary. These vocabularies and their mappings are an important part of the Eventlogger system. Before a research cruise, investigators configure the instruments they intend to use for science activities. The instruments available for selection are pulled directly from the instrument vocabulary. The promotion and use of controlled vocabularies and ontologies will pave the way for linked data. By mapping local terms to agreed-upon authoritative terms, links are created whereby related datasets can be discovered and utilized. The Library is a natural home for the management of standards. Librarians have an established history of working with controlled vocabularies and metadata, and libraries serve as centers for information discovery. Eventlogger is currently being tested across the UNOLS fleet. A large submission of suggested instrument terms to the SeaDataNet community listserv is in progress. References: Maffei, Andrew R., Cynthia L. Chandler, Janet Fredericks, Nan Galbraith, Laura Stolp. Rolling Deck to Repository (R2R): A Controlled Vocabulary and Ontology Development Effort for Oceanographic Research Cruise Event Logging. EGU2011-12341. Poster presented at the 2011 EGU Meeting.

  9. Determination of In-situ Porosity and Investigation of Diffusion Processes at the Grimsel Test Site, Switzerland.

    NASA Astrophysics Data System (ADS)

    Biggin, C.; Ota, K.; Siittari-Kauppi, M.; Moeri, A.

    2004-12-01

    In the context of a repository for radioactive waste, 'matrix diffusion' is used to describe the process by which solute, flowing in distinct flow paths, penetrates the surrounding rock matrix. Diffusion into the matrix occurs in a connected system of pores or microfractures. Matrix diffusion provides a mechanism for greatly enlarging the area of rock surface in contact with advecting radionuclides, from that of the flow path surfaces (and infills) to a much larger portion of the bulk rock, and increases the global pore volume which can retard radionuclides. In terms of a repository safety assessment, demonstration of a significant depth of diffusion-accessible pore space may result in a significant delay in the calculated release of any escaping radionuclides to the environment and a dramatic reduction in the resulting concentration released into the biosphere. For the last decade, Nagra has investigated in situ matrix diffusion at the Grimsel Test Site (GTS) in the Swiss Alps. The in situ investigations offer two distinct advantages over those performed in the lab, namely: 1. Lab-based determination of porosity and diffusivity can lead to an overestimation of matrix diffusion due to stress relief when the rock is sampled (which would overestimate the retardation in the geosphere); 2. Lab-based analysis usually examines small (cm-scale) samples and therefore cannot account for any matrix heterogeneity over the hundreds or thousands of metres of a typical flow path. The in situ investigations described began with the Connected Porosity project, wherein a specially developed acrylic resin was injected into the rock matrix to fill the pore space and determine the depth of connected porosity. The resin was polymerised in situ and the entire rock mass removed by overcoring. The results indicated that lab-based porosity measurements may be two to three times higher than those obtained in situ. While the depth of accessible matrix from a water-conducting feature assumed in repository performance assessments is generally 1 to 10 cm, the results from the GTS in situ experiment suggested depths of several metres could be more appropriate. More recently, the Pore Space Geometry (PSG) experiment at the GTS has used a C-14 doped acrylic resin, combined with state-of-the-art digital beta autoradiography and fluorescence detection, to examine a larger area of rock for determination of porosity and the degree of connected pore space. Analysis is currently ongoing and the key findings will be reported in this paper. Starting at the GTS in 2005, the Long-term Diffusion (LTD) project will investigate such processes over spatial and temporal scales more relevant to a repository than traditional lab-based experiments. In the framework of this experiment, long-term (10 to 50 years) in situ diffusion experiments and resin injection experiments are planned to verify current models for matrix diffusion as a radionuclide retardation process. This paper will discuss the findings of the first two experiments and their significance to repository safety assessments before discussing the strategy for the future in relation to the LTD project.
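
    The contrast drawn above between centimetre-scale lab samples and metre-scale in situ penetration can be related to experiment duration through the standard diffusive penetration scaling; the numbers in the comment are illustrative, not values from the Grimsel experiments:

    ```latex
    % Characteristic matrix-diffusion penetration depth after time t for an
    % apparent diffusivity D_a (order-of-magnitude scaling). For example,
    % D_a = 10^{-12} m^2/s over t = 10 y (~3.2 x 10^8 s) gives x ~ 2 cm.
    x \;\sim\; \sqrt{D_a\,t}
    ```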

  10. [Access to health information sources in Spain. how to combat "infoxication"].

    PubMed

    Navas-Martin, Miguel Ángel; Albornos-Muñoz, Laura; Escandell-García, Cintia

    2012-01-01

    The Internet has become an invaluable source of health information for both patients and healthcare professionals. However, the universality and abundance of information can lead to unfounded conclusions about health issues that confuse rather than clarify. This causes intoxication by information: infoxication. The question lies in knowing how to filter the information that is useful, accurate and relevant for our purposes. In this regard, integrative portals, such as the Biblioteca Virtual de Salud, compile information at different levels (international, national and regional) and different types of resources (databases, repositories, bibliographic sources, etc.), becoming a starting point for obtaining quality information. Copyright © 2011 Elsevier España, S.L. All rights reserved.

  11. Long-term diffusion of U(VI) in bentonite: Dependence on density

    DOE PAGES

    Joseph, Claudia; Mibus, Jens; Trepte, Paul; ...

    2016-10-12

    As a contribution to the safety assessment of nuclear waste repositories, U(VI) diffusion through the potential buffer material MX-80 bentonite was investigated at three clay dry densities over six years. Synthetic MX-80 model pore water was used as background electrolyte. Speciation calculations showed that Ca₂UO₂(CO₃)₃(aq) was the main U(VI) species. The in- and out-diffusion of U(VI) was investigated separately. U(VI) diffused about 3 mm, 1.5 mm, and 1 mm into the clay plug at ρ = 1.3, 1.6, and 1.9 g/cm³, respectively. No through-diffusion of the U(VI) tracer was observed. However, leaching of natural uranium contained in the clay occurred and uranium was detected in all receiving reservoirs. As expected, the effective and apparent diffusion coefficients, De and Da, decreased with increasing dry density. The Da values for the out-diffusion of natural U(VI) were in good agreement with previously determined values. Surprisingly, Da values for the in-diffusion of U(VI) were about two orders of magnitude lower than values obtained in short-term in-diffusion experiments reported in the literature. Some potential reasons for this behavior that were evaluated are changes of the U(VI) speciation within the clay (precipitation, reduction) or changes of the clay porosity and pore connectivity with time. By applying Archie's law and the extended Archie's law, it was estimated that a significantly smaller effective porosity must be present for the long-term in-diffusion of U(VI). Finally, the results suggest that long-term studies of key transport phenomena may reveal additional processes that can directly impact long-term repository safety assessments.
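
    The abstract invokes Archie's law to back out an effective porosity from the measured diffusion coefficients. In the form commonly applied to diffusion in compacted clays (a sketch; the exponent actually used in the paper is not given in the abstract):

    ```latex
    % Archie's law for the effective diffusion coefficient: D_w is the
    % diffusivity in free water, \phi the porosity, and m an empirical
    % exponent (often ~1.5-2.5 in clays). Inverting yields the effective
    % porosity implied by a measured D_e.
    D_e \;=\; D_w\,\phi^{m}
    \qquad\Longrightarrow\qquad
    \phi_{\mathrm{eff}} \;=\; \left(\frac{D_e}{D_w}\right)^{1/m}
    ```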

  12. 17 CFR 49.19 - Core principles applicable to registered swap data repositories.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... registered swap data repositories. 49.19 Section 49.19 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION (CONTINUED) SWAP DATA REPOSITORIES § 49.19 Core principles applicable to registered swap data repositories. (a) Compliance with core principles. To be registered, and maintain...

  13. 17 CFR 49.26 - Disclosure requirements of swap data repositories.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... TRADING COMMISSION SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data repositories... swap data repository shall furnish to the reporting entity a disclosure document that contains the... 17 Commodity and Securities Exchanges 1 2013-04-01 2013-04-01 false Disclosure requirements of...

  14. 17 CFR 49.26 - Disclosure requirements of swap data repositories.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... TRADING COMMISSION SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data repositories... swap data repository shall furnish to the reporting entity a disclosure document that contains the... 17 Commodity and Securities Exchanges 1 2012-04-01 2012-04-01 false Disclosure requirements of...

  15. 21 CFR 522.480 - Repository corticotropin injection.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 6 2010-04-01 2010-04-01 false Repository corticotropin injection. 522.480 Section 522.480 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES... § 522.480 Repository corticotropin injection. (a)(1) Specifications. The drug conforms to repository...

  16. 10 CFR 960.3-1-3 - Regionality.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Implementation Guidelines § 960.3-1-3 Regionality. In making site recommendations for repository development after the site for the first repository has been recommended, the Secretary shall give due... repositories. Such consideration shall take into account the proximity of sites to locations at which waste is...

  17. Depleted uranium as a backfill for nuclear fuel waste package

    DOEpatents

    Forsberg, Charles W.

    1998-01-01

    A method for packaging spent nuclear fuel for long-term disposal in a geological repository. At least one spent nuclear fuel assembly is first placed in an unsealed waste package and a depleted uranium fill material is added to the waste package. The depleted uranium fill material comprises flowable particles having a size sufficient to substantially fill any voids in and around the assembly and contains isotopically depleted uranium in the +4 valence state in an amount sufficient to inhibit dissolution of the spent nuclear fuel from the assembly into a surrounding medium and to lessen the potential for nuclear criticality inside the repository in the event of failure of the waste package. Finally, the waste package is sealed, thereby substantially reducing the release of radionuclides into the surrounding medium, while simultaneously providing radiation shielding and increased structural integrity of the waste package.

  18. M4SF-17LL010301071: Thermodynamic Database Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zavarin, M.; Wolery, T. J.

    2017-09-05

    This progress report (Level 4 Milestone Number M4SF-17LL010301071) summarizes research conducted at Lawrence Livermore National Laboratory (LLNL) within the Argillite Disposal R&D Work Package Number M4SF-17LL01030107. The DR Argillite Disposal R&D control account is focused on the evaluation of important processes in the analysis of disposal design concepts and related materials for nuclear fuel disposal in clay-bearing repository media. The objectives of this work package are to develop model tools for evaluating impacts of THMC processes on long-term disposal of spent fuel in argillite rocks, and to establish the scientific basis for high thermal limits. This work contributes to the GDSA model activities to identify gaps, develop process models, and provide parameter feeds and support requirements, providing the capability for a robust repository performance assessment model by 2020.

  19. Evolutions in Metadata Quality

    NASA Astrophysics Data System (ADS)

    Gilman, J.

    2016-12-01

    Metadata Quality is one of the chief drivers of discovery and use of NASA EOSDIS (Earth Observing System Data and Information System) data. Issues with metadata such as lack of completeness, inconsistency, and use of legacy terms directly hinder data use. As the central metadata repository for NASA Earth Science data, the Common Metadata Repository (CMR) has a responsibility to its users to ensure the quality of CMR search results. This talk will cover how we encourage metadata authors to improve the metadata through the use of integrated rubrics of metadata quality and outreach efforts. In addition we'll demonstrate Humanizers, a technique for dealing with the symptoms of metadata issues. Humanizers allow CMR administrators to identify specific metadata issues that are fixed at runtime when the data is indexed. An example Humanizer is the aliasing of processing level "Level 1" to "1" to improve consistency across collections. The CMR currently indexes 35K collections and 300M granules.
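
    To make the Humanizer idea concrete, here is a minimal sketch of an alias rule applied at index time; the rule table, field names, and record layout are illustrative assumptions, not the CMR's actual implementation:

    ```python
    # Illustrative sketch of a "humanizer" pass applied when metadata is
    # indexed: each rule rewrites a known problem value so search results are
    # consistent. The rules and record layout here are hypothetical.
    HUMANIZERS = [
        # (field, original value, canonical value)
        ("processing_level", "Level 1", "1"),
        ("processing_level", "L1", "1"),
        ("platform", "AQUA", "Aqua"),
    ]

    def humanize(record):
        """Return a copy of the record with humanizer aliases applied."""
        fixed = dict(record)
        for field, original, canonical in HUMANIZERS:
            if fixed.get(field) == original:
                fixed[field] = canonical
        return fixed

    indexed = humanize({"short_name": "MOD021KM", "processing_level": "Level 1"})
    print(indexed["processing_level"])  # -> "1"
    ```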

  20. Probalistic Criticality Consequence Evaluation (SCPB:N/A)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Gottlieb; J.W. Davis; J.R. Massari

    1996-09-04

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development (WPD) department with the objective of providing a comprehensive, conservative estimate of the consequences of a criticality that could possibly occur as the result of commercial spent nuclear fuel emplaced in the underground repository at Yucca Mountain. The consequences of criticality are measured principally in terms of the resulting changes in radionuclide inventory as a function of the power level and duration of the criticality. The purpose of this analysis is to extend the prior estimates of increased radionuclide inventory (Refs. 5.52 and 5.54), for both internal and external criticality. This analysis, and similar estimates and refinements to be completed before the end of fiscal year 1997, will be provided as input to the Total System Performance Assessment-Viability Assessment (TSPA-VA) to demonstrate compliance with the repository performance objectives.
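
    The two governing parameters named above, power level and duration, enter the inventory estimate through the total number of fissions, as in this standard energy-balance sketch (the numbers are illustrative, not taken from the analysis):

    ```latex
    % Total fissions for a criticality of power P sustained for time t, with
    % roughly 200 MeV (3.2 x 10^{-11} J) of recoverable energy per fission.
    % E.g., P = 10 kW over t = 10^4 y (~3.2 x 10^{11} s) gives N_f ~ 10^{26}
    % fissions, whose fission-product yields set the added inventory.
    N_f \;=\; \frac{P\,t}{E_f},
    \qquad E_f \approx 200\ \mathrm{MeV} \approx 3.2\times 10^{-11}\ \mathrm{J}
    ```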
