Sample records for repository simulation experiments

  1. WFIRST: Data/Instrument Simulation Support at IPAC

    NASA Astrophysics Data System (ADS)

    Laine, Seppo; Akeson, Rachel; Armus, Lee; Bennett, Lee; Colbert, James; Helou, George; Kirkpatrick, J. Davy; Meshkat, Tiffany; Paladini, Roberta; Ramirez, Solange; Wang, Yun; Xie, Joan; Yan, Lin

    2018-01-01

As part of WFIRST Science Center preparations, the IPAC Science Operations Center (ISOC) maintains a repository of 1) WFIRST data and instrument simulations, 2) tools to facilitate scientific performance and feasibility studies with WFIRST, and 3) parameters summarizing the current design and predicted performance of the WFIRST telescope and instruments. The simulation repository provides the science community with access to simulation code, tools, and resulting analyses. Examples of simulation code with ISOC-built web-based interfaces include EXOSIMS (for estimating exoplanet yields in coronagraph instrument (CGI) surveys) and the Galaxy Survey Exposure Time Calculator. In the future the repository will provide an interface for users to run custom simulations of a wide range of CGI observations, as well as sophisticated tools for designing microlensing experiments. We encourage those who are generating simulations or writing tools for exoplanet observations with WFIRST to contact the ISOC team so we can work with you to bring these to the attention of the broader astronomical community as we prepare for the exciting science that will be enabled by WFIRST.

  2. Experimental and numerical simulation of dissolution and precipitation: implications for fracture sealing at Yucca Mountain, Nevada

    NASA Astrophysics Data System (ADS)

    Dobson, Patrick F.; Kneafsey, Timothy J.; Sonnenthal, Eric L.; Spycher, Nicolas; Apps, John A.

    2003-05-01

    Plugging of flow paths caused by mineral precipitation in fractures above the potential repository at Yucca Mountain, Nevada could reduce the probability of water seeping into the repository. As part of an ongoing effort to evaluate thermal-hydrological-chemical (THC) effects on flow in fractured media, we performed a laboratory experiment and numerical simulations to investigate mineral dissolution and precipitation under anticipated temperature and pressure conditions in the repository. To replicate mineral dissolution by vapor condensate in fractured tuff, water was flowed through crushed Yucca Mountain tuff at 94 °C. The resulting steady-state fluid composition had a total dissolved solids content of about 140 mg/l; silica was the dominant dissolved constituent. A portion of the steady-state mineralized water was flowed into a vertically oriented planar fracture in a block of welded Topopah Spring Tuff that was maintained at 80 °C at the top and 130 °C at the bottom. The fracture began to seal with amorphous silica within 5 days. A 1-D plug-flow numerical model was used to simulate mineral dissolution, and a similar model was developed to simulate the flow of mineralized water through a planar fracture, where boiling conditions led to mineral precipitation. Predicted concentrations of the major dissolved constituents for the tuff dissolution were within a factor of 2 of the measured average steady-state compositions. The mineral precipitation simulations predicted the precipitation of amorphous silica at the base of the boiling front, leading to a greater than 50-fold decrease in fracture permeability in 5 days, consistent with the laboratory experiment. These results help validate the use of a numerical model to simulate THC processes at Yucca Mountain. The experiment and simulations indicated that boiling and concomitant precipitation of amorphous silica could cause significant reductions in fracture porosity and permeability on a local scale. However, differences in fluid flow rates and thermal gradients between the experimental setup and anticipated conditions at Yucca Mountain need to be factored into scaling the results of the dissolution/precipitation experiments and associated simulations to THC models for the potential Yucca Mountain repository.
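    The 1-D plug-flow dissolution idea in this record can be illustrated with a minimal sketch: dissolved silica approaches the amorphous-silica equilibrium concentration along the flow path under first-order kinetics. This is not the authors' calibrated model; all parameter values below (inflow concentration, solubility, rate constant, velocity) are assumptions for illustration.

```python
import numpy as np

def plug_flow_silica(x, c_in, c_eq, k_rate, velocity):
    """Steady-state dissolved silica along a 1-D plug-flow column.

    Assumes first-order kinetics toward the amorphous-silica equilibrium
    concentration c_eq: dC/dx = (k/v) * (c_eq - C), whose solution is
    C(x) = c_eq - (c_eq - c_in) * exp(-k*x/v).
    """
    return c_eq - (c_eq - c_in) * np.exp(-k_rate * x / velocity)

# Hypothetical illustration only (not the paper's calibrated values):
x = np.linspace(0.0, 0.5, 6)            # m along the crushed-tuff column
c = plug_flow_silica(x, c_in=5.0,       # inflow SiO2, mg/L (assumed)
                     c_eq=120.0,        # amorphous-silica solubility at 94 degC, mg/L (assumed)
                     k_rate=2e-5,       # lumped rate constant, 1/s (assumed)
                     velocity=1e-6)     # pore-water velocity, m/s (assumed)
print(np.round(c, 1))
```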

  3. Combining computational models, semantic annotations and simulation experiments in a graph database

    PubMed Central

    Henkel, Ron; Wolkenhauer, Olaf; Waltemath, Dagmar

    2015-01-01

Model repositories such as the BioModels Database, the CellML Model Repository or JWS Online are frequently accessed to retrieve computational models of biological systems. However, their storage concepts support only restricted types of queries, and not all data inside the repositories can be retrieved. In this article we present a storage concept that meets this challenge. It is built on a graph database, reflects the models' structure, incorporates semantic annotations and simulation descriptions, and ultimately connects different types of model-related data. The connections between heterogeneous model-related data and bio-ontologies enable efficient search via biological facts and grant access to new model features. The introduced concept notably improves access to computational models and associated simulations in a model repository. This has positive effects on tasks such as model search, retrieval, ranking, matching and filtering. Furthermore, our work for the first time enables CellML- and Systems Biology Markup Language-encoded models to be effectively maintained in one database. We show how these models can be linked via annotations and queried. Database URL: https://sems.uni-rostock.de/projects/masymos/ PMID:25754863
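    The kind of annotation-driven search such a graph store enables can be sketched with the Neo4j Python driver. The node labels, relationship type, and connection details below are illustrative assumptions, not the actual MASYMOS schema.

```python
from neo4j import GraphDatabase  # pip install neo4j

# Hypothetical connection details and graph schema; the labels (:Model),
# (:Annotation) and relationship [:ANNOTATED_WITH] are illustrative only.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

QUERY = """
MATCH (m:Model)-[:ANNOTATED_WITH]->(a:Annotation)
WHERE a.term CONTAINS $term
RETURN m.name AS model, collect(a.term) AS annotations
"""

with driver.session() as session:
    # Retrieve models whose semantic annotations mention a biological term
    for record in session.run(QUERY, term="glycolysis"):
        print(record["model"], record["annotations"])

driver.close()
```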

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leigh, Christi D.; Hansen, Francis D.

This report summarizes the state of salt repository science, reviews many of the technical issues pertaining to disposal of heat-generating nuclear waste in salt, and proposes several avenues for future science-based activities to further the technical basis for disposal in salt. There are extensive salt formations in the forty-eight contiguous states, and many of them may be worthy of consideration for nuclear waste disposal. The United States has extensive experience in salt repository sciences, including an operating facility for disposal of transuranic wastes. The scientific background for salt disposal including laboratory and field tests at ambient and elevated temperature, principles of salt behavior, potential for fracture damage and its mitigation, seal systems, chemical conditions, advanced modeling capabilities and near-future developments, performance assessment processes, and international collaboration are all discussed. The discussion of salt disposal issues is brought up to date, including a summary of recent international workshops dedicated to high-level waste disposal in salt. Lessons learned from Sandia National Laboratories' experience on the Waste Isolation Pilot Plant and the Yucca Mountain Project as well as related salt experience with the Strategic Petroleum Reserve are applied in this assessment. Disposal of heat-generating nuclear waste in a suitable salt formation is attractive because the material is essentially impermeable, self-sealing, and thermally conductive. Conditions are chemically beneficial, and a significant experience base exists in understanding this environment. Within the period of institutional control, overburden pressure will seal fractures and provide a repository setting that limits radionuclide movement. A salt repository could potentially achieve total containment, with no releases to the environment in undisturbed scenarios for as long as the region is geologically stable. Much of the experience gained from United States repository development, such as seal system design, coupled process simulation, and application of performance assessment methodology, helps define a clear strategy for a heat-generating nuclear waste repository in salt.

  5. Doing Your Science While You're in Orbit

    NASA Astrophysics Data System (ADS)

    Green, Mark L.; Miller, Stephen D.; Vazhkudai, Sudharshan S.; Trater, James R.

    2010-11-01

Large-scale neutron facilities such as the Spallation Neutron Source (SNS) located at Oak Ridge National Laboratory need easy-to-use access to Department of Energy Leadership Computing Facilities and experiment repository data. The Orbiter thick- and thin-client and its supporting Service Oriented Architecture (SOA) based services (available at https://orbiter.sns.gov) consist of standards-based components that are reusable and extensible for accessing high performance computing, data and computational grid infrastructure, and cluster-based resources easily from a user configurable interface. The primary Orbiter system goals consist of (1) developing infrastructure for the creation and automation of virtual instrumentation experiment optimization, (2) developing user interfaces for thin- and thick-client access, (3) providing a prototype incorporating major instrument simulation packages, and (4) facilitating neutron science community access and collaboration. The secure Orbiter SOA authentication and authorization is achieved through the developed Virtual File System (VFS) services, which use Role-Based Access Control (RBAC) for data repository file access, thin- and thick-client functionality and application access, and computational job workflow management. The VFS Relational Database Management System (RDMS) consists of approximately 45 database tables describing 498 user accounts with 495 groups over 432,000 directories with 904,077 repository files. Over 59 million NeXus file metadata records are associated with the 12,800 unique NeXus file field/class names generated from the 52,824 repository NeXus files. Services that enable (a) summary dashboards of data repository status with Quality of Service (QoS) metrics, (b) data repository NeXus file field/class name full-text search capabilities within a Google-like interface, (c) a fully functional RBAC browser for the read-only data repository and shared areas, (d) user/group defined and shared metadata for data repository files, and (e) user, group, repository, and web 2.0 based global positioning with additional service capabilities are currently available. The SNS-based Orbiter SOA integration progress with the Distributed Data Analysis for Neutron Scattering Experiments (DANSE) software development project is summarized with an emphasis on DANSE Central Services and the Virtual Neutron Facility (VNF). Additionally, the DANSE utilization of the Orbiter SOA authentication, authorization, and data transfer services best practice implementations is presented.
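    The NeXus field/class metadata mentioned above can be harvested from the files themselves, since NeXus files are HDF5 containers. Below is a hedged sketch using h5py; the file name is hypothetical and this is not the Orbiter VFS indexing code, just an illustration of where such metadata comes from.

```python
import h5py  # NeXus files are HDF5 containers

def list_nexus_fields(path):
    """Walk a NeXus/HDF5 file and collect group class names and field paths.

    NeXus groups carry an 'NX_class' attribute (e.g. NXentry, NXdata);
    datasets are the 'fields'. The file name used below is hypothetical.
    """
    classes, fields = [], []

    def visitor(name, obj):
        if isinstance(obj, h5py.Group):
            nx_class = obj.attrs.get("NX_class")
            if nx_class is not None:
                classes.append((name, nx_class))
        elif isinstance(obj, h5py.Dataset):
            fields.append(name)

    with h5py.File(path, "r") as f:
        f.visititems(visitor)
    return classes, fields

classes, fields = list_nexus_fields("SNS_example_run.nxs")
print(len(classes), "NX groups;", len(fields), "fields")
```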

  6. A laboratory validation study of the time-lapse oscillatory pumping test for leakage detection in geological repositories

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Lu, Jiemin; Islam, Akand

    2017-05-01

Geologic repositories are extensively used for disposing of byproducts of the mineral and energy industries. The safety and reliability of these repositories are a primary concern to environmental regulators and the public. The time-lapse oscillatory pumping test (OPT) has been introduced recently as a pressure-based technique for detecting potential leakage in geologic repositories. By routinely conducting OPT at a number of pulsing frequencies, an operator may identify potential repository anomalies in the frequency domain, alleviating the ambiguity caused by reservoir noise and improving the signal-to-noise ratio. Building on previous theoretical and field studies, this work performed a series of laboratory experiments to validate the concept of time-lapse OPT using a custom-made, stainless steel tank under relatively high pressures. The experimental configuration simulates a miniature geologic storage repository consisting of three layers (i.e., injection zone, caprock, and above-zone aquifer). Results show that leakage in the injection zone led to deviations in the power spectrum of observed pressure data, the amplitude of which increases with decreasing pulsing frequencies. The experimental results are further analyzed by developing a 3D flow model, with which the model parameters are estimated through frequency domain inversion.
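    The frequency-domain comparison at the heart of time-lapse OPT can be sketched with a periodogram of baseline versus monitoring pressure records evaluated at the pulsing frequency. The synthetic signals, amplitudes, and frequencies below are illustrative assumptions, not the study's processing chain.

```python
import numpy as np
from scipy.signal import periodogram

# Synthetic above-zone pressure records (values are illustrative only):
fs = 1.0            # sampling rate, Hz (assumed)
f_pulse = 0.02      # oscillatory pumping frequency, Hz (assumed)
t = np.arange(0, 4096) / fs
noise = lambda: 0.05 * np.random.randn(t.size)

baseline = 1.0 * np.sin(2 * np.pi * f_pulse * t) + noise()
# A leak alters the pressure response amplitude at the pulsing frequency:
monitoring = 0.7 * np.sin(2 * np.pi * f_pulse * t) + noise()

f, p_base = periodogram(baseline, fs=fs)
_, p_mon = periodogram(monitoring, fs=fs)

# Compare spectral power at the pulsing frequency between the two surveys
idx = np.argmin(np.abs(f - f_pulse))
print("spectral power ratio at f_pulse:", p_mon[idx] / p_base[idx])
```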

  7. Damage-plasticity model of the host rock in a nuclear waste repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koudelka, Tomáš; Kruis, Jaroslav, E-mail: kruis@fsv.cvut.cz

The paper describes a damage-plasticity model for modelling the host rock environment of a nuclear waste repository. The Radioactive Waste Repository Authority in the Czech Republic assumes the repository will be sited in a granite rock mass which exhibits anisotropic behaviour, with lower strength in tension than in compression. In order to describe this phenomenon, the damage-plasticity model is formulated with the help of the Drucker-Prager yield criterion, which can be set to capture the compressive behaviour, while the tensile stress states are described with a scalar isotropic damage model. The concept of the damage-plasticity model was implemented in the SIFEL finite element code and, consequently, the code was used for the simulation of the Äspö Pillar Stability Experiment (APSE), which was performed in order to determine yielding strength under various conditions in granite rocks similar to those in the Czech Republic. The results from the performed analysis are presented and discussed in the paper.
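    For orientation, the two ingredients named in the abstract can be written out in a minimal sketch: the Drucker-Prager yield function and the scalar isotropic damage degradation of stress. The parameter values are assumed for illustration and are not the APSE-calibrated values used in SIFEL.

```python
import numpy as np

def drucker_prager_yield(sigma, alpha, k):
    """Drucker-Prager yield function f = alpha*I1 + sqrt(J2) - k.

    sigma is the 3x3 Cauchy stress tensor; f < 0 means the state is elastic.
    alpha and k can be fitted to the compressive strength of the granite.
    """
    i1 = np.trace(sigma)
    s = sigma - i1 / 3.0 * np.eye(3)   # deviatoric stress
    j2 = 0.5 * np.tensordot(s, s)      # second deviatoric invariant
    return alpha * i1 + np.sqrt(j2) - k

def damaged_stress(sigma_eff, d):
    """Scalar isotropic damage: nominal stress = (1 - d) * effective stress."""
    return (1.0 - d) * sigma_eff

# Illustrative check with assumed parameters (not values from the paper):
sigma = np.diag([-40.0e6, -10.0e6, -5.0e6])   # Pa, compression negative
print(drucker_prager_yield(sigma, alpha=0.25, k=30.0e6))
print(damaged_stress(sigma, d=0.3))
```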

  8. Physico-chemical interactions at the concrete-bitumen interface of nuclear waste repositories

    NASA Astrophysics Data System (ADS)

    Bertron, A.; Ranaivomanana, H.; Jacquemet, N.; Erable, B.; Sablayrolles, C.; Escadeillas, G.; Albrecht, A.

    2013-07-01

This study investigates the fate of nitrate and organic acids at the bitumen-concrete-steel interface within a repository storage cell for long-lived, intermediate-level radioactive wastes. The interface was simulated by a multiphase system in which cementitious matrices (CEM V-paste specimens) were exposed to bitumen model leachates consisting of nitrates and acetic acid, with and without oxalic acid, chemical compounds likely to be released by bitumen. Leaching experiments were conducted with daily renewal of the solutions in order to accelerate reactions. C-steel chips, simulating the presence of steel in the repository, were added to the systems for some experiments. The concentrations of anions (acetate, oxalate, nitrate, and nitrite) and cations (calcium, potassium, ammonium) and the pH were monitored over time. Mineralogical changes of the cementitious matrices were analysed by XRD. The results confirmed the stability of nitrates in the absence of steel, whereas reduction of nitrates was observed in the presence of steel (production of NH4+). The action of acetic acid on the cementitious matrix was similar to that of ordinary leaching; no specific interaction was detected between acetate and cementitious cations. The reaction of oxalic acid with the cementitious phases led to the precipitation of calcium oxalate salts in the outer layer of the matrix. The concentration of oxalate was reduced by 65% in the leaching medium.

  9. Development of anomaly detection models for deep subsurface monitoring

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.

    2017-12-01

Deep subsurface repositories are used for waste disposal and carbon sequestration. Monitoring deep subsurface repositories for potential anomalies is challenging, not only because the number of sensor networks and the quality of data are often limited, but also because of the lack of labeled data needed to train and validate machine learning (ML) algorithms. Although physical simulation models may be applied to predict anomalies (or, for that matter, the system's nominal state), the accuracy of such predictions may be limited by inherent conceptual and parameter uncertainties. The main objective of this study was to demonstrate the potential of data-driven models for leakage detection in carbon sequestration repositories. Monitoring data collected during an artificial CO2 release test at a carbon sequestration repository were used, which include both scalar time series (pressure) and vector time series (distributed temperature sensing). For each type of data, separate online anomaly detection algorithms were developed using the baseline experiment data (no leak) and then tested on the leak experiment data. The performance of a number of different online algorithms was compared. Results show the importance of including contextual information in the dataset to mitigate the impact of reservoir noise and reduce the false positive rate. The developed algorithms were integrated into a generic Web-based platform for real-time anomaly detection.
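    As a minimal illustration of what an online detector for a scalar pressure stream might look like, here is a rolling z-score sketch. It is a generic example under assumed window and threshold values, not one of the algorithms benchmarked in the study.

```python
import numpy as np
from collections import deque

class RollingZScoreDetector:
    """Minimal online anomaly detector: flag a sample whose z-score relative
    to a trailing window exceeds a threshold. Generic sketch only."""

    def __init__(self, window=200, threshold=4.0):
        self.buf = deque(maxlen=window)
        self.threshold = threshold

    def update(self, x):
        is_anomaly = False
        if len(self.buf) == self.buf.maxlen:
            mu = np.mean(self.buf)
            sigma = np.std(self.buf) + 1e-12
            is_anomaly = abs(x - mu) / sigma > self.threshold
        self.buf.append(x)
        return is_anomaly

# Stream synthetic pressure data with a step change (leak-like anomaly):
detector = RollingZScoreDetector()
data = np.concatenate([np.random.normal(10.0, 0.1, 1000),
                       np.random.normal(10.8, 0.1, 200)])
alarms = [i for i, x in enumerate(data) if detector.update(x)]
print("first alarm at sample:", alarms[0] if alarms else None)
```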

10. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    NASA Astrophysics Data System (ADS)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCI) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services behind a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Given this effort, it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability (CGI) concept. It enables recycling and sharing of workflows from different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus the executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely pre-deployed workflow engines or submits workflow engines with the workflow to local or remote resources to execute workflows. The SHIWA Proxy Server manages the certificates needed to execute the workflows on different DCIs. Currently SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within the project (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond (Hydrometeorology and Seismology) to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching and selecting workflows in the repository, executing non-native workflows, and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios and how the Hydrometeorology research community runs simulations on SSP.

  11. Coupled Multi-physical Simulations for the Assessment of Nuclear Waste Repository Concepts: Modeling, Software Development and Simulation

    NASA Astrophysics Data System (ADS)

    Massmann, J.; Nagel, T.; Bilke, L.; Böttcher, N.; Heusermann, S.; Fischer, T.; Kumar, V.; Schäfers, A.; Shao, H.; Vogel, P.; Wang, W.; Watanabe, N.; Ziefle, G.; Kolditz, O.

    2016-12-01

As part of the German site selection process for a high-level nuclear waste repository, different repository concepts in the candidate geological formations rock salt, claystone and crystalline rock are being discussed. An open assessment of these concepts using numerical simulations requires physical models capturing the individual particularities of each rock type and associated geotechnical barrier concept to a comparable level of sophistication. In a joint work group of the Helmholtz Centre for Environmental Research (UFZ) and the German Federal Institute for Geosciences and Natural Resources (BGR), scientists of the UFZ are developing and implementing multiphysical process models while BGR scientists apply them to large-scale analyses. The advances in simulation methods for waste repositories are incorporated into the open-source code OpenGeoSys. Here, recent application-driven progress in this context is highlighted. A robust implementation of visco-plasticity with temperature-dependent properties into a framework for the thermo-mechanical analysis of rock salt will be shown. The model enables the simulation of heat transport along with its consequences on the elastic response as well as on primary and secondary creep or the occurrence of dilatancy in the repository near field. Transverse isotropy, non-isothermal hydraulic processes and their coupling to mechanical stresses are taken into account for the analysis of repositories in claystone. These processes are also considered in the near-field analyses of engineered barrier systems, including the swelling/shrinkage of the bentonite material. The temperature-dependent saturation evolution around the heat-emitting waste container is described by different multiphase flow formulations. For all mentioned applications, we illustrate the workflow from model development and implementation, through verification and validation, to repository-scale application simulations using methods of high performance computing.

  12. Uranium (VI) solubility in carbonate-free ERDA-6 brine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucchini, Jean-francois; Khaing, Hnin; Reed, Donald T

    2010-01-01

When present, uranium is usually an element of importance in a nuclear waste repository. In the Waste Isolation Pilot Plant (WIPP), uranium is the most prevalent actinide component by mass, with about 647 metric tons to be placed in the repository. Therefore, the chemistry of uranium, and especially its solubility under WIPP conditions, needs to be well determined. Long-term experiments were performed to measure the solubility of uranium (VI) in carbonate-free ERDA-6 brine, a simulated WIPP brine, at pC_H+ values between 8 and 12.5. These data, obtained from the over-saturation approach, were the first repository-relevant data for the VI actinide oxidation state. The solubility trends observed pointed towards low uranium solubility in WIPP brines and a lack of amphotericity. At the expected pC_H+ in the WIPP (~9.5), the measured uranium solubility approached 10^-7 M. The objective of these experiments was to establish a baseline solubility to further investigate the effects of carbonate complexation on uranium solubility in WIPP brines.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choung, Sungwook; Um, Wooyong; Pacific Northwest National Laboratory

Permanent disposal of low- and intermediate-level radioactive wastes in the subterranean environment has been the preferred method of many countries, including Korea. A safety issue after the closure of a geological repository is that biodegradation of organic materials due to microbial activities generates gases that lead to overpressure of the waste containers in the repository and its disintegration with the release of radionuclides. As part of an ongoing large-scale in situ experiment using organic wastes and groundwater to simulate geological radioactive waste repository conditions, we investigated the geochemical alteration and microbial activities at an early stage (~63 days) intended to be representative of the initial period after repository closure. The increased numbers of both aerobes and facultative anaerobes in waste effluents indicate that oxygen content could be the most significant parameter controlling biogeochemical conditions at very early periods of reaction (<35 days). Accordingly, the values of dissolved oxygen and redox potential decreased. The activation of anaerobes after 35 days was supported by the increase in ethanol concentration to ~50 mg L-1. These results suggest that the biogeochemical conditions were rapidly altered to more reducing and anaerobic conditions within the initial 2 months after repository closure. Although no gases were detected during the study, activated anaerobic microbes will play a more important role in gas generation over the long term.

  14. Software aspects of the Geant4 validation repository

    NASA Astrophysics Data System (ADS)

    Dotti, Andrea; Wenzel, Hans; Elvira, Daniel; Genser, Krzysztof; Yarba, Julia; Carminati, Federico; Folger, Gunter; Konstantinov, Dmitri; Pokorski, Witold; Ribon, Alberto

    2017-10-01

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
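    Programmatic access of the kind described (JSON/XML exchange formats over a web service) typically looks like the sketch below. The base URL, path, and query parameters are hypothetical placeholders, not the documented DoSSiER API; consult the DoSSiER web application for the actual REST interface and record schema.

```python
import requests

# Hypothetical endpoint and parameters for illustration only.
BASE_URL = "https://example.org/dossier/api"

resp = requests.get(f"{BASE_URL}/records",
                    params={"experiment": "HARP", "format": "json"},
                    timeout=30)
resp.raise_for_status()

# Iterate over returned validation records (field names are assumed):
for record in resp.json():
    print(record.get("title"), record.get("reaction"))
```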

  15. Software Aspects of the Geant4 Validation Repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dotti, Andrea; Wenzel, Hans; Elvira, Daniel

    2016-01-01

The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  16. Thermo-hydrological and chemical (THC) modeling to support Field Test Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stauffer, Philip H.; Jordan, Amy B.; Harp, Dylan Robert

This report summarizes ongoing efforts to simulate coupled thermal-hydrological-chemical (THC) processes occurring within a hypothetical high-level waste (HLW) repository in bedded salt. The report includes work completed since the last project deliverable, “Coupled model for heat and water transport in a high level waste repository in salt”, a Level 2 milestone submitted to DOE in September 2013 (Stauffer et al., 2013). Since the last deliverable, there have been code updates to improve the integration of the salt module with the pre-existing code and development of quality assurance (QA) tests of constitutive functions and precipitation/dissolution reactions. Simulations of bench-scale experiments, both historical and currently in the planning stages, have been performed. Additional simulations have also been performed on the drift-scale model that incorporate new processes, such as an evaporation function to estimate water vapor removal from the crushed salt backfill and isotopic fractionation of water isotopes. Finally, a draft of a journal paper on the importance of clay dehydration on water availability is included as Appendix I.

  17. Simulated effects of increased recharge on the ground-water flow system of Yucca Mountain and vicinity, Nevada-California

    USGS Publications Warehouse

    Czarnecki, J.B.

    1984-01-01

A study was performed to assess the potential effects of changes in future climatic conditions on the groundwater system in the vicinity of Yucca Mountain, the site of a potential mined geologic repository for high-level nuclear wastes. These changes probably would result in greater rates of precipitation and, consequently, greater rates of recharge. The study was performed by simulating the groundwater system, using a two-dimensional, finite-element, groundwater flow model. The simulated position of the water table rose as much as 130 meters near the U.S. Department of Energy's preferred repository area at Yucca Mountain for a simulation involving a 100-percent increase in precipitation compared to modern-day conditions. Despite the water table rise, no flooding of the potential repository would occur at its current proposed location. According to the simulation, springs would discharge south and west of Timber Mountain, along Fortymile Canyon, in the Amargosa Desert near Lathrop Wells and Franklin Lake playa, and near Furnace Creek Ranch in Death Valley, where they presently discharge. Simulated directions of groundwater flow paths near the potential repository area generally would be the same for the baseline (modern-day climate) and the increased-recharge simulations, but the magnitude of flow would increase by 2 to 4 times that of the baseline-simulation flow. (USGS)

  18. 10 CFR 60.44 - Changes, tests, and experiments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORIES Licenses License Issuance and Amendment § 60.44 Changes, tests, and experiments. (a)(1) Following authorization to receive and possess source, special nuclear, or byproduct material at a geologic repository operations area, the DOE may (i) make changes in the geologic repository operations area as described in the...

  19. Thermal Analysis of a Nuclear Waste Repository in Argillite Host Rock

    NASA Astrophysics Data System (ADS)

    Hadgu, T.; Gomez, S. P.; Matteo, E. N.

    2017-12-01

Disposal of high-level nuclear waste in a geological repository requires analysis of the heat distribution that results from decay heat. Such an analysis supports design of the repository layout to define the repository footprint, as well as providing information of importance to the overall design. The analysis is also used in the study of potential migration of radionuclides to the accessible environment. In this study, a thermal analysis for high-level waste and spent nuclear fuel in a generic repository in argillite host rock is presented. The thermal analysis utilized both semi-analytical and numerical modeling in the near field of a repository. The semi-analytical method treats heat transport by conduction in the repository and surroundings. The results of this method are temperature histories at selected radial distances from the waste package. A 3-D thermal-hydrologic numerical model was also developed to study fluid and heat distribution in the near field. The thermal analysis assumed a generic geological repository at 500 m depth. For the semi-analytical method, a backfilled, closed repository was assumed with basic design and material properties. For the thermal-hydrologic numerical method, a repository layout with disposal in horizontal boreholes was assumed. The 3-D modeling domain covers a limited portion of the repository footprint to enable a detailed thermal analysis. A highly refined unstructured mesh was used with increased discretization near heat sources and at intersections of different materials. All simulations considered different parameter values for properties of components of the engineered barrier system (i.e. buffer, disturbed rock zone and the host rock), and different surface storage times. Results of the different modeling cases are presented and include temperature and fluid flow profiles in the near field at different simulation times. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND2017-8295 A.
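    A common semi-analytical building block for temperature histories at radial distances from a heat-generating package is the conduction solution for a continuous line source. The sketch below uses the constant-power case only (a decaying source would be handled by superposition in time); the power, conductivity, and diffusivity values are assumptions, not the study's inputs.

```python
import numpy as np
from scipy.special import exp1

def line_source_dT(r, t, q_line, k_th, alpha):
    """Temperature rise around an infinite line heat source of constant
    strength q_line (W/m) in an infinite conductive medium:

        dT(r, t) = q_line / (4*pi*k_th) * E1(r^2 / (4*alpha*t))

    k_th: thermal conductivity (W/m/K); alpha: thermal diffusivity (m^2/s).
    """
    return q_line / (4.0 * np.pi * k_th) * exp1(r**2 / (4.0 * alpha * t))

years = np.array([1, 10, 100, 1000]) * 3.155e7         # seconds
for r in (1.0, 5.0, 10.0):                              # m from the package axis
    dT = line_source_dT(r, years, q_line=150.0,         # W/m (assumed)
                        k_th=1.9, alpha=8e-7)           # argillite-like values (assumed)
    print(f"r = {r:4.1f} m:", np.round(dT, 1), "K")
```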

  20. 3D numerical modelling of the thermal state of deep geological nuclear waste repositories

    NASA Astrophysics Data System (ADS)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, Yu. N.

    2017-09-01

One of the important aspects of high-level radioactive waste (HLW) disposal in deep geological repositories is ensuring the integrity of the engineered barriers, which is, among other phenomena, considerably influenced by thermal loads. As the HLW produce a significant amount of heat, the design of the repository should maintain the balance between the cost-effectiveness of the construction and the sufficiency of the safety margins, including those imposed on the thermal conditions of the barriers. The 3D finite-element computer code FENIA was developed as a tool for the simulation of thermal processes in deep geological repositories. Models for mechanical phenomena and groundwater hydraulics will be added later, resulting in a fully coupled thermo-hydro-mechanical (THM) solution. Long-term simulations of the thermal state were performed for two possible layouts of the repository. One was based on the proposed Russian repository project, and the other accommodates a larger amount of HLW within the same space. The obtained results describe the spatial and temporal evolution of the temperature field inside the repository and in the surrounding rock over 3500 years. These results show that practically all generated heat was ultimately absorbed by the host rock without any significant temperature increase. Still, over shorter time spans, even for the smaller amount of HLW the temperature maximum exceeds 100 °C, and for the larger amount the local temperature remains above 100 °C for a considerable time. Thus, the substantiation of the long-term stability of the repository would require an extensive study of the material properties and behaviour in order to remove excessive conservatism from the simulations and to reduce the uncertainty of the input data.

1. Evaluation of methods for measuring relative permeability of anhydrite from the Salado Formation: Sensitivity analysis and data reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christiansen, R.L.; Kalbus, J.S.; Howarth, S.M.

This report documents, demonstrates, evaluates, and provides theoretical justification for methods used to convert experimental data into relative permeability relationships. The report facilitates accurate determination of relative permeabilities of anhydrite rock samples from the Salado Formation at the Waste Isolation Pilot Plant (WIPP). Relative permeability characteristic curves are necessary for WIPP Performance Assessment (PA) predictions of the potential for flow of waste-generated gas from the repository and brine flow into the repository. This report follows Christiansen and Howarth (1995), a comprehensive literature review of methods for measuring relative permeability. It focuses on unsteady-state experiments and describes five methods for obtaining relative permeability relationships from unsteady-state experiments. Unsteady-state experimental methods were recommended for relative permeability measurements of low-permeability anhydrite rock samples from the Salado Formation because these tests produce accurate relative permeability information and take significantly less time to complete than steady-state tests. The five methods described are the Welge method, the Johnson-Bossler-Naumann method, the Jones-Roszelle method, the Ramakrishnan-Cappiello method, and the Hagoort method. A summary, an example of the calculations, and a theoretical justification are provided for each of the five methods. Displacements in porous media were numerically simulated for the calculation examples. The simulated production data were processed using the methods, and the relative permeabilities obtained were compared with those input to the numerical model. A variety of operating conditions were simulated to show the sensitivity of production behavior to rock-fluid properties.
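    To make the data-reduction idea concrete, here is a hedged Welge-type sketch: outlet fractional flow from the production curve, the Welge correction for outlet saturation, and the relative-permeability ratio implied by the viscous-dominated fractional-flow equation. It is a generic illustration with made-up numbers, not the report's full JBN/Jones-Roszelle workflow.

```python
import numpy as np

def welge_outlet(Wi, Np, Swi, mu_o, mu_w):
    """Welge-type processing of unsteady-state displacement data.

    Wi  : cumulative water injected (pore volumes)
    Np  : cumulative oil produced (pore volumes)
    Swi : initial water saturation
    Returns outlet water saturation, outlet oil fractional flow, and the
    kro/krw ratio from fw = 1 / (1 + kro*mu_w / (krw*mu_o)) with capillary
    and gravity terms neglected.
    """
    fo = np.gradient(Np, Wi)                 # outlet oil fractional flow
    fw = 1.0 - fo
    Sw_avg = Swi + Np                        # average water saturation
    Sw_out = Sw_avg - Wi * fo                # Welge correction to outlet value
    kro_over_krw = (mu_o / mu_w) * np.divide(
        fo, fw, out=np.full_like(fo, np.nan), where=fw > 0)
    return Sw_out, fo, kro_over_krw

# Tiny synthetic data set (illustrative numbers only):
Wi = np.array([0.2, 0.5, 1.0, 2.0, 5.0])
Np = np.array([0.18, 0.30, 0.36, 0.40, 0.43])
print(welge_outlet(Wi, Np, Swi=0.2, mu_o=2.0e-3, mu_w=1.0e-3))
```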

  2. International Collaboration Activities on Engineered Barrier Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jove-Colon, Carlos F.

The Used Fuel Disposition Campaign (UFDC) within the DOE Fuel Cycle Technologies (FCT) program has been engaging in international collaborations between repository R&D programs for high-level waste (HLW) disposal to leverage gathered knowledge and laboratory/field data on near- and far-field processes from experiments at underground research laboratories (URLs). Heater test experiments at URLs provide a unique opportunity to study the thermal effects of heat-generating nuclear waste in subsurface repository environments. Various configurations of these experiments have been carried out at different URLs according to the disposal design concepts of the hosting country's repository program. The FEBEX (Full-scale Engineered Barrier Experiment in Crystalline Host Rock) project is a large-scale heater test experiment originated by the Spanish radioactive waste management agency (Empresa Nacional de Residuos Radiactivos S.A. – ENRESA) at the Grimsel Test Site (GTS) URL in Switzerland. The project was subsequently managed by CIEMAT. FEBEX-DP is a concerted effort of various international partners working on the evaluation of sensor data and characterization of samples obtained during the course of this field test and its subsequent dismantling. The main purpose of these field-scale experiments is to evaluate the feasibility of creating an engineered barrier system (EBS) with a horizontal configuration according to the Spanish concept of deep geological disposal of high-level radioactive waste in crystalline rock. Another key aspect of this project is to improve the knowledge of coupled processes such as thermal-hydro-mechanical (THM) and thermal-hydro-chemical (THC) processes operating in the near-field environment. The focus of these efforts is on model development and validation of predictions through model implementation in computational tools to simulate coupled THM and THC processes.

  3. Results of instrument reliability study for high-level nuclear-waste repositories. [Geotechnical parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogue, F.; Binnall, E.P.

    1982-10-01

Reliable instrumentation will be needed to monitor the performance of future high-level waste repository sites. A study has been made to assess instrument reliability at Department of Energy (DOE) waste repository-related experiments. Though the study covers a wide variety of instrumentation, this paper concentrates on experiences with geotechnical instrumentation in hostile repository-type environments. Manufacturers have made some changes to improve the reliability of instruments for repositories. This paper reviews the failure modes, rates, and mechanisms, along with manufacturer modifications and recommendations for additional improvements to enhance instrument performance. 4 tables.

  4. Coupling Effects of Heat and Moisture on the Saturation Processes of Buffer Material in a Deep Geological Repository

    NASA Astrophysics Data System (ADS)

    Huang, Wei-Hsing

    2017-04-01

A clay barrier plays a major role in the isolation of radioactive wastes in an underground repository. This paper investigates the resaturation behavior of the clay barrier, with emphasis on the coupling effects of heat and moisture in the buffer material in the near field of a repository during groundwater intrusion. A locally available clay named "Zhisin clay" and a standard bentonite material were adopted in the laboratory program. Water uptake tests were conducted on clay specimens compacted at various densities to simulate the intrusion of groundwater into the buffer material. Soil suction of the clay specimens was measured by psychrometers embedded in the specimens and by the vapor equilibrium technique conducted at varying temperatures. Using the soil water characteristic curve, an integration scheme was introduced to estimate the hydraulic conductivity of the unsaturated clay. The finite element program ABAQUS was then employed to carry out numerical simulation of the saturation process in the near field of a repository. Results of the numerical simulation were validated using the degree-of-saturation profiles obtained from the water uptake tests on Zhisin clay. The numerical scheme was then extended to establish a model simulating the resaturation process after the closure of a repository. It was found that, due to the variation of suction and thermal conductivity of the clay barrier material with temperature, the calculated temperature field shows a reduction when the hydro-properties are incorporated in the calculations.
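    Estimating unsaturated hydraulic conductivity from a soil water characteristic curve is commonly done with a van Genuchten retention curve and the Mualem conductivity model, as sketched below. The parameters shown are placeholders, not fitted values for Zhisin clay or the reference bentonite, and this is not the integration scheme of the paper.

```python
import numpy as np

def vg_mualem(suction, alpha, n, k_sat):
    """van Genuchten retention + Mualem conductivity sketch.

    suction : matric suction (kPa, positive)
    alpha   : 1/kPa; n : shape parameter; k_sat : saturated conductivity (m/s)
    Returns effective saturation Se and unsaturated hydraulic conductivity.
    """
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * suction) ** n) ** (-m)
    k_rel = np.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2
    return se, k_sat * k_rel

psi = np.logspace(0, 5, 6)                               # 1 kPa to 100 MPa
se, k = vg_mualem(psi, alpha=5e-4, n=1.3, k_sat=1e-13)   # assumed values
for p, s, kk in zip(psi, se, k):
    print(f"{p:9.1f} kPa  Se={s:0.3f}  K={kk:0.2e} m/s")
```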

  5. Temperature-package power correlations for open-mode geologic disposal concepts.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Ernest.

    2013-02-01

Logistical simulation of spent nuclear fuel (SNF) management in the U.S. combines storage, transportation and disposal elements to evaluate schedule, cost and other resources needed for all major operations leading to final geologic disposal. Geologic repository reference options are associated with limits on waste package thermal power output at emplacement, in order to meet limits on peak temperature for certain key engineered and natural barriers. These package power limits are used in logistical simulation software such as CALVIN, as threshold requirements that must be met by means of decay storage or SNF blending in waste packages, before emplacement in a repository. Geologic repository reference options include enclosed modes developed for crystalline rock, clay or shale, and salt. In addition, a further need has been addressed for open modes in which SNF can be emplaced in a repository, then ventilated for decades or longer to remove heat, prior to permanent repository closure. For each open mode disposal concept there are specified durations for surface decay storage (prior to emplacement), repository ventilation, and repository closure operations. This study simulates those steps for several timing cases, and for SNF with three fuel-burnup characteristics, to develop package power limits at which waste packages can be emplaced without exceeding specified temperature limits many years later after permanent closure. The results are presented in the form of correlations that span a range of package power and peak postclosure temperature, for each open-mode disposal concept, and for each timing case. Given a particular temperature limit value, the corresponding package power limit for each case can be selected for use in CALVIN and similar tools.

  6. A laboratory validation study of the time-lapse oscillatory pumping test concept for leakage detection in geological repositories

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Islam, A.; Lu, J.

    2017-12-01

The time-lapse oscillatory pumping test (OPT) has been introduced recently as a pressure-based monitoring technique for detecting potential leakage in geologic repositories. By routinely conducting OPT at a number of pulsing frequencies, a site operator may identify potential anomalies in the frequency domain, alleviating the ambiguity caused by reservoir noise and improving the signal-to-noise ratio. Building on previous theoretical and field studies, this work performed a series of laboratory experiments to validate the concept of time-lapse OPT using a custom-made, stainless steel tank under relatively high pressures (~120 psi). The experimental configuration simulates a miniature geologic storage repository consisting of three layers (i.e., injection zone, caprock, and above-zone aquifer). Results show that leakage in the injection zone led to deviations in the power spectrum of observed pressure data, the amplitude of which increases with decreasing pulsing frequencies. The experimental results were further analyzed by developing a 3D flow model, with which the model parameters were estimated through frequency domain inversion.

  7. An open data repository and a data processing software toolset of an equivalent Nordic grid model matched to historical electricity market data.

    PubMed

    Vanfretti, Luigi; Olsen, Svein H; Arava, V S Narasimham; Laera, Giuseppe; Bidadfar, Ali; Rabuzin, Tin; Jakobsen, Sigurd H; Lavenius, Jan; Baudette, Maxime; Gómez-López, Francisco J

    2017-04-01

This article presents an open data repository, the methodology used to generate it, and the associated data processing software developed to consolidate an hourly-snapshot historical data set for the year 2015 into an equivalent Nordic power grid model (aka Nordic 44). The consolidation was achieved by matching the model's physical response with respect to the historical power flow records in the bidding regions of the Nordic grid that are available from the Nordic electricity market agent, Nord Pool. The model is made available in the form of CIM v14, Modelica and PSS/E (Siemens PTI) files. The Nordic 44 model in Modelica and PSS/E was first presented in the paper titled "iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations" (Vanfretti et al., 2016) [1] for a single snapshot. In the digital repository being made available with the submission of this paper (SmarTSLab_Nordic44 Repository at Github, 2016) [2], a total of 8760 snapshots (for the year 2015) are provided that can be used to initialize and execute dynamic simulations using tools compatible with CIM v14, the Modelica language and the proprietary PSS/E tool. The Python scripts used to generate the snapshots (processed data) are also available, together with all the data, in the GitHub repository (SmarTSLab_Nordic44 Repository at Github, 2016) [2]. This Nordic 44 equivalent model was also used in the iTesla project (iTesla) [3] to carry out simulations within a dynamic security assessment toolset (iTesla, 2016) [4], and has been further enhanced during the ITEA3 OpenCPS project (iTEA3) [5]. The raw data, processed data, and output models utilized within the iTesla platform (iTesla, 2016) [4] are also available in the repository. The CIM and Modelica snapshots of the "Nordic 44" model for the year 2015 are available in a Zenodo repository.

  8. Thermal - Hydraulic Behavior of Unsaturated Bentonite and Sand-Bentonite Material as Seal for Nuclear Waste Repository: Numerical Simulation of Column Experiments

    NASA Astrophysics Data System (ADS)

    Ballarini, E.; Graupner, B.; Bauer, S.

    2015-12-01

For deep geological repositories of high-level radioactive waste (HLRW), bentonite and sand-bentonite mixtures are investigated as buffer materials to form a sealing layer. This sealing layer surrounds the canisters and experiences an initial drying due to the heat produced by the HLRW and a subsequent re-saturation with fluid from the host rock. These complex thermal, hydraulic and mechanical processes interact and were investigated in laboratory column experiments using MX-80 clay pellets as well as a mixture of 35% sand and 65% bentonite. The aim of this study is both to understand the individual processes taking place in the buffer materials and to identify the key physical parameters that determine the material behavior under heating and hydrating conditions. To this end, detailed and process-oriented numerical modelling was applied to the experiments, simulating heat transport, multiphase flow and mechanical effects from swelling. For both columns, the same set of parameters was assigned to the experimental set-up (i.e. insulation, heater and hydration system), while the parameters of the buffer material were adapted during model calibration. A good fit between model results and data was achieved for temperature, relative humidity, water intake and swelling pressure, thus explaining the material behavior. The key variables identified by the model are the permeability and relative permeability, the water retention curve and the thermal conductivity of the buffer material. The different hydraulic and thermal behavior of the two buffer materials observed in the laboratory was well reproduced by the numerical model.

  9. A web-based repository of surgical simulator projects.

    PubMed

    Leskovský, Peter; Harders, Matthias; Székely, Gábor

    2006-01-01

The use of computer-based surgical simulators for training of prospective surgeons has been a topic of research for more than a decade. As a result, a large number of academic projects have been carried out, and a growing number of commercial products are available on the market. Keeping track of all these endeavors for established groups as well as for newly started projects can be quite arduous. Gathering information on existing methods, already traveled research paths, and problems encountered is a time-consuming task. To alleviate this situation, we have established a modifiable online repository of existing projects. It contains detailed information about a large number of simulator projects gathered from web pages, papers and personal communication. The database is modifiable (with password-protected sections) and also allows for a simple statistical analysis of the collected data. For further information, the surgical repository web page can be found at www.virtualsurgery.vision.ee.ethz.ch.

  10. DoSSiER: Database of scientific simulation and experimental results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenzel, Hans; Yarba, Julia; Genser, Krzystof

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in json or xml exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  11. DoSSiER: Database of scientific simulation and experimental results

    DOE PAGES

    Wenzel, Hans; Yarba, Julia; Genser, Krzystof; ...

    2016-08-01

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in json or xml exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  12. Use of in-vitro experimental results to model in-situ experiments: bio-denitrification under geological disposal conditions.

    PubMed

    Masuda, Kaoru; Murakami, Hiroshi; Kurimoto, Yoshitaka; Kato, Osamu; Kato, Ko; Honda, Akira

    2013-01-01

Some of the low-level radioactive wastes from reprocessing of spent nuclear fuels contain nitrates. Nitrates can be present in the form of soluble salts and can be reduced by various reactions. Among them, reduction by metal compounds and by microorganisms seems to be important in the underground repository. Reduction by microorganisms is more important in the near-field area than inside the repository, because the high pH and extremely high salt concentration inside would prevent microbial activity. In the near field, the pH is more moderate (around 8) and the salt concentration is lower. However, the electron donor may be limited there, and it might be the controlling factor for microbial denitrification activity. In this study, in-vitro experiments on the nitrate reduction reaction were conducted using model organic materials purported to exist in underground conditions relevant to geological disposal. Two kinds of organic materials were selected: a superplasticizer was selected as representative of the geological disposal system, and humic acid was selected as representative of pre-existing organic materials in the bedrock. Nitrates were reduced almost completely to N2 gas in the presence of the superplasticizer. In the case of humic acid, although nitrates were reduced, the rate was much lower and, in this case, dead organisms acted as the electron donor instead of the humic acid. A reaction model was developed based on the in-vitro experiments and verified by running simulations against data obtained from in-situ experiments using actual groundwaters and microorganisms. The simulation showed a good correlation with the experimental data and contributes to the understanding of microbially mediated denitrification in geological disposal systems.
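    The general shape of such a denitrification reaction model can be sketched as sequential first-order kinetics for nitrate reduction through nitrite to nitrogen gas. The rate constants and initial concentration below are assumptions, and the sketch tracks nitrogen mass only; it is not the paper's calibrated model, which ties the rates to the available electron donor (e.g. the superplasticizer).

```python
import numpy as np
from scipy.integrate import solve_ivp

def denitrification(t, y, k1, k2):
    """Simplified sequential first-order denitrification NO3- -> NO2- -> N2.

    y = [NO3-N, NO2-N, N2-N] so the nitrogen mass balance closes.
    Generic kinetic sketch with assumed rate constants.
    """
    no3, no2, n2 = y
    return [-k1 * no3,
            k1 * no3 - k2 * no2,
            k2 * no2]

y0 = [10.0, 0.0, 0.0]                                    # mmol N/L initially (assumed)
sol = solve_ivp(denitrification, (0.0, 30.0), y0, args=(0.4, 0.8),
                t_eval=np.linspace(0.0, 30.0, 7))        # days
print(np.round(sol.y, 2))
```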

  13. Simulation of fluid flow and energy transport processes associated with high-level radioactive waste disposal in unsaturated alluvium

    USGS Publications Warehouse

    Pollock, David W.

    1986-01-01

Many parts of the Great Basin have thick zones of unsaturated alluvium which might be suitable for disposing of high-level radioactive wastes. A mathematical model accounting for the coupled transport of energy, water (vapor and liquid), and dry air was used to analyze one-dimensional, vertical transport above and below an areally extensive repository. Numerical simulations were conducted for a hypothetical repository containing spent nuclear fuel and located 100 m below land surface. Initial steady state downward water fluxes of zero (hydrostatic) and 0.0003 m yr−1 were considered in an attempt to bracket the likely range in natural water flux. Predicted temperatures within the repository peaked after approximately 50 years and declined slowly thereafter in response to the decreasing intensity of the radioactive heat source. The alluvium near the repository experienced a cycle of drying and rewetting in both cases. The extent of the dry zone was strongly controlled by the mobility of liquid water near the repository under natural conditions. In the case of initial hydrostatic conditions, the dry zone extended approximately 10 m above and 15 m below the repository. For the case of a natural flux of 0.0003 m yr−1 the relative permeability of water near the repository was initially more than 30 times the value under hydrostatic conditions, consequently the dry zone extended only about 2 m above and 5 m below the repository. In both cases a significant perturbation in liquid saturation levels persisted for several hundred years. This analysis illustrates the extreme sensitivity of model predictions to initial conditions and parameters, such as relative permeability and moisture characteristic curves, that are often poorly known.

  14. Large-scale Thermo-Hydro-Mechanical Simulations in Complex Geological Environments

    NASA Astrophysics Data System (ADS)

    Therrien, R.; Lemieux, J.

    2011-12-01

The study of a potential deep repository for radioactive waste disposal in the Canadian context requires simulation capabilities for thermo-hydro-mechanical processes. It is expected that the host rock for the deep repository will be subjected to a variety of stresses during its lifetime, such as in situ stresses in the rock, stresses caused by excavation of the repository, and thermo-mechanical stresses. Another stress of concern for future Canadian climates will result from episodes of glaciation. In that case, it can be expected that over 3 km of ice may be present over the land mass, which will create a glacial load that will be transmitted to the underlying geological materials and therefore affect their mechanical and hydraulic responses. Glacial loading will affect pore fluid pressures in the subsurface, which will in turn affect groundwater velocities and the potential migration of radionuclides from the repository. In addition, permafrost formation and thawing resulting from glacial advance and retreat will modify the bulk hydraulic properties of the geological materials and will have a potentially large impact on groundwater flow patterns, especially groundwater recharge. In the context of a deep geological repository for spent nuclear fuel, the performance of the repository in containing the spent nuclear fuel must be evaluated for periods that span several hundred thousand years. The time frame for thermo-hydro-mechanical simulations is therefore extremely long, and efficient numerical techniques must be developed. Other challenges are the representation of geological formations that have potentially complex geometries and physical properties and may contain fractures. The spatial extent of the simulation domain is also very large and can potentially reach the size of a sedimentary basin. Mass transport must also be considered because the fluid salinity in a sedimentary basin can be highly variable and the effect of fluid density on groundwater flow must be accounted for. Adding mass transport with density effects introduces further non-linearities in the governing equations, thus leading to increased simulation times. We will present challenges and current developments related to this topic in the Canadian context. Current efforts aim at improving simulation capabilities for large-scale 3D thermo-hydro-mechanical simulation in complex geologic materials. One topic of interest is to evaluate the appropriateness of simplifying the effect of glacial loading by using a one-dimensional hydro-mechanical representation that assumes purely vertical strain, as opposed to the much more computationally intensive 3D representation.
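    The one-dimensional (purely vertical strain) treatment of glacial loading mentioned at the end can be sketched as scaling the ice-sheet load by a loading efficiency to obtain the undrained pore-pressure increase. The ice thickness and loading efficiency below are illustrative assumptions, not values from the abstract.

```python
# Minimal sketch of the 1-D hydro-mechanical treatment of glacial loading.
RHO_ICE = 917.0      # kg/m^3
G = 9.81             # m/s^2

def glacial_pore_pressure(ice_thickness_m, loading_efficiency):
    """Return the undrained pore-pressure increase (Pa) under an ice sheet,
    assuming purely vertical strain (1-D hydro-mechanical coupling)."""
    sigma_v = RHO_ICE * G * ice_thickness_m   # total vertical stress increment
    return loading_efficiency * sigma_v

# 3 km of ice with an assumed loading efficiency of 0.9 (low-permeability
# sediments tend toward high values; the number here is illustrative):
print(glacial_pore_pressure(3000.0, 0.9) / 1e6, "MPa")
```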

  15. Dynameomics: design of a computational lab workflow and scientific data repository for protein simulations.

    PubMed

    Simms, Andrew M; Toofanny, Rudesh D; Kehl, Catherine; Benson, Noah C; Daggett, Valerie

    2008-06-01

    Dynameomics is a project to investigate and catalog the native-state dynamics and thermal unfolding pathways of representatives of all protein folds using solvated molecular dynamics simulations, as described in the preceding paper. Here we introduce the design of the molecular dynamics data warehouse, a scalable, reliable repository that houses simulation data and vastly simplifies its management and access. In the succeeding paper, we describe the development of a complementary multidimensional database. A single protein unfolding or native-state simulation can take weeks to months to complete, and produces gigabytes of coordinate and analysis data. Mining information from over 3000 completed simulations is complicated and time-consuming. Even the simplest queries involve writing intricate programs that must be built from low-level file system access primitives and include significant logic to correctly locate and parse the data of interest. As a result, programs to answer questions that require data from hundreds of simulations are very difficult to write. Thus, organization of and access to simulation data have been major obstacles to the discovery of new knowledge in the Dynameomics project. This repository is used internally and is the foundation of the Dynameomics portal site http://www.dynameomics.org. By organizing simulation data into a scalable, manageable and accessible form, we can begin to address substantial questions that move us closer to solving biomedical and bioengineering problems.
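
    To illustrate the contrast between file-level parsing and warehouse-style access described above, the sketch below runs a single declarative query against a hypothetical, much-simplified schema; it is not the actual Dynameomics warehouse design.

      import sqlite3

      # Hypothetical, simplified schema -- not the actual Dynameomics warehouse.
      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE simulation (sim_id INTEGER PRIMARY KEY, protein TEXT, temperature_k REAL);
      CREATE TABLE frame (sim_id INTEGER, time_ps REAL, rmsd_a REAL);
      """)
      conn.executemany("INSERT INTO simulation VALUES (?, ?, ?)",
                       [(1, "1ubq", 298.0), (2, "1ubq", 498.0)])
      conn.executemany("INSERT INTO frame VALUES (?, ?, ?)",
                       [(1, 1000.0, 1.2), (2, 1000.0, 6.8), (2, 2000.0, 9.1)])

      # A question spanning many simulations becomes a single declarative query
      # instead of a program built on file-system access primitives.
      rows = conn.execute("""
          SELECT s.sim_id, s.temperature_k, AVG(f.rmsd_a)
          FROM simulation s JOIN frame f ON s.sim_id = f.sim_id
          WHERE s.temperature_k > 400
          GROUP BY s.sim_id
      """).fetchall()
      print(rows)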

  16. Evolution of a Digital Repository: One Institution's Experience

    ERIC Educational Resources Information Center

    Owen, Terry M.

    2011-01-01

    In this article, the development of a digital repository is examined, specifically how the focus on acquiring content for the repository has transitioned from faculty-published research to include the gray literature produced by the research centers on campus, including unpublished technical reports and undergraduate research from honors programs.…

  17. Large-Scale In-situ Experiments to Determine Geochemical Alterations and Microbial Activities at the Geological Repository

    NASA Astrophysics Data System (ADS)

    Choung, S.; Francis, A. J.; Um, W.; Choi, S.; Kim, S.; Park, J.; Kim, S.

    2013-12-01

    Countries that generate nuclear power face problems in disposing of their accumulated radioactive wastes. Geological disposal has been chosen in many countries, including Korea. A safety issue has been raised for the period after closure of a geological repository, because microbial activities can lead to overpressure in the underground facilities through gas production. In particular, biodegradable organic materials derived from low- and intermediate-level radioactive wastes play an important role in microbial activities in the geological repository. This study performed large-scale in-situ experiments using organic wastes and groundwater, and investigated geochemical alterations and microbial activities at an early stage (~63 days) representative of the period immediately after closure of the geological repository. The geochemical alteration significantly controlled the microorganism types and populations. A database of the biogeochemical alterations facilitates prediction of radionuclide mobility and the establishment of remedial strategies against unpredictable accidents and hazards in the early stage right after closure of the geological repository.

  18. Data model, dictionaries, and desiderata for biomolecular simulation data indexing and sharing

    PubMed Central

    2014-01-01

    Background: Few environments have been developed or deployed to widely share biomolecular simulation data or to enable collaborative networks to facilitate data exploration and reuse. As the amount and complexity of data generated by these simulations is dramatically increasing and the methods are being more widely applied, the need for new tools to manage and share this data has become obvious. In this paper we present the results of a process aimed at assessing the needs of the community for data representation standards to guide the implementation of future repositories for biomolecular simulations. Results: We introduce a list of common data elements, inspired by previous work, and updated according to feedback from the community collected through a survey and personal interviews. These data elements integrate the concepts for multiple types of computational methods, including quantum chemistry and molecular dynamics. The identified core data elements were organized into a logical model to guide the design of new databases and application programming interfaces. Finally, a set of dictionaries was implemented to be used via SQL queries or locally via a Java API built upon the Apache Lucene text-search engine. Conclusions: The model and its associated dictionaries provide a simple yet rich representation of the concepts related to biomolecular simulations, which should guide future developments of repositories and more complex terminologies and ontologies. The model remains extensible through the decomposition of virtual experiments into tasks and parameter sets, and via the use of extended attributes. The benefits of a common logical model for biomolecular simulations were illustrated through various use cases, including data storage, indexing, and presentation. All the models and dictionaries introduced in this paper are available for download at http://ibiomes.chpc.utah.edu/mediawiki/index.php/Downloads. PMID:24484917

  19. Large scale rigidity-based flexibility analysis of biomolecules

    PubMed Central

    Streinu, Ileana

    2016-01-01

    KINematics And RIgidity (KINARI) is an on-going project for in silico flexibility analysis of proteins. The new version of the software, Kinari-2, extends the functionality of our free web server KinariWeb, incorporates advanced web technologies, emphasizes the reproducibility of its experiments, and makes substantially improved tools available to the user. It is designed specifically for large-scale experiments, in particular for (a) very large molecules, including bioassemblies with a high degree of symmetry such as viruses and crystals, (b) large collections of related biomolecules, such as those obtained through simulated dilutions, mutations, or conformational changes from various types of dynamics simulations, and (c) the large, idiosyncratic, publicly available repository of biomolecules, the Protein Data Bank, on which it is intended to work as seamlessly as possible. We describe the system design, along with the main data processing, computational, mathematical, and validation challenges underlying this phase of the KINARI project. PMID:26958583

  20. Evaluation of Groundwater Pathways and Travel Times From the Nevada Test Site to the Potential Yucca Mountain Repository

    NASA Astrophysics Data System (ADS)

    Pohlmann, K. F.; Zhu, J.; Ye, M.; Carroll, R. W.; Chapman, J. B.; Russell, C. E.; Shafer, D. S.

    2006-12-01

    Yucca Mountain (YM), Nevada, has been recommended as a deep geological repository for the disposal of spent fuel and high-level radioactive waste. If YM is licensed as a repository by the Nuclear Regulatory Commission, it will be important to identify the potential for radionuclides to migrate from underground nuclear testing areas located on the Nevada Test Site (NTS) to the hydraulically downgradient repository area, to ensure that monitoring does not incorrectly attribute repository failure to radionuclides originating from other sources. In this study, we use the Death Valley Regional Flow System (DVRFS) model developed by the U.S. Geological Survey to investigate potential groundwater migration pathways and associated travel times from the NTS to the proposed YM repository area. Using results from the calibrated DVRFS model and the particle-tracking post-processing package MODPATH, we modeled three-dimensional groundwater advective pathways in the NTS and YM region. Our study focuses on evaluating the potential for groundwater pathways between the NTS and the YM withdrawal area and whether travel times for advective flow along these pathways coincide with the prospective monitoring time frame at the proposed repository. We include uncertainty in effective porosity as this is a critical variable in the determination of the time for radionuclides to travel from the NTS region to the YM withdrawal area. Uncertainty in porosity is quantified through evaluation of existing site data and expert judgment and is incorporated in the model through Monte Carlo simulation. Since porosity information is limited for this region, the uncertainty is quite large, and this is reflected in the results as a large range in simulated groundwater travel times.
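
    A minimal sketch of the Monte Carlo step described above: effective porosity is sampled from an assumed broad distribution and propagated to an advective travel time along a fixed path. The path length, Darcy flux, and porosity distribution are illustrative assumptions, not DVRFS or MODPATH output.

      import numpy as np

      rng = np.random.default_rng(0)

      # Illustrative values only (not DVRFS model output): a 40-km path with a
      # fixed Darcy flux; effective porosity sampled from a broad lognormal.
      PATH_LENGTH_M = 40_000.0
      DARCY_FLUX_M_PER_YR = 0.5
      N_REALIZATIONS = 10_000

      porosity = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=N_REALIZATIONS)
      porosity = np.clip(porosity, 0.005, 0.35)

      # Advective travel time: t = L * n_e / q  (seepage velocity v = q / n_e)
      travel_time_yr = PATH_LENGTH_M * porosity / DARCY_FLUX_M_PER_YR

      print("5th/50th/95th percentile travel time (yr):",
            np.percentile(travel_time_yr, [5, 50, 95]).round(0))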

  1. Coupled Heat and Moisture Transport Simulation on the Re-saturation of Engineered Clay Barrier

    NASA Astrophysics Data System (ADS)

    Huang, W. H.; Chuang, Y. F.

    2014-12-01

    An engineered clay barrier plays a major role in the isolation of radioactive wastes in an underground repository. This paper investigates the resaturation processes of the clay barrier, with emphasis on the coupling effects of heat and moisture during the intrusion of groundwater into the repository. A reference bentonite and a locally available clay were adopted in the laboratory program. Soil suction of clay specimens was measured by psychrometers embedded in the clay specimens and by the vapor equilibrium technique conducted at varying temperatures, so as to determine the soil-water characteristic curves of the two clays at different temperatures. Water uptake tests were also conducted on clay specimens compacted at various densities to simulate the intrusion of groundwater into the clay barrier. Using the soil-water characteristic curve, an integration scheme was introduced to estimate the hydraulic conductivity of unsaturated clay. It was found that soil suction decreases as temperature increases, resulting in a reduction in water retention capability. The finite element method was then employed to carry out the numerical simulation of the saturation process in the near field of a repository. Results of the numerical simulation were validated using the degree-of-saturation profiles obtained from the water uptake tests on the clays. The numerical scheme was then extended to establish a model simulating the resaturation process after the closure of a repository. Finally, the model was used to evaluate the effect of clay barrier thickness on the time required for groundwater to penetrate the clay barrier and approach saturation. Because the suction and thermal conductivity of the clay barrier material vary with temperature, the calculated temperature field shows a reduction when these hydro-properties are incorporated in the calculations.
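
    The integration scheme mentioned above can be illustrated with the Mualem (1976) approach, in which relative conductivity is obtained by integrating the soil-water characteristic curve; the sketch below uses an assumed van Genuchten fit as a stand-in for the measured curves and is not the authors' scheme.

      import numpy as np

      # Assumed van Genuchten fit to a measured soil-water characteristic curve
      # (illustrative parameters, not the paper's bentonite data).
      ALPHA = 0.01   # 1/kPa
      N = 1.6
      M = 1.0 - 1.0 / N

      def suction_kpa(se):
          """Matric suction h(Se) from the van Genuchten retention curve."""
          return (se**(-1.0 / M) - 1.0) ** (1.0 / N) / ALPHA

      def unsat_conductivity(se_target, k_sat=1e-13, n_steps=2000):
          """Mualem (1976) integration of the retention curve:
          Kr = Se^0.5 * [ int_0^Se dS/h / int_0^1 dS/h ]^2."""
          se = np.linspace(1e-6, 1.0 - 1e-6, n_steps)
          ds = se[1] - se[0]
          integrand = 1.0 / suction_kpa(se)
          full = integrand.sum() * ds
          part = integrand[se <= se_target].sum() * ds
          kr = np.sqrt(se_target) * (part / full) ** 2
          return kr * k_sat

      for se in (0.4, 0.7, 0.95):
          print(f"Se = {se:.2f}  ->  K_unsat ~ {unsat_conductivity(se):.2e} m/s")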

  2. A Comparison of Subject and Institutional Repositories in Self-Archiving Practices

    ERIC Educational Resources Information Center

    Xia, Jingfeng

    2008-01-01

    The disciplinary culture theory presumes that if a scholar has been familiar with self-archiving through an existing subject-based repository, this scholar will be more enthusiastic about contributing his/her research to an institutional repository than one who has not had the experience. To test the theory, this article examines self-archiving…

  3. Deep Boreholes Seals Subjected to High P,T conditions - Proposed Experimental Studies

    NASA Astrophysics Data System (ADS)

    Caporuscio, F.

    2015-12-01

    Deep borehole experimental work will constrain the P,T conditions that "seal" materials will experience in deep borehole crystalline rock repositories. The rocks of interest to this study include mafic (amphibolites) and silicic (granitic gneiss) end members. The experiments will systematically add components to capture discrete changes in both water and EBS component chemistries. Experiments in the system wall rock-clay-concrete-groundwater will evaluate interactions among components, including mineral phase stability, metal corrosion rates, and thermal limits. Based on engineered barrier studies, experimental investigations will move forward with three focus areas. First, evaluation of the interaction between "seal" materials and crystalline repository wall rock under fluid-saturated conditions in long-term (i.e., six-month) experiments that reproduce the thermal pulse event of a repository. Second, experiments to determine the stability of zeolite minerals (analcime-wairakite solid solution) under repository conditions. Both sets of experiments are critically important for understanding the mineral paragenesis (zeolite and/or clay transformations) associated with "seals" in contact with wall rock at elevated temperatures. Third, mineral growth at the metal interface is a principal control on the survivability (i.e., corrosion) of waste canisters in a repository. The objective of this planned experimental work is to evaluate physico-chemical processes for 'seal' components and materials relevant to deep borehole disposal. These evaluations will encompass multi-laboratory efforts for the development of seals concepts and the application of Thermal-Mechanical-Chemical (TMC) modeling work to assess barrier material interactions with subsurface fluids and other barrier materials, their stability at high temperatures, and the implications of these processes for the evaluation of thermal limits.

  4. HepSim: A repository with predictions for high-energy physics experiments

    DOE PAGES

    Chekanov, S. V.

    2015-02-03

    A file repository for calculations of cross sections and kinematic distributions using Monte Carlo generators for high-energy collisions is discussed. The repository is used to facilitate effective preservation and archiving of data from theoretical calculations and for comparisons with experimental data. The HepSim data library is publicly accessible and includes a number of Monte Carlo event samples with Standard Model predictions for current and future experiments. The HepSim project includes a software package to automate the process of downloading and viewing online Monte Carlo event samples. Data streaming over a network for end-user analysis is discussed.

  5. jPOSTrepo: an international standard data repository for proteomes

    PubMed Central

    Okuda, Shujiro; Watanabe, Yu; Moriya, Yuki; Kawano, Shin; Yamamoto, Tadashi; Matsumoto, Masaki; Takami, Tomoyo; Kobayashi, Daiki; Araki, Norie; Yoshizawa, Akiyasu C.; Tabata, Tsuyoshi; Sugiyama, Naoyuki; Goto, Susumu; Ishihama, Yasushi

    2017-01-01

    Major advancements have recently been made in mass spectrometry-based proteomics, yielding an increasing number of datasets from various proteomics projects worldwide. In order to facilitate the sharing and reuse of promising datasets, it is important to construct appropriate, high-quality public data repositories. jPOSTrepo (https://repository.jpostdb.org/) has successfully implemented several unique features, including high-speed file uploading, flexible file management and easy-to-use interfaces. This repository has been launched as a public repository containing various proteomic datasets and is available for researchers worldwide. In addition, our repository has joined the ProteomeXchange consortium, which includes the most popular public repositories, such as PRIDE in Europe for MS/MS datasets and PASSEL for SRM datasets in the USA. Later, MassIVE was introduced in the USA and accepted into the ProteomeXchange, as was our repository in July 2016, providing important datasets from Asia/Oceania. Accordingly, this repository contributes to a global alliance to share and store datasets from a wide variety of proteomics experiments, and it is expected to become a major repository, particularly for data collected in the Asia/Oceania region. PMID:27899654

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reedlunn, Benjamin

    Room D was an in-situ, isothermal, underground experiment conducted at the Waste Isolation Pilot Plant between 1984 and 1991. The room was carefully instrumented to measure the horizontal and vertical closure immediately upon excavation and for several years thereafter. Early finite element simulations of salt creep around Room D under-predicted the vertical closure by 4.5×, causing investigators to explore a series of changes to the way Room D was modeled. Discrepancies between simulations and measurements were resolved through a series of adjustments to model parameters, which were openly acknowledged in published reports. Interest in Room D has been rekindled recently by the U.S./German Joint Project III and Project WEIMOS, which seek to improve the predictions of rock salt constitutive models. Joint Project participants calibrate their models solely against laboratory tests, and benchmark the models against underground experiments, such as Room D. This report describes updating legacy Room D simulations to today's computational standards by rectifying several numerical issues. Subsequently, the constitutive model used in previous modeling is recalibrated in two different ways against a suite of new laboratory creep experiments on salt extracted from the repository horizon of the Waste Isolation Pilot Plant. Simulations with the new, laboratory-based calibrations under-predict Room D vertical closure by 3.1×. A list of potential improvements is discussed.
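
    As a much-simplified illustration of recalibrating a creep model against laboratory creep tests, the sketch below fits a steady-state power-law creep rate to synthetic data by linearized least squares. The constitutive model actually used for Room D is considerably more elaborate, and all numbers here are invented.

      import numpy as np

      # Illustration of recalibrating a creep law against laboratory creep tests.
      # The constitutive model actually used for Room D is far more elaborate;
      # here a steady-state power law  eps_dot = A * sigma^n * exp(-Q/(R*T))
      # is fitted to synthetic "lab" data (all values invented for the sketch).
      R = 8.314  # J/(mol K)

      rng = np.random.default_rng(1)
      sigma_mpa = np.array([5.0, 10.0, 15.0, 20.0, 10.0, 15.0])
      temp_k    = np.array([300.0, 300.0, 300.0, 300.0, 333.0, 333.0])

      true_A, true_n, true_Q = 1e-6, 5.0, 54_000.0
      eps_dot = true_A * sigma_mpa**true_n * np.exp(-true_Q / (R * temp_k))
      eps_dot *= rng.lognormal(0.0, 0.05, size=eps_dot.size)   # measurement scatter

      # Linearize: ln(eps_dot) = ln A + n ln sigma - Q/(R T)  ->  least squares
      X = np.column_stack([np.ones_like(sigma_mpa), np.log(sigma_mpa), -1.0 / (R * temp_k)])
      coef, *_ = np.linalg.lstsq(X, np.log(eps_dot), rcond=None)
      print(f"fitted A = {np.exp(coef[0]):.2e}, n = {coef[1]:.2f}, Q = {coef[2]:.0f} J/mol")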

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reedlunn, Benjamin

    Room D was an in-situ, isothermal, underground experiment conducted at the Waste Isolation Pilot Plant between 1984 and 1991. The room was carefully instrumented to measure the horizontal and vertical closure immediately upon excavation and for several years thereafter. Early finite element simulations of salt creep around Room D under-predicted the vertical closure by 4.5×, causing investigators to explore a series of changes to the way Room D was modeled. Discrepancies between simulations and measurements were resolved through a series of adjustments to model parameters, which were openly acknowledged in published reports. Interest in Room D has been rekindled recently by the U.S./German Joint Project III and Project WEIMOS, which seek to improve the predictions of rock salt constitutive models. Joint Project participants calibrate their models solely against laboratory tests, and benchmark the models against underground experiments, such as Room D. This report describes updating legacy Room D simulations to today's computational standards by rectifying several numerical issues. Subsequently, the constitutive model used in previous modeling is recalibrated in two different ways against a suite of new laboratory creep experiments on salt extracted from the repository horizon of the Waste Isolation Pilot Plant. Simulations with the new, laboratory-based calibrations under-predict Room D vertical closure by 3.1×. A list of potential improvements is discussed.

  8. Simulation of gas phase transport of carbon-14 at Yucca Mountain, Nevada, USA

    USGS Publications Warehouse

    Lu, N.; Ross, B.

    1994-01-01

    We have simulated gas phase transport of carbon-14 at Yucca Mountain, Nevada. Three models were established to calculate the travel time of carbon-14 from the potential repository to the mountain surface: a geochemical model for retardation factors, a coupled gas-flow and heat transfer model for temperature and gas flow fields, and a particle tracker for travel time calculation. The simulations used three parallel, east-west cross-sections that were taken from the Sandia National Laboratories Interactive Graphics Information System (IGIS). Assuming that the repository is filled with 30-year-old waste at an initial areal power density of 57 kW/acre, we found that repository temperatures remain above 60 °C for more than 10,000 years. For a tuff permeability of 10^-7 cm^2, carbon-14 travel times to the surface are mostly less than 1,000 years, for particles starting at any time within the first 10,000 years. If the tuff permeability is 10^-8 cm^2, however, carbon-14 travel times to the surface range from 3,000 to 12,000 years, for particles starting within the first 10,000 years.
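
    A minimal sketch of how the retardation factors from the geochemical model enter the travel-time calculation: the unretarded gas travel time from the particle tracker is simply scaled by the retardation factor. The path length, gas velocity, and retardation values below are assumptions for illustration only.

      # Illustrative only -- not values from the cited study: how a retardation
      # factor from the geochemical model scales the gas-phase travel time
      # obtained from the particle tracker.
      PATH_LENGTH_M = 300.0        # assumed depth of the repository below the surface
      GAS_VELOCITY_M_PER_YR = 1.0  # assumed upward gas velocity in fractured tuff

      def travel_time_yr(retardation_factor):
          # Retardation lumps partitioning of CO2/14C into pore water and calcite.
          return retardation_factor * PATH_LENGTH_M / GAS_VELOCITY_M_PER_YR

      for R in (1.0, 3.0, 30.0):
          print(f"R = {R:5.1f}  ->  carbon-14 travel time ~ {travel_time_yr(R):,.0f} yr")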

  9. Numerical Modeling of Thermal-Hydrology in the Near Field of a Generic High-Level Waste Repository

    NASA Astrophysics Data System (ADS)

    Matteo, E. N.; Hadgu, T.; Park, H.

    2016-12-01

    Disposal in a deep geologic repository is one of the preferred options for long-term isolation of high-level nuclear waste. Coupled thermal-hydrologic processes induced by decay heat from the radioactive waste may impact fluid flow and the associated migration of radionuclides. This study looked at the effects of those processes in simulations of thermal-hydrology for the emplacement of U.S. Department of Energy-managed high-level waste and spent nuclear fuel. Most of the high-level waste sources have lower thermal output, which would reduce the impact of thermal propagation. In order to quantify the thermal limits, this study concentrated on the higher thermal output sources and on spent nuclear fuel. The study assumed a generic nuclear waste repository at 500 m depth. For the modeling, a representative domain comprising a portion of the repository layout was selected in order to conduct a detailed thermal analysis. A highly refined unstructured mesh was utilized, with refinements near heat sources and at intersections of different materials. Simulations looked at different values for properties of components of the engineered barrier system (i.e., buffer, disturbed rock zone and the host rock). The simulations also looked at the effects of different durations of surface aging of the waste to reduce thermal perturbations. The PFLOTRAN code (Hammond et al., 2014) was used for the simulations. Modeling results for the different options are reported and include temperature and fluid flow profiles in the near field at different simulation times. References: G. E. Hammond, P.C. Lichtner and R.T. Mills, "Evaluating the Performance of Parallel Subsurface Simulators: An Illustrative Example with PFLOTRAN", Water Resources Research, 50, doi:10.1002/2012WR013483 (2014). Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND2016-7510 A

  10. Bentonite Clay Evolution at Elevated Pressures and Temperatures: An experimental study for generic nuclear repositories

    NASA Astrophysics Data System (ADS)

    Caporuscio, F. A.; Cheshire, M.; McCarney, M.

    2012-12-01

    The Used Fuel Disposition Campaign is presently engaged in looking at various generic repository options for disposal of used fuel. Of interest is the disposal of high-heat-load canisters, which may allow for a reduced repository footprint. The focus of this experimental work is to characterize Engineered Barrier System (EBS) conditions in repositories. Clay minerals - as backfill or buffer materials - are critical to the performance of the EBS. Experiments were performed in Dickson cells at 150 bar and sequentially stepped from 125 °C to 300 °C over a period of ~1 month. An unprocessed bentonite from Colony, Wyoming was used as the buffer material in each experiment. A K-Ca-Na-Cl-rich brine (replicating deep Stripa groundwater) was used at a 9:1 water:rock ratio. The baseline experiment contained brine + clay, while three other experiments contained metals that could be used as waste form canisters (brine+clay+304SS, brine+clay+316SS, brine+clay+Cu). All experiments were buffered at the Mt-Fe oxygen fugacity univariant line. As experiment temperature increased and time progressed, pH, K and Ca ion concentrations dropped, while Si, Na, and SO4 concentrations increased. Silicon was liberated into the fluid phase (>1000 ppm) and precipitated during the quenching of the experiment. The precipitated silica transformed to cristobalite as cooling progressed. Potassium was mobilized and exchanged with interlayer Na, transitioning the clay from Na-montmorillonite to K-smectite. Though illitization was not observed in these experiments, its formation may be kinetically limited, and longer-term experiments are underway to evaluate the equilibrium point in this reaction. Clinoptilolite present in the starting bentonite mixture is unstable above 150 °C. Hence, the zeolite broke down at high temperatures but recrystallized as the quench event occurred. This was borne out in SEM images that showed clinoptilolite as a very late stage growth mineral. Both experimental runs containing steel exhibit the generation of a chlorite / Fe-saponite layer at the clay-metal boundary. The formation of minor amounts of pentlandite [(Fe,Ni)9S8] also occurs on both steel plates. Chalcocite (Cu2S) formed as a corrosion product on the Cu plates. The two sulfide phases were produced by the generation of H2S gas during the experimental runs. The H2S is formed by the breakdown of pyrite framboids at high temperature in the bentonite. Such experiments on representative EBS materials at elevated P,T repository conditions are providing useful information for generic repository studies. Lack of illite formation is common in clay experiments and may be related to kinetics or K concentration. Precipitated SiO2 may potentially seal heating cracks in the clay backfill. The chlorite layer generated on steel may act as a passivation material and prevent corrosion of the steel canister wall. Finally, even if zeolites break down during the high-temperature thermal pulse of a repository, zeolites may form again as the repository inventory cools off and perform as radionuclide-sorbing phases.

  11. Modelling the Mont Terri HE-D experiment for the Thermal–Hydraulic–Mechanical response of a bedded argillaceous formation to heating

    DOE PAGES

    Garitte, B.; Nguyen, T. S.; Barnichon, J. D.; ...

    2017-05-09

    Coupled thermal–hydrological–mechanical (THM) processes in the near field of deep geological repositories can influence several safety features of the engineered and geological barriers. Among those features are: the possibility of damage in the host rock, the time for re-saturation of the bentonite, and the perturbations in the hydraulic regime in both the rock and engineered seals. Within the international cooperative code-validation project DECOVALEX-2015, eight research teams developed models to simulate an in situ heater experiment, called HE-D, in Opalinus Clay at the Mont Terri Underground Research Laboratory in Switzerland. The models were developed from the theory of poroelasticity in order to simulate the coupled THM processes that prevailed during the experiment and thereby to characterize the in situ THM properties of Opalinus Clay. The modelling results for the evolution of temperature, pore water pressure, and deformation at different points are consistent among the research teams and compare favourably with the experimental data in terms of trends and absolute values. The models were able to reproduce the main physical processes of the experiment. In particular, most teams simulated temperature and thermally induced pore water pressure well, including spatial variations caused by inherent anisotropy due to bedding.

  12. Modelling the Mont Terri HE-D experiment for the Thermal–Hydraulic–Mechanical response of a bedded argillaceous formation to heating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garitte, B.; Nguyen, T. S.; Barnichon, J. D.

    Coupled thermal–hydrological–mechanical (THM) processes in the near field of deep geological repositories can influence several safety features of the engineered and geological barriers. Among those features are: the possibility of damage in the host rock, the time for re-saturation of the bentonite, and the perturbations in the hydraulic regime in both the rock and engineered seals. Within the international cooperative code-validation project DECOVALEX-2015, eight research teams developed models to simulate an in situ heater experiment, called HE-D, in Opalinus Clay at the Mont Terri Underground Research Laboratory in Switzerland. The models were developed from the theory of poroelasticity in order to simulate the coupled THM processes that prevailed during the experiment and thereby to characterize the in situ THM properties of Opalinus Clay. The modelling results for the evolution of temperature, pore water pressure, and deformation at different points are consistent among the research teams and compare favourably with the experimental data in terms of trends and absolute values. The models were able to reproduce the main physical processes of the experiment. In particular, most teams simulated temperature and thermally induced pore water pressure well, including spatial variations caused by inherent anisotropy due to bedding.

  13. Simulation of ventilation efficiency, and pre-closure temperatures in emplacement drifts at Yucca Mountain, Nevada, using Monte Carlo and composite thermal-pulse methods

    USGS Publications Warehouse

    Case, J.B.; Buesch, D.C.

    2004-01-01

    Predictions of waste canister and repository driftwall temperatures as functions of space and time are important to evaluate pre-closure performance of the proposed repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain, Nevada. Variations in the lithostratigraphic features in densely welded and crystallized rocks of the 12.8-million-year-old Topopah Spring Tuff, especially the porosity resulting from lithophysal cavities, affect thermal properties. A simulated emplacement drift is based on projecting lithophysal cavity porosity values 50 to 800 m from the Enhanced Characterization of the Repository Block cross drift. Lithophysal cavity porosity varies from 0.00 to 0.05 cm^3/cm^3 in the middle nonlithophysal zone and from 0.03 to 0.28 cm^3/cm^3 in the lower lithophysal zone. A ventilation model and computer program titled "Monte Carlo Simulation of Ventilation" (MCSIMVENT), which is based on a composite thermal-pulse calculation, simulates statistical variability and uncertainty of rock-mass thermal properties and ventilation performance along a simulated emplacement drift for a pre-closure period of 50 years. Although ventilation efficiency is relatively insensitive to thermal properties, variations in lithophysal porosity along the drift can result in peak driftwall temperatures ranging from 40 to 85 °C for the pre-closure period. Copyright © 2004 by ASME.
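
    Ventilation efficiency can be illustrated with a simple energy balance on the drift air stream, i.e. the fraction of the canister heat load carried away by the air; the sketch below uses assumed air flow, temperatures, and line load, and is not the MCSIMVENT composite thermal-pulse calculation.

      # Simple energy balance for ventilation efficiency (illustrative values,
      # not output of the MCSIMVENT program described above):
      #   efficiency = m_dot * cp_air * (T_out - T_in) / Q_heat_load
      CP_AIR = 1005.0          # J/(kg K)

      def ventilation_efficiency(m_dot_kg_s, t_in_c, t_out_c, heat_load_w):
          removed = m_dot_kg_s * CP_AIR * (t_out_c - t_in_c)
          return removed / heat_load_w

      # Assumed: ~18 kg/s of air through a 600 m drift loaded at 1.45 kW/m
      eff = ventilation_efficiency(m_dot_kg_s=18.0, t_in_c=25.0, t_out_c=55.0,
                                   heat_load_w=1450.0 * 600.0)
      print(f"Fraction of heat removed by ventilation: {eff:.2f}")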

  14. Analysis of model output and science data in the Virtual Model Repository (VMR).

    NASA Astrophysics Data System (ADS)

    De Zeeuw, D.; Ridley, A. J.

    2014-12-01

    Big scientific data includes not only large repositories of data from scientific platforms such as satellites and ground observations, but also the vast output of numerical models. The Virtual Model Repository (VMR) provides scientific analysis and visualization tools for many numerical models of the Earth-Sun system. Individual runs can be analyzed in the VMR and compared to relevant data through their metadata, and larger collections of runs can also now be studied and statistics generated on the accuracy and tendencies of model output. The vast model repository at the CCMC, with over 1000 simulations of the Earth's magnetosphere, was used to look at overall trends in accuracy when compared to satellites such as GOES, Geotail, and Cluster. The methodology for this analysis, as well as case studies, will be presented.
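
    A minimal sketch of the kind of accuracy statistics mentioned above, computing RMSE and correlation between a modeled and an observed time series; the series here are synthetic stand-ins, not CCMC or satellite data.

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic stand-ins for an observed and a modeled time series
      # (e.g., magnetic field at a GOES location); not real VMR/CCMC data.
      t = np.linspace(0.0, 48.0, 480)                      # hours
      observed = 50.0 * np.sin(2 * np.pi * t / 24.0) + rng.normal(0, 5, t.size)
      modeled = 45.0 * np.sin(2 * np.pi * (t - 1.0) / 24.0)

      rmse = np.sqrt(np.mean((modeled - observed) ** 2))
      corr = np.corrcoef(modeled, observed)[0, 1]
      print(f"RMSE = {rmse:.1f} nT, correlation = {corr:.2f}")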

  15. Data Citation Concept for CMIP6

    NASA Astrophysics Data System (ADS)

    Stockhause, M.; Toussaint, F.; Lautenschlager, M.; Lawrence, B.

    2015-12-01

    There is a broad consensus among data centers and scientific publishers on Force 11's 'Joint Declaration of Data Citation Principles'. Putting these principles into operation is not always straightforward. The focus for CMIP6 data citations lies on the citation of data created by others and used in an analysis underlying the article. For this source data, usually no article by the data creators is available ('stand-alone data publication'). The planned data citation granularities are model data (data collections containing all datasets provided for the project by a single model) and experiment data (data collections containing all datasets for a scientific experiment run by a single model). In the case of large international projects or activities like CMIP, the data is commonly stored and disseminated by multiple repositories in a federated data infrastructure such as the Earth System Grid Federation (ESGF). The individual repositories are subject to different institutional and national policies. A Data Management Plan (DMP) will define a certain standard for the repositories, including data handling procedures. Another aspect of CMIP data, relevant for data citations, is its dynamic nature. For such large data collections, datasets are added, revised and retracted for years before the data collection becomes stable for a data citation entity including all model or simulation data. Thus, a critical issue for ESGF is data consistency, requiring thorough dataset versioning to enable the identification of the data collection in the cited version. Currently, the ESGF is designed for accessing the latest dataset versions. Data citation introduces the necessity to support older and retracted dataset versions by storing metadata even beyond data availability (data unpublished in ESGF). Apart from ESGF, other infrastructure components exist for CMIP which provide information that has to be connected to the CMIP6 data, e.g. ES-DOC, providing information on models and simulations, and the IPCC Data Distribution Centre (DDC), storing a subset of data together with available metadata (ES-DOC) for the long-term reuse of the interdisciplinary community. Other connections exist to standard project vocabularies, to personal identifiers (e.g. ORCID), or to data products (including provenance information).

  16. Simultaneous flow of gas and water in a damage-susceptible argillaceous rock

    NASA Astrophysics Data System (ADS)

    Nguyen, T. S.

    2011-12-01

    A research project has been initiated by the Canadian Nuclear Safety Commission (CNSC) to study the influence of gas generation and migration on the long-term safety of deep geological repositories for radioactive wastes. Such facilities rely on multiple barriers to isolate and contain the wastes. Depending on the level of radioactivity of the wastes, those barriers include some or all of the following: corrosion- and structurally resistant containers, low-permeability seals around the emplacement rooms, galleries and shaft, and finally the host rock formations. Large quantities of gas may be generated from the degradation of the waste forms or the corrosion of the containers. The generated gas pressures, if sufficiently large, can induce cracks and microcracks in the engineered and natural barriers and affect their containment functions. The author has developed a mathematical model to simulate the above effects. The model must be calibrated and validated with laboratory and field experiments in order to provide confidence in its future use for assessing the effects of gas on the long-term safety of nuclear waste repositories. The present communication describes the model and its use in the simulation of laboratory and large-scale in-situ gas injection experiments in an argillaceous rock, known as Opalinus Clay, from Mont Terri, Switzerland. Both the laboratory and in-situ experiments show that the gas flow rate substantially increases when the injection pressure is higher than the confining stress. The above observation seems to indicate that at high gas injection pressures, damage could be induced in the rock formation, resulting in a significant increase in its permeability. In order to simulate the experiments, we developed a poro-elastoplastic model, with the consideration of two compressible pore fluids (water and gas). The bulk movement of the pore fluids is assumed to obey the generalized Darcy's law, and their respective degrees of saturation are represented by van Genuchten functions. The solid skeleton is assumed to be elastoplastic, with degradation of the strength and elastic modulus accompanied by an increase in permeability as damage accumulates. The model can predict the three distinct flow regimes found in the experiments: a low flow regime where gas movement is restricted to the injection zone, a moderate flow regime when damage is limited, and a high flow regime when damage induces a substantial increase in the permeability.
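
    The observed jump in gas flow once injection pressure exceeds the confining stress can be caricatured with a simple threshold relation in which permeability grows exponentially with the pressure excess. The sketch below is only that caricature, with assumed parameter values; it is not the author's poro-elastoplastic formulation.

      import numpy as np

      # Minimal stand-in for the damage/permeability coupling described above
      # (not the author's poro-elastoplastic model): permeability rises
      # exponentially with gas pressure in excess of the confining stress.
      K0 = 1e-20          # intrinsic permeability of intact claystone, m^2 (assumed)
      SIGMA_CONF = 4.0    # confining stress, MPa (assumed)
      BETA = 2.0          # damage sensitivity, 1/MPa (assumed)

      def damaged_permeability(p_gas_mpa):
          excess = np.maximum(p_gas_mpa - SIGMA_CONF, 0.0)
          return K0 * np.exp(BETA * excess)

      for p in (2.0, 4.0, 5.0, 7.0):
          print(f"p_gas = {p:.1f} MPa  ->  k ~ {damaged_permeability(p):.1e} m^2")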

  17. New Rapid Evaluation for Long-Term Behavior in Deep Geological Repository by Geotechnical Centrifuge—Part 2: Numerical Simulation of Model Tests in Isothermal Condition

    NASA Astrophysics Data System (ADS)

    Sawada, Masataka; Nishimoto, Soshi; Okada, Tetsuji

    2017-01-01

    In high-level radioactive waste disposal repositories, there are long-term complex thermal, hydraulic, and mechanical (T-H-M) phenomena that involve the generation of heat from the waste, the infiltration of ground water, and swelling of the bentonite buffer. The ability to model such coupled phenomena is of particular importance to the repository design and assessments of its safety. We have developed a T-H-M-coupled analysis program that evaluates the long-term behavior around the repository (called "near-field"). We have also conducted centrifugal model tests that model the long-term T-H-M-coupled behavior in the near-field. In this study, we conduct H-M-coupled numerical simulations of the centrifugal near-field model tests. We compare numerical results with each other and with results obtained from the centrifugal model tests. From the comparison, we deduce that: (1) in the numerical simulation, water infiltration in the rock mass was in agreement with the experimental observation. (2) The constant-stress boundary condition in the centrifugal model tests may cause a larger expansion of the rock mass than in the in situ condition, but the mechanical boundary condition did not affect the buffer behavior in the deposition hole. (3) The numerical simulation broadly reproduced the measured bentonite pressure and the overpack displacement, but did not reproduce the decreasing trend of the bentonite pressure after 100 equivalent years. This indicates the effect of the time-dependent characteristics of the surrounding rock mass. Further investigations are needed to determine the effect of initial heterogeneity in the deposition hole and the time-dependent behavior of the surrounding rock mass.

  18. Performance Assessments of Generic Nuclear Waste Repositories in Shale

    NASA Astrophysics Data System (ADS)

    Stein, E. R.; Sevougian, S. D.; Mariner, P. E.; Hammond, G. E.; Frederick, J.

    2017-12-01

    Simulations of deep geologic disposal of nuclear waste in a generic shale formation showcase the Geologic Disposal Safety Assessment (GDSA) Framework, a toolkit for repository performance assessment (PA) whose capabilities include domain discretization (Cubit), multiphysics simulations (PFLOTRAN), uncertainty and sensitivity analysis (Dakota), and visualization (Paraview). GDSA Framework is used to conduct PAs of two generic repositories in shale. The first considers the disposal of 22,000 metric tons heavy metal of commercial spent nuclear fuel. The second considers disposal of defense-related spent nuclear fuel and high-level waste. Each PA accounts for the thermal load and radionuclide inventory of applicable waste types, components of the engineered barrier system, and components of the natural barrier system including the host rock shale and underlying and overlying stratigraphic units. Model domains are half-symmetry, gridded with Cubit, and contain between 7 and 22 million grid cells. Grid refinement captures the detail of individual waste packages, emplacement drifts, access drifts, and shafts. Simulations are run in a high-performance computing environment on as many as 2048 processes. Equations describing coupled heat and fluid flow and reactive transport are solved with PFLOTRAN, an open-source, massively parallel multiphase flow and reactive transport code. Additional simulated processes include waste package degradation, waste form dissolution, radioactive decay and ingrowth, sorption, solubility, advection, dispersion, and diffusion. Simulations are run to 10^6 y, and radionuclide concentrations are observed within aquifers at a point approximately 5 km downgradient of the repository. Dakota is used to sample likely ranges of input parameters, including waste form and waste package degradation rates and properties of engineered and natural materials, to quantify uncertainty in predicted concentrations and sensitivity to input parameters. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND2017-8305 A
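
    A minimal sketch of the sampling step described above, using a Latin hypercube from scipy in place of Dakota: each row becomes one PA realization. The parameter names and ranges are assumptions, not the study's actual specification.

      from scipy.stats import qmc

      # Illustrative uncertain inputs for a generic-shale PA ensemble; the names
      # and ranges are assumptions, not the study's actual Dakota specification.
      names = ["wp_degradation_rate_per_yr", "buffer_porosity", "shale_log10_perm_m2"]
      lower = [1e-6, 0.30, -20.0]
      upper = [1e-4, 0.45, -17.0]

      sampler = qmc.LatinHypercube(d=len(names), seed=42)
      samples = qmc.scale(sampler.random(n=8), lower, upper)

      for i, row in enumerate(samples):
          print(f"run {i:02d}: " + ", ".join(f"{n}={v:.3g}" for n, v in zip(names, row)))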

  19. Gas and water flow in an excavation-induced fracture network around an underground drift: A case study for a radioactive waste repository in clay rock

    NASA Astrophysics Data System (ADS)

    de La Vaissière, Rémi; Armand, Gilles; Talandier, Jean

    2015-02-01

    The Excavation Damaged Zone (EDZ) surrounding a drift, and in particular its evolution, is being studied for the performance assessment of a radioactive waste underground repository. A specific experiment (called CDZ) was designed and implemented in the Meuse/Haute-Marne Underground Research Laboratory (URL) in France to investigate the EDZ. This experiment is dedicated to studying the evolution of the EDZ hydrogeological properties (conductivity and specific storage) of the Callovo-Oxfordian claystone under mechanical compression and artificial hydration. Firstly, a loading cycle was applied on a drift wall to simulate the compression effect from bentonite swelling in a repository drift (bentonite is a clay material to be used to seal drifts and shafts for repository closure purposes). Gas tests (permeability tests with nitrogen and tracer tests with helium) were conducted during the first phase of the experiment. The results showed that the fracture network within the EDZ was initially interconnected and open for gas flow (particularly along the drift) and then progressively closed with the increasing mechanical stress applied on the drift wall. Moreover, the evolution of the EDZ after unloading indicated a self-sealing process. Secondly, the remaining fracture network was resaturated to demonstrate the ability of the COx claystone to self-seal without mechanical loading, by conducting 11 to 15 repetitive hydraulic tests with monitoring of the hydraulic parameters. During this hydration process, the EDZ effective transmissivity dropped due to the swelling of the clay materials near the fracture network. The hydraulic conductivity evolution was relatively fast during the first few days. Low conductivities of around 10^-10 m/s were observed after four months. Conversely, the specific storage showed an erratic evolution during the first phase of hydration (up to 60 days). Some uncertainty remains on this parameter due to volumetric strain during the sealing of the fractures. The hydration was stopped after one year and cross-hole hydraulic tests were performed to determine more accurately the specific storage as well as the hydraulic conductivity at a meter scale. All hydraulic conductivity values measured at the injection interval and at the observation intervals were below 10^-10 m/s. Moreover, the preferential inter-connectivity along the drift disappeared. Specific storage values at the observation and injection intervals were similar. Furthermore, they were in agreement with the value obtained at the injection interval within the second hydration phase (60 days after starting hydration). The graphical abstract synthesizes the evolution of the hydraulic/gas conductivity for 8 intervals since the beginning of the CDZ experiment. The conductivity limit of 10^-10 m/s corresponds to the lower-bound hydraulic definition of the EDZ, and it is demonstrated that the EDZ can be sealed. This is a significant result in the demonstration of the long-term safety of a repository.
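
    The strong drop in conductivity as EDZ fractures close can be illustrated with the parallel-plate ('cubic law') relation, in which equivalent conductivity scales with the cube of the aperture. The sketch below uses an assumed fracture spacing and illustrative apertures; it is not derived from the CDZ measurements.

      # Parallel-plate ("cubic law") relation between fracture aperture and flow,
      # as a rough illustration of why self-sealing sharply lowers conductivity.
      # All values are illustrative assumptions.
      RHO, G, MU = 1000.0, 9.81, 1.0e-3   # water density, gravity, viscosity (SI)
      SPACING_M = 0.5                      # assumed fracture spacing in the EDZ

      def equivalent_conductivity(aperture_m):
          # K = rho * g * b^3 / (12 * mu * s)
          return RHO * G * aperture_m**3 / (12.0 * MU * SPACING_M)

      for b_um in (100.0, 30.0, 10.0, 3.0):
          b = b_um * 1e-6
          print(f"aperture = {b_um:5.1f} um  ->  K ~ {equivalent_conductivity(b):.1e} m/s")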

  20. On the importance of coupled THM processes to predict the long-term response of a generic salt repository for high-level nuclear waste

    NASA Astrophysics Data System (ADS)

    Blanco Martin, L.; Rutqvist, J.; Birkholzer, J. T.

    2013-12-01

    Salt is a potential medium for the underground disposal of nuclear waste because it has several assets, in particular its ability to creep and heal fractures generated by excavation and its water and gas tightness in the undisturbed state. In this research, we focus on disposal of heat-generating nuclear waste (such as spent fuel) and we consider a generic salt repository with in-drift emplacement of waste packages and subsequent backfill of the drifts with run-of-mine crushed salt. As the natural salt creeps, the crushed salt backfill gets progressively compacted and an engineered barrier system is subsequently created. In order to evaluate the integrity of the natural and engineered barriers over the long term, it is important to consider the coupled effects of the thermal, hydraulic and mechanical processes that take place. In particular, the results obtained so far show how the porosity reduction of the crushed salt affects the saturation and pore pressure evolution throughout the repository, both in time and space. Such compaction is induced by the stress and temperature regime within the natural salt. Also, transport properties of the host rock are modified not only by thermo-mechanically and hydraulically induced damage processes, but also by healing/sealing of existing fractures. In addition, the THM properties of the backfill evolve towards those of the natural salt during the compaction process. All these changes are based on dedicated laboratory experiments and on theoretical considerations [1-3]. Different scenarios are modeled and compared to evaluate the relevance of different processes from the perspective of effective nuclear waste repositories. The sensitivity of the results to some parameters, such as capillarity, is also addressed. The simulations are conducted using an updated version of the TOUGH2-FLAC3D simulator, which is based on a sequential explicit method to couple flow and geomechanics [4]. A new capability for large strains and creep has been introduced and validated. The time-dependent geomechanical response of salt is determined using the Lux/Wolters constitutive model, developed at Clausthal University of Technology (Germany). References: [1] R. Wolters, and K.-H. Lux. Evaluation of Rock Salt Barriers with Respect to Tightness: Influence of Thermomechanical Damage, Fluid Infiltration and Sealing/Healing. Proceedings of the 7th International Conference on the Mechanical Behavior of Salt (SaltMech7). Paris: Balkema, Rotterdam (2012). [2] W. Bechthold et al., Backfilling and Sealing of Underground Repositories for Radioactive Waste in Salt (BAMBUS Project), European Atomic Energy Community, Report EUR19124 EN (1999). [3] J. Kim, E.L Sonnenthal and J. Rutqvist, 'Formulation and sequential numerical algorithms of coupled fluid/heat flow and geomechanics for multiple porosity materials', Int. J. Numer. Meth. Engng., 92, 425 (2012). [4] J. Rutqvist. Status of the TOUGH-FLAC simulator and recent applications related to coupled fluid flow and crustal deformations. Computational Geosciences, 37, 739-750 (2011).

  1. ABS-SmartComAgri: An Agent-Based Simulator of Smart Communication Protocols in Wireless Sensor Networks for Debugging in Precision Agriculture.

    PubMed

    García-Magariño, Iván; Lacuesta, Raquel; Lloret, Jaime

    2018-03-27

    Smart communication protocols are becoming a key mechanism for improving communication performance in networks such as wireless sensor networks. However, the literature lacks mechanisms for simulating smart communication protocols in precision agriculture for decreasing production costs. In this context, the current work presents an agent-based simulator of smart communication protocols for efficiently managing pesticides. The simulator considers the need for electric power, crop health, the percentage of live bugs and pesticide consumption. The current approach is illustrated with three different communication protocols, respectively called (a) broadcast, (b) neighbor and (c) low-cost neighbor. The low-cost neighbor protocol obtained a statistically significant reduction in the need for electric power over the neighbor protocol, with a very large difference according to the common interpretations of Cohen's d effect size. The presented simulator is called ABS-SmartComAgri and is freely distributed as open-source software from a public research data repository. It ensures the reproducibility of experiments and allows other researchers to extend the current approach.
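
    For reference, the effect-size measure cited above can be computed as in the short sketch below, which uses Cohen's d with a pooled standard deviation on synthetic energy-use numbers (not the ABS-SmartComAgri measurements).

      import numpy as np

      def cohens_d(a, b):
          """Cohen's d with pooled standard deviation."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          pooled_var = ((a.size - 1) * a.var(ddof=1) + (b.size - 1) * b.var(ddof=1)) \
                       / (a.size + b.size - 2)
          return (a.mean() - b.mean()) / np.sqrt(pooled_var)

      # Synthetic energy-use results (arbitrary units) for two protocols;
      # these are not the ABS-SmartComAgri measurements.
      neighbor       = [102, 98, 105, 99, 101, 103]
      low_cost_neigh = [81, 84, 79, 83, 80, 82]
      d = cohens_d(neighbor, low_cost_neigh)
      print(f"Cohen's d = {d:.1f} (values above ~1.2 are commonly labelled 'very large')")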

  2. ABS-SmartComAgri: An Agent-Based Simulator of Smart Communication Protocols in Wireless Sensor Networks for Debugging in Precision Agriculture

    PubMed Central

    2018-01-01

    Smart communication protocols are becoming a key mechanism for improving communication performance in networks such as wireless sensor networks. However, the literature lacks mechanisms for simulating smart communication protocols in precision agriculture for decreasing production costs. In this context, the current work presents an agent-based simulator of smart communication protocols for efficiently managing pesticides. The simulator considers the need for electric power, crop health, the percentage of live bugs and pesticide consumption. The current approach is illustrated with three different communication protocols, respectively called (a) broadcast, (b) neighbor and (c) low-cost neighbor. The low-cost neighbor protocol obtained a statistically significant reduction in the need for electric power over the neighbor protocol, with a very large difference according to the common interpretations of Cohen's d effect size. The presented simulator is called ABS-SmartComAgri and is freely distributed as open-source software from a public research data repository. It ensures the reproducibility of experiments and allows other researchers to extend the current approach. PMID:29584703

  3. BeatBox-HPC simulation environment for biophysically and anatomically realistic cardiac electrophysiology.

    PubMed

    Antonioletti, Mario; Biktashev, Vadim N; Jackson, Adrian; Kharche, Sanjay R; Stary, Tomas; Biktasheva, Irina V

    2017-01-01

    The BeatBox simulation environment combines a flexible scripting-language user interface with robust computational tools in order to set up cardiac electrophysiology in-silico experiments without low-level re-coding, so that cell excitation, tissue/anatomy models and stimulation protocols may be included in a BeatBox script, and simulations run either sequentially or in parallel (MPI) without re-compilation. BeatBox is free software written in the C language to be run on a Unix-based platform. It provides the whole spectrum of multi-scale tissue modelling, from 0-dimensional individual cell simulation, through 1-dimensional fibre, 2-dimensional sheet and 3-dimensional slab of tissue, up to anatomically realistic whole-heart simulations, with run-time measurements including cardiac re-entry tip/filament tracing, ECG, local/global samples of any variables, etc. BeatBox solver, cell, and tissue/anatomy model repositories are extended via robust and flexible interfaces, thus providing an open framework for new developments in the field. In this paper we give an overview of the current state of BeatBox, together with a description of the main computational methods and MPI parallelisation approaches.
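
    As a toy counterpart to the 1-dimensional fibre simulations mentioned above, the sketch below integrates a FitzHugh-Nagumo reaction-diffusion cable with explicit finite differences. It is not BeatBox (which is written in C and parallelised with MPI), and the kinetics and parameters are generic FHN choices rather than a BeatBox cell model.

      import numpy as np

      # Minimal 1-D fibre (cable) simulation with FitzHugh-Nagumo kinetics,
      # explicit finite differences. Parameter values are generic FHN choices.
      NX, DX = 200, 0.5          # grid points, space step
      DT, NSTEPS = 0.02, 8000    # time step, number of steps
      D = 0.3                    # diffusion coefficient
      A, EPS, BETA, GAMMA = 0.13, 0.004, 0.5, 1.0

      u = np.zeros(NX)           # excitation variable
      v = np.zeros(NX)           # recovery variable
      u[:5] = 1.0                # stimulate the left end of the fibre

      for step in range(NSTEPS):
          lap = np.zeros(NX)
          lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / DX**2
          lap[0] = 2.0 * (u[1] - u[0]) / DX**2     # no-flux boundaries
          lap[-1] = 2.0 * (u[-2] - u[-1]) / DX**2
          du = u * (1.0 - u) * (u - A) - v + D * lap
          dv = EPS * (BETA * u - GAMMA * v)
          u += DT * du
          v += DT * dv

      # Crude wavefront position: right-most point above half-activation
      active = np.where(u > 0.5)[0]
      if active.size:
          print(f"after {NSTEPS * DT:.0f} time units the wavefront is near x = {active.max() * DX:.1f}")
      else:
          print("no excited region above the 0.5 threshold at the end of the run")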

  4. Reactive transport studies at the Raymond Field Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freifeld, B.; Karasaki, K.; Solbau, R.

    1995-12-01

    To ensure the safety of a nuclear waste repository, an understanding of the transport of radionuclides from the repository near field to the biosphere is necessary. At the Raymond Field Site, in Raymond, California, tracer tests are being conducted to test characterization methods for fractured media and to evaluate the equipment and tracers that will be used for Yucca Mountain's fracture characterization. Recent tracer tests at Raymond have used reactive cations to demonstrate transport with sorption. A convective-dispersive model was used to simulate a two-well recirculating test with reasonable results. However, when the same model was used to simulate a radially convergent tracer test, the model poorly predicted the actual test data.

  5. 10 CFR 60.51 - License amendment for permanent closure.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...

  6. 10 CFR 60.51 - License amendment for permanent closure.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...

  7. 10 CFR 60.51 - License amendment for permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...

  8. 10 CFR 60.51 - License amendment for permanent closure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...

  9. 10 CFR 60.51 - License amendment for permanent closure.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...

  10. A Unified Framework for Brain Segmentation in MR Images

    PubMed Central

    Yazdani, S.; Yusof, R.; Karimian, A.; Riazi, A. H.; Bennamoun, M.

    2015-01-01

    Brain MRI segmentation is an important issue for discovering the brain structure and diagnosing subtle anatomical changes in different brain diseases. However, due to several artifacts, brain tissue segmentation remains a challenging task. The aim of this paper is to improve the automatic segmentation of the brain into gray matter, white matter, and cerebrospinal fluid in magnetic resonance images (MRI). We propose an automatic hybrid image segmentation method that integrates a modified statistical expectation-maximization (EM) method and spatial information combined with a support vector machine (SVM). The combined method gives more accurate results than can be achieved with its individual techniques, as demonstrated through experiments on both real data and simulated images. Experiments were carried out on both synthetic and real MRI. The results of the proposed technique were evaluated against manual segmentation results and other methods based on real T1-weighted scans from the Internet Brain Segmentation Repository (IBSR) and simulated images from BrainWeb. The Kappa index was calculated to assess the performance of the proposed framework relative to the ground truth and expert segmentations. The results demonstrate that the proposed combined method gives satisfactory results on both simulated MRI and real brain datasets. PMID:26089978
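
    The agreement measure used above can be reproduced with a few lines: the sketch below computes Cohen's kappa between two label arrays, using synthetic tissue labels rather than IBSR or BrainWeb data.

      import numpy as np

      def cohens_kappa(labels_a, labels_b):
          """Cohen's kappa between two label arrays of equal length."""
          a, b = np.asarray(labels_a).ravel(), np.asarray(labels_b).ravel()
          classes = np.union1d(a, b)
          observed = np.mean(a == b)
          expected = sum(np.mean(a == c) * np.mean(b == c) for c in classes)
          return (observed - expected) / (1.0 - expected)

      # Synthetic ground-truth vs. predicted tissue labels
      # (0 = background, 1 = CSF, 2 = GM, 3 = WM); not IBSR or BrainWeb data.
      rng = np.random.default_rng(3)
      truth = rng.integers(0, 4, size=10_000)
      pred = truth.copy()
      flip = rng.random(truth.size) < 0.08          # ~8% disagreement
      pred[flip] = rng.integers(0, 4, size=flip.sum())
      print(f"Cohen's kappa = {cohens_kappa(truth, pred):.3f}")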

  11. Institutional Repositories: The Experience of Master's and Baccalaureate Institutions

    ERIC Educational Resources Information Center

    Markey, Karen; St. Jean, Beth; Soo, Young Rieh; Yakel, Elizabeth; Kim, Jihyun

    2008-01-01

    In 2006, MIRACLE Project investigators conducted a census of library directors at all U.S. academic institutions about their activities in planning, pilot testing, and implementing institutional repositories on their campuses. Out of 446 respondents, 289 (64.8 percent) were from master's and baccalaureate institutions (M&BIs) where few operational…

  12. Research Students and the Loughborough Institutional Repository

    ERIC Educational Resources Information Center

    Pickton, Margaret; McKnight, Cliff

    2006-01-01

    This article investigates the potential role for research students in an institutional repository (IR). Face-to-face interviews with 34 research students at Loughborough University were carried out. Using a mixture of closed and open questions, the interviews explored the students' experiences and opinions of publishing, open access and the…

  13. Indian Institutional Repositories: A Study of User's Perspective

    ERIC Educational Resources Information Center

    Sawant, Sarika

    2012-01-01

    Purpose: The present study aims to investigate the experience, contribution and opinions of users of respective institutional repositories (IRs) developed in India. Design/methodology/approach: The survey method was used. The data collection tool was a web questionnaire, which was created with the help of software provided by surveymonkey.com…

  14. Collaborative Recommendation of E-Learning Resources: An Experimental Investigation

    ERIC Educational Resources Information Center

    Manouselis, N.; Vuorikari, R.; Van Assche, F.

    2010-01-01

    Repositories with educational resources can support the formation of online learning communities by providing a platform for collaboration. Users (e.g. teachers, tutors and learners) access repositories, search for interesting resources to access and use, and in many cases, also exchange experiences and opinions. A particular class of online…

  15. Methods Data Qualification Interim Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Sam Alessi; Tami Grimmett; Leng Vang

    The overall goal of the Next Generation Nuclear Plant (NGNP) Data Management and Analysis System (NDMAS) is to maintain data provenance for all NGNP data, including the Methods component of NGNP data. Multiple means are available to access data stored in NDMAS. A web portal environment allows users to access data, view the results of qualification tests, and view graphs and charts of various attributes of the data. NDMAS also has methods for the management of data output from VHTR simulation models and data generated from experiments designed to verify and validate the simulation codes. These simulation models represent the outcome of mathematical representations of VHTR components and systems. The methods data management approaches described herein will handle data that arise from experiment, simulation, and external sources for the main purpose of facilitating parameter estimation and model verification and validation (V&V). A model integration environment entitled ModelCenter is used to automate the storing of data from simulation model runs to the NDMAS repository. This approach does not adversely change the way computational scientists conduct their work. The method is to be used mainly to store the results of model runs that need to be preserved for auditing purposes or for display on the NDMAS web portal. This interim report describes the current development of NDMAS for Methods data and discusses the data, and their qualification, that are currently part of NDMAS.

  16. Monte Carlo simulations for generic granite repository studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, Shaoping; Lee, Joon H; Wang, Yifeng

    In a collaborative study between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL) for the DOE-NE Office of Fuel Cycle Technologies Used Fuel Disposition (UFD) Campaign project, we have conducted preliminary system-level analyses to support the development of a long-term strategy for geologic disposal of high-level radioactive waste. A general modeling framework consisting of a near-field and a far-field submodel for a granite GDSE was developed. A representative far-field transport model for a generic granite repository was merged with an integrated systems (GoldSim) near-field model. Integrated Monte Carlo model runs with the combined near- and far-field transport models were performed, and the parameter sensitivities were evaluated for the combined system. In addition, a subset of radionuclides that are potentially important to repository performance were identified and evaluated for a series of model runs. The analyses were conducted with different waste inventory scenarios, as well as with different repository radionuclide release scenarios. While the results to date are for a generic granite repository, the work establishes the method to be used in the future to provide guidance on the development of a strategy for long-term disposal of high-level radioactive waste in a granite repository.
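
    As an illustration of how parameter sensitivities can be screened in a Monte Carlo chain of this kind, the hedged sketch below ranks sampled inputs by Spearman correlation against a toy response. The parameter names, distributions, and surrogate peak-dose function are invented for illustration and are not the GoldSim near-field or far-field models used in the study.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(42)
        n = 1000

        # Hypothetical sampled inputs for a simplified near-/far-field chain.
        params = {
            "waste_degradation_rate": rng.lognormal(-18.0, 1.0, n),  # 1/s
            "fracture_velocity":      rng.lognormal(-9.0, 0.5, n),   # m/s
            "matrix_Kd":              rng.lognormal(-3.0, 1.0, n),   # m3/kg
        }

        # Stand-in response: a toy peak-dose surrogate, not the real transport model.
        peak_dose = (params["waste_degradation_rate"]
                     * params["fracture_velocity"]
                     / (1.0 + 1.0e3 * params["matrix_Kd"]))

        # Rank-correlation sensitivity of the output to each sampled input.
        for name, values in params.items():
            rho, _ = spearmanr(values, peak_dose)
            print(f"{name:24s} Spearman rho = {rho:+.2f}")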

  17. Simulator sickness research program at NASA-Ames Research Center

    NASA Technical Reports Server (NTRS)

    Mccauley, Michael E.; Cook, Anthony M.

    1987-01-01

    The simulator sickness syndrome is receiving increased attention in the simulation community. NASA-Ames Research Center has initiated a program to facilitate the exchange of information on this topic among the tri-services and other interested government organizations. The program objectives are to identify priority research issues, promote efficient research strategies, serve as a repository of information, and disseminate information to simulator users.

  18. Introducing sampling entropy in repository based adaptive umbrella sampling

    NASA Astrophysics Data System (ADS)

    Zheng, Han; Zhang, Yingkai

    2009-12-01

    Determining free energy surfaces along chosen reaction coordinates is a common and important task in simulating complex systems. Due to the complexity of energy landscapes and the existence of high barriers, one widely pursued objective in developing efficient simulation methods is to achieve uniform sampling among the thermodynamic states of interest. In this work, we have demonstrated sampling entropy (SE) as an excellent indicator for uniform sampling as well as for the convergence of free energy simulations. By introducing SE and the concentration theorem into the biasing-potential-updating scheme, we have further improved the adaptivity, robustness, and applicability of our recently developed repository based adaptive umbrella sampling (RBAUS) approach [H. Zheng and Y. Zhang, J. Chem. Phys. 128, 204106 (2008)]. Besides simulations of one-dimensional free energy profiles for various systems, the generality and efficiency of this new RBAUS-SE approach have been further demonstrated by determining two-dimensional free energy surfaces for the alanine dipeptide in the gas phase as well as in water.
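
    As a rough illustration of using an entropy measure to judge sampling uniformity along a reaction coordinate, the sketch below computes a normalized Shannon entropy of the visit histogram. This is an assumed stand-in, not necessarily the exact SE definition used in the RBAUS-SE scheme.

        import numpy as np

        def sampling_entropy(samples, bins=50, value_range=None):
            """Normalized Shannon entropy of the visit histogram.

            A value near 1 indicates nearly uniform sampling of the reaction
            coordinate; this is an illustrative measure, not the published formula.
            """
            counts, _ = np.histogram(samples, bins=bins, range=value_range)
            p = counts / counts.sum()
            p = p[p > 0]                          # avoid log(0)
            return float(-(p * np.log(p)).sum() / np.log(bins))

        rng = np.random.default_rng(0)
        print(sampling_entropy(rng.uniform(0.0, 1.0, 10000), value_range=(0, 1)))  # close to 1
        print(sampling_entropy(rng.normal(0.5, 0.05, 10000), value_range=(0, 1)))  # much lower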

  19. Mont Terri Underground Rock Laboratory, Switzerland-Research Program And Key Results

    NASA Astrophysics Data System (ADS)

    Nussbaum, C. O.; Bossart, P. J.

    2012-12-01

    Argillaceous formations generally act as aquitards because of their low hydraulic conductivities. This property, together with the large retention capacity of clays for cationic contaminants and the potential for self-sealing, has brought clay formations into focus as potential host rocks for the geological disposal of radioactive waste. Excavated in the Opalinus Clay formation, the Mont Terri underground rock laboratory in the Jura Mountains of NW Switzerland is an important international test site for researching clay formations. Research is carried out in the underground facility, which is located adjacent to the security gallery of the Mont Terri motorway tunnel. Fifteen partners from European countries, USA, Canada and Japan participate in the project. The objectives of the research program are to analyze the hydrogeological, geochemical and rock mechanical properties of the Opalinus Clay, to determine the changes induced by the excavation of galleries and by heating of the rock formation, to test sealing and container emplacement techniques and to evaluate and improve suitable investigation techniques. For the safety of deep geological disposal, it is of key importance to understand the processes occurring in the undisturbed argillaceous environment, as well as the processes in a disturbed system, during the operation of the repository. The objectives are related to: 1. Understanding processes and mechanisms in undisturbed clays and 2. Experiments related to repository-induced perturbations. Experiments of the first group are dedicated to: i) Improvement of drilling and excavation technologies and sampling methods; ii) Estimation of hydrogeological, rock mechanical and geochemical parameters of the undisturbed Opalinus Clay. Upscaling of parameters from laboratory to in situ scale; iii) Geochemistry of porewater and natural gases; evolution of porewater over time scales; iv) Assessment of long-term hydraulic transients associated with erosion and thermal scenarios and v) Evaluation of diffusion and retention parameters for long-lived radionuclides. Experiments related to repository-induced perturbations are focused on: i) Influence of rock liner on the disposal system and the buffering potential of the host rock; ii) Self-sealing processes in the excavation damaged zone; iii) Hydro-mechanical coupled processes (e.g. stress redistributions and pore pressure evolution during excavation); iv) Thermo-hydro-mechanical-chemical coupled processes (e.g. heating of bentonite and host rock) and v) Gas-induced transport of radionuclides in porewater and along interfaces in the engineered barrier system. A third research direction is to demonstrate the feasibility of repository construction and long-term safety after repository closure. Demonstration experiments can contribute to improving the reliability of the scientific basis for the safety assessment of future geological repositories, particularly if they are performed on a large scale and with a long duration. These experiments include the construction and installation of engineered barriers on a 1:1 scale: i) Horizontal emplacement of canisters; ii) Evaluation of the corrosion of container materials; repository re-saturation; iii) Sealing of boreholes and repository access tunnels and iv) Long-term monitoring of the repository. References Bossart, P. & Thury, M. (2008): Mont Terri Rock Laboratory. Project, Programme 1996 to 2007 and Results. - Rep. Swiss Geol. Surv. 3.

  20. Migration of the Gaudi and LHCb software repositories from CVS to Subversion

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Degaudenzi, H.; LHCb Collaboration

    2011-12-01

    A common code repository is of primary importance in a distributed development environment such as large HEP experiments. CVS (Concurrent Versions System) has been used in the past years at CERN for the hosting of shared software repositories, among which were the repositories for the Gaudi Framework and the LHCb software projects. Many developers around the world produced alternative systems to share code and revisions among several developers, mainly to overcome the limitations in CVS, and CERN has recently started a new service for code hosting based on the version control system Subversion. The differences between CVS and Subversion and the way the code was organized in Gaudi and LHCb CVS repositories required careful study and planning of the migration. Special care was used to define the organization of the new Subversion repository. To avoid as much as possible disruption in the development cycle, the migration has been gradual with the help of tools developed explicitly to hide the differences between the two systems. The principles guiding the migration steps, the organization of the Subversion repository and the tools developed will be presented, as well as the problems encountered both from the librarian and the user points of view.

  1. Digital Rocks Portal: a Sustainable Platform for Data Management, Analysis and Remote Visualization of Volumetric Images of Porous Media

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Ketcham, R. A.

    2017-12-01

    Nanometer- to centimeter-scale imaging such as (focused ion beam) scanning electron microscopy, magnetic resonance imaging and X-ray (micro)tomography has, since the 1990s, produced 2D and 3D datasets of rock microstructure that allow investigation of nonlinear flow and mechanical phenomena on length scales that are otherwise inaccessible to laboratory measurements. The numerical approaches that use such images produce various upscaled parameters required by subsurface flow and deformation simulators. All of this has revolutionized our knowledge of grain-scale phenomena. However, a lack of data-sharing infrastructure among research groups makes it difficult to integrate different length scales. We have developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (https://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of engineering or geosciences researchers not necessarily trained in computer science or data analysis. Digital Rocks Portal (NSF EarthCube Grant 1541008) is the first repository for imaged porous microstructure data. It is implemented within the reliable, 24/7 maintained High Performance Computing Infrastructure supported by the Texas Advanced Computing Center (University of Texas at Austin). Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative. We show how the data can be documented, referenced in publications via digital object identifiers, visualized, searched for and linked to other repositories. We show the recently implemented integration of remote parallel visualization, bulk upload for large datasets, and a preliminary flow simulation workflow with the pore structures currently stored in the repository. We discuss the issues of collecting correct metadata, data discoverability and repository sustainability.

  2. Communicating the Value of an Institutional Repository: Experiences at Ghana's University for Development Studies

    ERIC Educational Resources Information Center

    Thompson, Edwin S.; Akeriwe, Miriam Linda; Aikins, Angela Achia

    2016-01-01

    The quality of research depends greatly on access to existing information. Institutional repositories (IRs) have the potential to enhance and promote the dissemination of knowledge and research. This may lead to discoveries and innovation alongside maximizing return on investment in research and development. Following some background information,…

  3. Pretest reference calculation for the overtest for simulated defense high level waste (WIPP) Room B in situ experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, H.S.; Stone, C.M.

    A pretest reference calculation for the Overtest for Simulated Defense High-Level Waste (DHLW), or Room B, experiment is presented in this report. The overtest is one of several large-scale, in-situ experiments currently under construction near Carlsbad, New Mexico at the site of the Waste Isolation Pilot Plant (WIPP). Room B, a single isolated room in the underground salt formation, is to be subjected to a thermal load of approximately four times the areal heat output anticipated for a future repository with DHLW. The load will be supplied for 3 years by canister heaters placed in the floor. Room B is heavily instrumented for monitoring both temperature increases due to the thermal loading and deformations due to creep of the salt. Data from the experiment are not available at the present time, but the measurements will eventually be compared to the results presented here to assess and improve thermal and mechanical modeling capabilities for the WIPP. The thermal/structural model used here represents the state of the art at the present time. A large number of plots are included since an appropriate result is presented for every Room B gauge location. 81 figs., 4 tabs.

  4. The experiment editor: supporting inquiry-based learning with virtual labs

    NASA Astrophysics Data System (ADS)

    Galan, D.; Heradio, R.; de la Torre, L.; Dormido, S.; Esquembre, F.

    2017-05-01

    Inquiry-based learning is a pedagogical approach where students are motivated to pose their own questions when facing problems or scenarios. In physics learning, students are turned into scientists who carry out experiments, collect and analyze data, formulate and evaluate hypotheses, and so on. Lab experimentation is essential for inquiry-based learning, yet a drawback of traditional hands-on labs is the high cost associated with equipment, space, and maintenance staff. Virtual laboratories help to reduce these costs. This paper enriches the virtual lab ecosystem by providing an integrated environment to automate experimentation tasks. In particular, our environment supports: (i) scripting and running experiments on virtual labs, and (ii) collecting and analyzing data from the experiments. The current implementation of our environment supports virtual labs created with the authoring tool Easy Java/Javascript Simulations. Since there are public repositories with hundreds of freely available labs created with this tool, the potential applicability of our environment is considerable.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit

    In this report, we present FY2014 progress by Lawrence Berkeley National Laboratory (LBNL) related to modeling of coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. LBNL's work on the modeling of coupled THMC processes in salt was initiated in FY2012, focusing on exploring and demonstrating the capabilities of an existing LBNL modeling tool (TOUGH-FLAC) for simulating temperature-driven coupled flow and geomechanical processes in salt. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. Here we provide more details on the FY2014 work, first presenting updated tools and improvements made to the TOUGH-FLAC simulator, and the use of this updated tool in a new model simulation of long-term THM behavior within a generic repository in a salt formation. This is followed by a description of current benchmarking and validation efforts, including the TSDE experiment. We then present the current status in the development of constitutive relationships and the dual-continuum model for brine migration. We conclude with an outlook for FY2015, which will focus on model validation against field experiments and on the use of the model for design studies related to a proposed heater experiment.

  6. Colloid formation during waste form reaction: Implications for nuclear waste disposal

    USGS Publications Warehouse

    Bates, J. K.; Bradley, J.; Teetsov, A.; Bradley, C. R.; Buchholtz ten Brink, Marilyn R.

    1992-01-01

    Insoluble plutonium- and americium-bearing colloidal particles formed during simulated weathering of a high-level nuclear waste glass. Nearly 100 percent of the total plutonium and americium in test ground water was concentrated in these submicrometer particles. These results indicate that models of actinide mobility and repository integrity, which assume complete solubility of actinides in ground water, underestimate the potential for radionuclide release into the environment. A colloid-trapping mechanism may be necessary for a waste repository to meet long-term performance specifications.

  7. Rapid methods for radionuclide contaminant transport in nuclear fuel cycle simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huff, Kathryn

    Here, nuclear fuel cycle and nuclear waste disposal decisions are technologically coupled. However, current nuclear fuel cycle simulators lack dynamic repository performance analysis due to the computational burden of high-fidelity hydrologic contaminant transport models. The Cyder disposal environment and repository module was developed to fill this gap. It implements medium-fidelity hydrologic radionuclide transport models to support assessment appropriate for fuel cycle simulation in the Cyclus fuel cycle simulator. Rapid modeling of hundreds of discrete waste packages in a geologic environment is enabled within this module by a suite of four closed-form models for advective, dispersive, coupled, and idealized contaminant transport: a Degradation Rate model, a Mixed Cell model, a Lumped Parameter model, and a 1-D Permeable Porous Medium model. A summary of the Cyder module, its timestepping algorithm, and the mathematical models implemented within it is presented. Additionally, parametric demonstration simulations performed with Cyder are presented and shown to be in functional agreement with parametric simulations conducted in a standalone hydrologic transport model, the Clay Generic Disposal System Model developed by the Department of Energy Office of Nuclear Energy Used Fuel Disposition Campaign.

  8. Rapid methods for radionuclide contaminant transport in nuclear fuel cycle simulation

    DOE PAGES

    Huff, Kathryn

    2017-08-01

    Here, nuclear fuel cycle and nuclear waste disposal decisions are technologically coupled. However, current nuclear fuel cycle simulators lack dynamic repository performance analysis due to the computational burden of high-fidelity hydrologic contaminant transport models. The Cyder disposal environment and repository module was developed to fill this gap. It implements medium-fidelity hydrologic radionuclide transport models to support assessment appropriate for fuel cycle simulation in the Cyclus fuel cycle simulator. Rapid modeling of hundreds of discrete waste packages in a geologic environment is enabled within this module by a suite of four closed-form models for advective, dispersive, coupled, and idealized contaminant transport: a Degradation Rate model, a Mixed Cell model, a Lumped Parameter model, and a 1-D Permeable Porous Medium model. A summary of the Cyder module, its timestepping algorithm, and the mathematical models implemented within it is presented. Additionally, parametric demonstration simulations performed with Cyder are presented and shown to be in functional agreement with parametric simulations conducted in a standalone hydrologic transport model, the Clay Generic Disposal System Model developed by the Department of Energy Office of Nuclear Energy Used Fuel Disposition Campaign.
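
    To make the idea of a closed-form source-term model concrete, here is a minimal, hypothetical degradation-rate release loop in Python. It is an illustrative sketch under assumed parameter values, not Cyder's actual Degradation Rate implementation.

        import numpy as np

        def degradation_rate_release(inventory0, deg_rate, decay_const, dt, n_steps):
            """Illustrative degradation-rate source term (not Cyder's code).

            Each step, a fraction deg_rate*dt of the remaining inventory is
            released congruently while the whole inventory decays with
            decay_const (both in 1/yr, dt in yr).
            """
            inventory = inventory0
            releases = []
            for _ in range(n_steps):
                inventory *= np.exp(-decay_const * dt)        # radioactive decay
                released = inventory * min(deg_rate * dt, 1.0)
                inventory -= released
                releases.append(released)
            return np.array(releases)

        # 1 mol of a long-lived nuclide, 1e-4/yr degradation, Tc-99-like half-life.
        rel = degradation_rate_release(1.0, 1.0e-4, np.log(2.0) / 2.1e5, dt=10.0, n_steps=1000)
        print(rel[:3], rel.sum())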

  9. Performance Assessment of a Generic Repository in Bedded Salt for DOE-Managed Nuclear Waste

    NASA Astrophysics Data System (ADS)

    Stein, E. R.; Sevougian, S. D.; Hammond, G. E.; Frederick, J. M.; Mariner, P. E.

    2016-12-01

    A mined repository in salt is one of the concepts under consideration for disposal of DOE-managed defense-related spent nuclear fuel (SNF) and high level waste (HLW). Bedded salt is a favorable medium for disposal of nuclear waste due to its low permeability, high thermal conductivity, and ability to self-heal. Sandia's Generic Disposal System Analysis framework is used to assess the ability of a generic repository in bedded salt to isolate radionuclides from the biosphere. The performance assessment considers multiple waste types of varying thermal load and radionuclide inventory, the engineered barrier system comprising the waste packages, backfill, and emplacement drifts, and the natural barrier system formed by a bedded salt deposit and the overlying sedimentary sequence (including an aquifer). The model simulates disposal of nearly the entire inventory of DOE-managed, defense-related SNF (excluding Naval SNF) and HLW in a half-symmetry domain containing approximately 6 million grid cells. Grid refinement captures the detail of 25,200 individual waste packages in 180 disposal panels, associated access halls, and 4 shafts connecting the land surface to the repository. Equations describing coupled heat and fluid flow and reactive transport are solved numerically with PFLOTRAN, a massively parallel flow and transport code. Simulated processes include heat conduction and convection, waste package failure, waste form dissolution, radioactive decay and ingrowth, sorption, solubility limits, advection, dispersion, and diffusion. Simulations are run to 1 million years, and radionuclide concentrations are observed within an aquifer at a point approximately 4 kilometers downgradient of the repository. The software package DAKOTA is used to sample likely ranges of input parameters including waste form dissolution rates and properties of engineered and natural materials in order to quantify uncertainty in predicted concentrations and sensitivity to input parameters. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  10. Clustering molecular dynamics trajectories for optimizing docking experiments.

    PubMed

    De Paris, Renata; Quevedo, Christian V; Ruiz, Duncan D; Norberto de Souza, Osmar; Barros, Rodrigo C

    2015-01-01

    Molecular dynamics simulations of protein receptors have become an attractive tool for rational drug discovery. However, the high computational cost of employing molecular dynamics trajectories in virtual screening of large repositories threatens the feasibility of this task. Computational intelligence techniques have been applied in this context, with the ultimate goal of reducing the overall computational cost so the task can become feasible. Particularly, clustering algorithms have been widely used as a means to reduce the dimensionality of molecular dynamics trajectories. In this paper, we develop a novel methodology for clustering entire trajectories using structural features from the substrate-binding cavity of the receptor in order to optimize docking experiments on a cloud-based environment. The resulting partition was selected based on three clustering validity criteria, and it was further validated by analyzing the interactions between 20 ligands and a fully flexible receptor (FFR) model containing a 20 ns molecular dynamics simulation trajectory. Our proposed methodology shows that taking into account features of the substrate-binding cavity as input for the k-means algorithm is a promising technique for accurately selecting ensembles of representative structures tailored to a specific ligand.
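
    A minimal sketch of the general approach, clustering per-snapshot cavity features with k-means and picking k by a silhouette criterion, is given below. The feature matrix is random placeholder data, and the workflow is simplified relative to the paper's three validity criteria.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        rng = np.random.default_rng(0)
        # Placeholder per-snapshot features of the substrate-binding cavity
        # (e.g. cavity volume, gate distances); real features come from the MD trajectory.
        features = rng.normal(size=(2000, 6))

        best_k, best_score = None, -1.0
        for k in range(2, 8):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
            score = silhouette_score(features, labels)  # one possible validity criterion
            if score > best_score:
                best_k, best_score = k, score

        print(f"selected k = {best_k} (silhouette = {best_score:.2f})")
        # Representative snapshots per cluster would then feed the docking runs.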

  11. Albedo Neutron Dosimetry in a Deep Geological Disposal Repository for High-Level Nuclear Waste.

    PubMed

    Pang, Bo; Becker, Frank

    2017-04-28

    The albedo neutron dosemeter is the official German personal neutron dosemeter for mixed radiation fields in which neutrons contribute to the personal dose. Since neutrons can dominate the radiation field in deep geological repositories for high-level nuclear waste, it is of interest to investigate the performance of the albedo neutron dosemeter in such facilities. In this study, the deep geological repository is represented by a shielding cask loaded with spent nuclear fuel placed inside a rock salt emplacement drift. Due to the backscattering of neutrons in the drift, issues concerning calibration of the dosemeter arise. Field-specific calibration of the albedo neutron dosemeter was hence performed with Monte Carlo simulations. In order to assess the applicability of the albedo neutron dosemeter in a deep geological repository over a long time scale, spent nuclear fuel with ages of 50, 100 and 500 years was investigated. It was found that the neutron radiation field in a deep geological repository can be assigned to application area 'N1' of the albedo neutron dosemeter, which is typical of reactors and accelerators with heavy shielding.
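
    At its core, a field-specific calibration is a ratio between the simulated personal dose equivalent and the simulated dosemeter response for the same field. The numbers below are entirely made up and only illustrate the arithmetic, not the paper's results.

        # Hypothetical Monte Carlo outputs for one drift configuration.
        hp10_per_neutron = 3.0e-16      # Sv of Hp(10) per source neutron (assumed)
        counts_per_neutron = 6.0e-12    # dosemeter counts per source neutron (assumed)

        calibration_factor = hp10_per_neutron / counts_per_neutron   # Sv per count

        field_reading = 40.0            # hypothetical accumulated counts over a work period
        print(f"Hp(10) = {calibration_factor * field_reading * 1e3:.2f} mSv")  # 2.00 mSv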

  12. Radiation induced corrosion of copper for spent nuclear fuel storage

    NASA Astrophysics Data System (ADS)

    Björkbacka, Åsa; Hosseinpour, Saman; Johnson, Magnus; Leygraf, Christofer; Jonsson, Mats

    2013-11-01

    The long-term safety of repositories for radioactive waste is one of the main concerns for countries utilizing nuclear power. The integrity of engineered and natural barriers in such repositories must be carefully evaluated in order to minimize the release of radionuclides to the biosphere. One of the most developed concepts for long-term storage of spent nuclear fuel is the Swedish KBS-3 method. According to this method, the spent fuel will be sealed inside copper canisters surrounded by bentonite clay and placed 500 m down in stable bedrock. Despite the importance of radiation-induced corrosion of copper, relatively few studies have been reported. In this work the effect of the total gamma dose on radiation-induced corrosion of copper in anoxic pure water has been studied experimentally. Copper samples submerged in water were exposed to a series of total doses using three different dose rates. Unirradiated samples were used as reference samples throughout. The copper surfaces were examined qualitatively using IRAS and XPS and quantitatively using cathodic reduction. The concentration of copper in solution after irradiation was measured using ICP-AES. The influence of aqueous radiation chemistry on the corrosion process was evaluated based on numerical simulations. The experiments show that both the dissolution and the oxide layer thickness increase upon irradiation. Interestingly, the evaluation using numerical simulations indicates that aqueous radiation chemistry is not the only process driving the corrosion of copper in these systems.

  13. New directions in medical e-curricula and the use of digital repositories.

    PubMed

    Fleiszer, David M; Posel, Nancy H; Steacy, Sean P

    2004-03-01

    Medical educators involved in the growth of multimedia-enhanced e-curricula are increasingly aware of the need for digital repositories to catalogue, store and ensure access to learning objects that are integrated within their online material. The experience at the Faculty of Medicine at McGill University during initial development of a mainstream electronic curriculum reflects this growing recognition that repositories can facilitate the development of a more comprehensive as well as effective electronic curriculum. Digital repositories can also help to ensure efficient utilization of resources through the use, re-use, and reprocessing of multimedia learning objects, addressing the potential for collaboration among repositories and increasing available material exponentially. The authors review different approaches to the development of a digital repository application, as well as global and specific issues that should be examined in the initial requirements definition and development phase, to ensure current initiatives meet long-term requirements. Often, decisions regarding creation of e-curricula and associated digital repositories are left to interested faculty and their individual development teams. However, the development of an e-curriculum and digital repository is not predominantly a technical exercise, but rather one that affects global pedagogical strategies and curricular content and involves a commitment of large-scale resources. Outcomes of these decisions can have long-term consequences and, as such, should involve faculty at the highest levels, including the dean.

  14. The distributed production system of the SuperB project: description and results

    NASA Astrophysics Data System (ADS)

    Brown, D.; Corvo, M.; Di Simone, A.; Fella, A.; Luppi, E.; Paoloni, E.; Stroili, R.; Tomassetti, L.

    2011-12-01

    The SuperB experiment needs large samples of Monte Carlo simulated events in order to finalize the detector design and to estimate the data analysis performance. The requirements are beyond the capabilities of a single computing farm, so a distributed production model capable of exploiting the existing HEP worldwide distributed computing infrastructure is needed. In this paper we describe the set of tools that have been developed to manage the production of the required simulated events. The production of events follows three main phases: distribution of input data files to the remote site Storage Elements (SE); job submission, via the SuperB GANGA interface, to all available remote sites; and transfer of output files to the CNAF repository. The job workflow includes procedures for consistency checking, monitoring, data handling and bookkeeping. A replication mechanism allows storing the job output on the local site SE. Results from the 2010 official productions are reported.

  15. KUTE-BASE: storing, downloading and exporting MIAME-compliant microarray experiments in minutes rather than hours.

    PubMed

    Draghici, Sorin; Tarca, Adi L; Yu, Longfei; Ethier, Stephen; Romero, Roberto

    2008-03-01

    The BioArray Software Environment (BASE) is a very popular MIAME-compliant, web-based microarray data repository. However, in BASE, as in most other microarray data repositories, experiment annotation and raw data uploading can be very time-consuming, especially for large microarray experiments. We developed KUTE (Karmanos Universal daTabase for microarray Experiments) as a plug-in for BASE 2.0 that addresses these issues. KUTE provides an automatic experiment annotation feature and a completely redesigned data workflow that dramatically reduce the human-computer interaction time. For instance, in BASE 2.0 a typical Affymetrix experiment involving 100 arrays required 4 h 30 min of user interaction time for experiment annotation, and 45 min for data upload/download. In contrast, for the same experiment, KUTE required only 28 min of user interaction time for experiment annotation, and 3.3 min for data upload/download. http://vortex.cs.wayne.edu/kute/index.html.

  16. Methods for Probabilistic Radiological Dose Assessment at a High-Level Radioactive Waste Repository.

    NASA Astrophysics Data System (ADS)

    Maheras, Steven James

    Methods were developed to assess and evaluate the uncertainty in offsite and onsite radiological dose at a high-level radioactive waste repository to show reasonable assurance that compliance with applicable regulatory requirements will be achieved. Uncertainty in offsite dose was assessed by employing a stochastic precode in conjunction with Monte Carlo simulation using an offsite radiological dose assessment code. Uncertainty in onsite dose was assessed by employing a discrete-event simulation model of repository operations in conjunction with an occupational radiological dose assessment model. Complementary cumulative distribution functions of offsite and onsite dose were used to illustrate reasonable assurance. Offsite dose analyses were performed for iodine -129, cesium-137, strontium-90, and plutonium-239. Complementary cumulative distribution functions of offsite dose were constructed; offsite dose was lognormally distributed with a two order of magnitude range. However, plutonium-239 results were not lognormally distributed and exhibited less than one order of magnitude range. Onsite dose analyses were performed for the preliminary inspection, receiving and handling, and the underground areas of the repository. Complementary cumulative distribution functions of onsite dose were constructed and exhibited less than one order of magnitude range. A preliminary sensitivity analysis of the receiving and handling areas was conducted using a regression metamodel. Sensitivity coefficients and partial correlation coefficients were used as measures of sensitivity. Model output was most sensitive to parameters related to cask handling operations. Model output showed little sensitivity to parameters related to cask inspections.
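
    A complementary cumulative distribution function of Monte Carlo dose results can be assembled in a few lines; the lognormal samples and comparison level in the sketch below are placeholders, not the study's actual distributions.

        import numpy as np

        rng = np.random.default_rng(1)
        # Placeholder lognormal offsite doses (Sv); real values would come from the
        # stochastic precode + offsite dose assessment code runs described above.
        doses = rng.lognormal(mean=np.log(1e-6), sigma=1.5, size=5000)

        # Complementary cumulative distribution function: P(dose > d).
        d_sorted = np.sort(doses)
        ccdf = 1.0 - np.arange(1, d_sorted.size + 1) / d_sorted.size

        limit = 1e-4                                   # hypothetical comparison level, Sv
        print("P(dose > limit):", float(np.interp(limit, d_sorted, ccdf)))
        print("Dose exceeded with 5% probability (Sv):", float(d_sorted[ccdf <= 0.05][0]))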

  17. Principles of Product Quality Control of German Radioactive Waste Forms from the Reprocessing of Spent Fuel: Vitrification, Compaction and Numerical Simulation - 12529

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tietze-Jaensch, Holger; Schneider, Stephan; Aksyutina, Yuliya

    2012-07-01

    German product quality control is responsible, inter alia, for two forms of heat-generating radioactive waste: a) homogeneous vitrified HLW and b) heterogeneous compacted hulls, end-pieces, and technological metallic waste. In either case, significantly different metrology is employed at the site of the conditioning plant for the obligatory nuclide inventory declaration. To facilitate independent evaluation and checking of the accompanying documentation, numerical simulations are carried out. The physical and chemical properties of radioactive waste residues are used to assess data consistency and uncertainty margins, as well as to predict the long-term behavior of the radioactive waste. This is relevant for repository acceptance and safety considerations. Our new numerical approach follows a bottom-up simulation starting from the burn-up behavior of the fuel elements in the reactor core. The output of these burn-up calculations is then coupled with a program that simulates the material separation in the subsequent dissolution and extraction processes, normalized to the mass balance. Follow-up simulations of the separate reprocessing lines, namely a) vitrification of the highly active liquid and b) compaction of the residual intermediate-active metallic hulls remaining after fuel pellet dissolution, end-pieces, and technological waste, allow expectation values to be calculated for the various repository-relevant properties of either waste stream. The principles of German product quality control of radioactive waste residues from spent fuel reprocessing have been introduced and explained, namely for heat-generating homogeneous vitrified HLW and heterogeneous compacted metallic MLW. The advantages of a complementary numerical property simulation have been made clear and examples of its benefits are presented. We have compiled a new program suite to calculate the physical and radiochemical properties of common nuclear waste residues. The immediate benefit is the independent assessment of radioactive inventory declarations and much-facilitated product quality control of waste residues that need to be returned to Germany and must satisfy German HLW-repository requirements. Wherever possible, internationally accepted standard programs are used and embedded. The coupling of burn-up calculations (SCALE) with neutron and gamma transport codes (MCNP-X) allows application to virtual waste properties: if-then-else scenarios of hypothetical waste material compositions and distributions provide valuable information on nuclide property propagation under repository conditions over very long time spans. Benchmarking the program against real residue data demonstrates the power and accuracy of this numerical approach, boosting confidence in the applications mentioned above, namely a tool set for on-the-spot product quality checking, data evaluation, and independent verification. Moreover, the numerical bottom-up approach helps to avoid the accumulation of fictitious activities that may gradually build up in a repository from so-called conservative or penalizing nuclide inventory declarations. The radioactive waste properties and their hydrolytic and chemical stability can be predicted, the interaction with invasive chemicals can be assessed, and propagation scenarios can be developed from reliable and sound data and HLW properties. Hence, the appropriate design of a future HLW repository can be based upon predictable and quality-assured waste characteristics. (authors)

  18. Reproducibility in Computational Neuroscience Models and Simulations

    PubMed Central

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators; 2. shared computational resources; 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  19. Beyond the Repository: A Mixed Method Approach to Providing Access to Collections Online

    ERIC Educational Resources Information Center

    Garrison, Brian Wade

    2013-01-01

    After providing access to over 100 video interviews conducted by a professor with notable entertainers and personalities from film through an institutional repository, an experiment was conducted to discover whether a larger audience could be gained by adding a subset of 32 of these videos to YouTube. The results, over 400,000 views, indicate that…

  20. Probabilistic Analyses of Waste Package Quantities Impacted by Potential Igneous Disruption at Yucca Mountain

    NASA Astrophysics Data System (ADS)

    Wallace, M. G.; Iuzzolina, H.

    2005-12-01

    A probabilistic analysis was conducted to estimate ranges for the numbers of waste packages that could be damaged in a potential future igneous event through a repository at Yucca Mountain. The analysis includes disruption from an intrusive igneous event and from an extrusive volcanic event. This analysis supports the evaluation of the potential consequences of future igneous activity as part of the total system performance assessment for the license application for the Yucca Mountain Project (YMP). The first scenario, igneous intrusion, investigated the case where one or more igneous dikes intersect the repository. A swarm of dikes was characterized by distributions of length, width, azimuth, and number of dikes and the spacings between them. Through the use, in part, of a Latin hypercube simulator and a modified video game engine, mathematical relationships were built between those parameters and the number of waste packages hit. Corresponding cumulative distribution function (CDF) curves for the number of waste packages hit under several different scenarios were calculated. Variations in dike thickness ranges, as well as in repository magma bulkhead positions, were examined through sensitivity studies. It was assumed that all waste packages in an emplacement drift would be impacted if that drift was intersected by a dike. Over 10,000 individual simulations were performed. Based on these calculations, out of a total of over 11,000 planned waste packages distributed over an area of approximately 5.5 km2, the median number of waste packages impacted was roughly 1/10 of the total. Individual cases ranged from 0 waste packages to the entire inventory being impacted. The igneous intrusion analysis involved an explicit characterization of dike-drift intersections, built upon various distributions that reflect the uncertainties associated with the inputs. The second igneous scenario, volcanic eruption (eruptive conduits), considered the effects of conduits formed in association with a volcanic eruption through the repository. Mathematical relations were built between the resulting conduit areas and the fraction of the repository area occupied by waste packages. This relation was used in conjunction with a joint distribution incorporating variability in eruptive conduit diameters and in the number of eruptive conduits that could intersect the repository.
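
    The stratified sampling step can be sketched with SciPy's Latin hypercube generator. The parameter names and ranges below are placeholders rather than the distributions used in the license-application analysis, and the geometry model that maps each sample to a number of waste packages hit is omitted.

        import numpy as np
        from scipy.stats import qmc

        # Hypothetical dike-swarm parameters (ranges are placeholders, not YMP values).
        names = ["dike_length_km", "dike_width_m", "azimuth_deg", "n_dikes"]
        lower = [2.0, 1.0, 0.0, 1.0]
        upper = [12.0, 10.0, 180.0, 10.0]

        sampler = qmc.LatinHypercube(d=len(names), seed=0)
        unit = sampler.random(n=10000)              # stratified samples on [0, 1)^d
        samples = qmc.scale(unit, lower, upper)     # map to physical ranges

        # A geometry model would turn each sample into "waste packages hit";
        # here we only report the sampled marginals (n_dikes would be rounded).
        for i, name in enumerate(names):
            print(f"{name:15s} median = {np.median(samples[:, i]):.2f}")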

  1. OntoVIP: an ontology for the annotation of object models used for medical image simulation.

    PubMed

    Gibaud, Bernard; Forestier, Germain; Benoit-Cattin, Hugues; Cervenansky, Frédéric; Clarysse, Patrick; Friboulet, Denis; Gaignard, Alban; Hugonnard, Patrick; Lartizien, Carole; Liebgott, Hervé; Montagnat, Johan; Tabary, Joachim; Glatard, Tristan

    2014-12-01

    This paper describes the creation of a comprehensive conceptualization of object models used in medical image simulation, suitable for major imaging modalities and simulators. The goal is to create an application ontology that can be used to annotate the models in a repository integrated in the Virtual Imaging Platform (VIP), to facilitate their sharing and reuse. Annotations make the anatomical, physiological and pathophysiological content of the object models explicit. In such an interdisciplinary context we chose to rely on a common integration framework provided by a foundational ontology, that facilitates the consistent integration of the various modules extracted from several existing ontologies, i.e. FMA, PATO, MPATH, RadLex and ChEBI. Emphasis is put on methodology for achieving this extraction and integration. The most salient aspects of the ontology are presented, especially the organization in model layers, as well as its use to browse and query the model repository.

  2. Documenting Climate Models and Their Simulations

    DOE PAGES

    Guilyardi, Eric; Balaji, V.; Lawrence, Bryan; ...

    2013-05-01

    The results of climate models are of increasing and widespread importance. No longer is climate model output of sole interest to climate scientists and researchers in the climate change impacts and adaptation fields. Now nonspecialists such as government officials, policy makers, and the general public all have an increasing need to access climate model output and understand its implications. For this host of users, accurate and complete metadata (i.e., information about how and why the data were produced) is required to document the climate modeling results. We describe a pilot community initiative to collect and make available documentation of climate models and their simulations. In an initial application, a metadata repository is being established to provide information of this kind for a major internationally coordinated modeling activity known as CMIP5 (Coupled Model Intercomparison Project, Phase 5). We expect that for a wide range of stakeholders, this and similar community-managed metadata repositories will spur development of analysis tools that facilitate discovery and exploitation of Earth system simulations.

  3. Terminology development towards harmonizing multiple clinical neuroimaging research repositories.

    PubMed

    Turner, Jessica A; Pasquerello, Danielle; Turner, Matthew D; Keator, David B; Alpert, Kathryn; King, Margaret; Landis, Drew; Calhoun, Vince D; Potkin, Steven G; Tallis, Marcelo; Ambite, Jose Luis; Wang, Lei

    2015-07-01

    Data sharing and mediation across disparate neuroimaging repositories requires extensive effort to ensure that the different domains of data types are referred to by commonly agreed upon terms. Within the SchizConnect project, which enables querying across decentralized databases of neuroimaging, clinical, and cognitive data from various studies of schizophrenia, we developed a model for each data domain, identified common usable terms that could be agreed upon across the repositories, and linked them to standard ontological terms where possible. We had the goal of facilitating both the current user experience in querying and future automated computations and reasoning regarding the data. We found that existing terminologies are incomplete for these purposes, even with the history of neuroimaging data sharing in the field; and we provide a model for efforts focused on querying multiple clinical neuroimaging repositories.

  4. Terminology development towards harmonizing multiple clinical neuroimaging research repositories

    PubMed Central

    Turner, Jessica A.; Pasquerello, Danielle; Turner, Matthew D.; Keator, David B.; Alpert, Kathryn; King, Margaret; Landis, Drew; Calhoun, Vince D.; Potkin, Steven G.; Tallis, Marcelo; Ambite, Jose Luis; Wang, Lei

    2015-01-01

    Data sharing and mediation across disparate neuroimaging repositories requires extensive effort to ensure that the different domains of data types are referred to by commonly agreed upon terms. Within the SchizConnect project, which enables querying across decentralized databases of neuroimaging, clinical, and cognitive data from various studies of schizophrenia, we developed a model for each data domain, identified common usable terms that could be agreed upon across the repositories, and linked them to standard ontological terms where possible. We had the goal of facilitating both the current user experience in querying and future automated computations and reasoning regarding the data. We found that existing terminologies are incomplete for these purposes, even with the history of neuroimaging data sharing in the field; and we provide a model for efforts focused on querying multiple clinical neuroimaging repositories. PMID:26688838

  5. Modeling transient heat transfer in nuclear waste repositories.

    PubMed

    Yang, Shaw-Yang; Yeh, Hund-Der

    2009-09-30

    Heat from high-level nuclear waste is generated and released from the canister at final disposal sites. This waste heat may affect the engineering properties of waste canisters, buffers, and backfill material in the emplacement tunnel and the host rock. This study addresses the problem of the heat generated from the waste canister and analyzes the heat distribution between the buffer and the host rock, which is treated as a radial two-layer heat flux problem. A conceptual model is first constructed for heat conduction in a nuclear waste repository, and mathematical equations are then formulated for modeling the heat flow distribution at repository sites. Laplace transforms are employed to develop a solution for the temperature distributions in the buffer and the host rock in the Laplace domain, which is numerically inverted to the time-domain solution using the modified Crump method. The transient temperature distributions for both the single- and multi-borehole cases are simulated for hypothetical geological repositories of nuclear waste. The results show that the temperature distributions in the thermal field are significantly affected by the decay heat of the waste canister, the thermal properties of the buffer and the host rock, the disposal spacing, and the thickness of the host rock at a nuclear waste repository.
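
    To illustrate the Laplace-domain approach, the sketch below numerically inverts a toy transfer function with mpmath's de Hoog algorithm (a relative of the modified Crump method) and checks it against the analytical inverse. The decay and response constants are arbitrary, and the expression is not the paper's two-layer radial solution.

        from mpmath import mp, invertlaplace, exp

        mp.dps = 30  # working precision for the numerical inversion

        # Toy Laplace-domain temperature rise: an exponentially decaying heat
        # source (decay constant lam) driving a first-order thermal response
        # (rate a). Illustrative stand-in, not the two-layer radial solution.
        lam, a = 0.05, 0.4
        T_bar = lambda s: 1.0 / ((s + lam) * (s + a))

        for t in (1, 10, 50, 100):
            T_num = invertlaplace(T_bar, t, method='dehoog')
            T_exact = (exp(-lam * t) - exp(-a * t)) / (a - lam)   # analytical check
            print(t, float(T_num), float(T_exact))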

  6. Experiments and Modeling in Support of Generic Salt Repository Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bourret, Suzanne Michelle; Stauffer, Philip H.; Weaver, Douglas James

    Salt is an attractive material for the disposition of heat-generating nuclear waste (HGNW) because of its self-sealing, viscoplastic, and reconsolidation properties (Hansen and Leigh, 2012). The rate at which salt consolidates and the properties of the consolidated salt depend on the composition of the salt, including its content of accessory minerals and moisture, and the temperature under which consolidation occurs. Physicochemical processes, such as mineral hydration/dehydration and salt dissolution and precipitation, play a significant role in defining the rate of salt structure changes. Understanding the behavior of these complex processes is paramount when considering safe design for disposal of HGNW in salt formations, so experimentation and modeling are underway to characterize these processes. This report presents experiments and simulations in support of the DOE-NE Used Fuel Disposition Campaign (UFDC) for development of drift-scale, in-situ field testing of HGNW in salt formations.

  7. NUTRITION w/Repository

    NASA Image and Video Library

    2009-06-06

    ISS020-E-007566 (6 June 2009) --- European Space Agency astronaut Frank De Winne, Expedition 20 flight engineer, prepares to put samples in the Minus Eighty Laboratory Freezer for ISS (MELFI) in the Kibo laboratory of the International Space Station. The samples were taken as part of the Nutritional Status Assessment (Nutrition) with Repository experiment, a NASA study of human physiologic changes during long-duration spaceflight.

  8. Models for Evaluating and Improving Architecture Competence

    DTIC Science & Technology

    2008-03-01

    ... learned better methods than it engaged in the past. ... Considering the Models ... and groups must have a repository of accumulated knowledge and experience. The Organizational Learning model provides a way to evaluate how effective that repository is. It also tells us how "mindful" the learning needs to be. The organizational coordination model ...

  9. ORIOLE, in the Search for Evidence of OER in Teaching. Experiences in the Use, Re-Use and the Sharing and Influence of Repositories

    ERIC Educational Resources Information Center

    Santos-Hermosa, Gema

    2014-01-01

    The study presented here aims to gather useful information on the use, re-use and sharing of resources in education, as well as the influence of repositories, to better understand the perspective of individual practitioners and suggest future areas of debate for researchers. The Open Resources: Influence on Learners and Educators (ORIOLE) project was…

  10. Nuclear Repository steel canister: experimental corrosion rates

    NASA Astrophysics Data System (ADS)

    Caporuscio, F.; Norskog, K.

    2017-12-01

    The U.S. Spent Fuel & Waste Science & Technology campaign evaluates various generic geological repositories for the disposal of spent nuclear fuel. This experimental work analyzed and characterized canister corrosion and steel-interface mineralogy for 304 stainless steel (SS), 316 SS, and low-carbon steel coupons in a bentonite-based EBS in brine at elevated heat loads and pressures. Experiments contrasted the EBS with and without an argillite wall rock. Unprocessed bentonite from Colony, Wyoming simulated the clay buffer and Opalinus Clay represented the wall rock. Redox conditions were buffered at the magnetite-iron oxygen fugacity univariant curve. A K-Na-Ca-Cl-based brine was chosen to replicate generic granitic groundwater compositions, while Opalinus Clay groundwater was used in the wall-rock series of experiments. Most experiments were run at 150 bar and 300°C for 4 to 6 weeks, and one was held at elevated conditions for 6 months. The two major experimental mixtures were 1) brine-bentonite clay-steel, and 2) brine-bentonite clay-Opalinus Clay-steel. Both systems were equilibrated at a high liquid/clay ratio. The mineralogy and aqueous geochemistry of each experiment were evaluated to monitor the reactions that took place. In total, 4291 measurements were obtained: 2500 of steel corrosion depths and 1791 of phyllosilicate mineral reactions/growths at the interface. The low-carbon steel corroded by pitting, while 304 SS and 316 SS corroded by general corrosion. The low-carbon steel corrosion rate (1.95 μm/day) was the most rapid. The 304 SS corrosion rate (0.37 μm/day) was slightly accelerated relative to the 316 SS corrosion rate (0.26 μm/day). Note that the six-month 316 SS experiment shows inhibited corrosion rates (0.07 μm/day), which may be due in part to mantling by the Fe-saponite/chlorite authigenic minerals. All phyllosilicate growth rates at the interface exhibit patterns similar to the steels (i.e., LCS > 304 > 316 > 316 six-month).
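
    For quick reference, the reported daily rates convert to annualized penetration as in the simple calculation below. This is a linear extrapolation for illustration only, not a long-term corrosion prediction.

        # Linear conversion of the reported short-term corrosion rates; real
        # long-term rates are expected to be lower, as the inhibited 6-month
        # 316 SS run already suggests.
        rates_um_per_day = {"low-carbon steel": 1.95, "304 SS": 0.37,
                            "316 SS": 0.26, "316 SS (6-month run)": 0.07}

        for steel, rate in rates_um_per_day.items():
            per_year_mm = rate * 365.25 / 1000.0
            print(f"{steel:22s} {per_year_mm:.2f} mm/yr if the rate persisted")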

  11. Digital Rocks Portal: Preservation, Sharing, Remote Visualization and Automated Analysis of Imaged Datasets

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Ketcham, R. A.; Hanlon, M.; Pettengill, M.; Ranganath, A.; Venkatesh, A.

    2016-12-01

    Due to advances in imaging modalities such as X-ray microtomography and scanning electron microscopy, 2D and 3D imaged datasets of rock microstructure on nanometer to centimeter length scales allow investigation of nonlinear flow and mechanical phenomena using numerical approaches. This in turn produces various upscaled parameters required by subsurface flow and deformation simulators. However, a single research group typically specializes in one imaging modality and/or related modeling on a single length scale, and the lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (http://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of geoscience and engineering researchers not necessarily trained in computer science or data analysis. Our objective is to enable scientific inquiry and engineering decisions founded on a data-driven basis. We show how the data loaded in the portal can be documented, referenced in publications via digital object identifiers, visualized, and linked to other repositories. We then show preliminary results on integrating a remote parallel visualization and flow simulation workflow with the pore structures currently stored in the repository. We finally discuss the issues of collecting correct metadata, data discoverability and repository sustainability. This is the first repository for this particular type of data, and it is part of the wider ecosystem of geoscience data and model cyber-infrastructure called "EarthCube" (http://earthcube.org/) sponsored by the National Science Foundation. For data sustainability and continuous access, the portal is implemented within the reliable, 24/7 maintained High Performance Computing infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative.

  12. Simulated Students and Classroom Use of Model-Based Intelligent Tutoring

    NASA Technical Reports Server (NTRS)

    Koedinger, Kenneth R.

    2008-01-01

    Two educational uses of models and simulations: 1) students create models and use simulations; and 2) researchers create models of learners to guide development of reliably effective materials. Cognitive tutors simulate and support tutoring; data are crucial to creating an effective model. Pittsburgh Science of Learning Center: resources for modeling, authoring, and experimentation; a repository of data and theory. Examples of advanced modeling efforts: SimStudent learns a rule-based model; a help-seeking model tutors metacognition; Scooter uses machine-learning detectors of student engagement.

  13. A scoping review of online repositories of quality improvement projects, interventions and initiatives in healthcare.

    PubMed

    Bytautas, Jessica P; Gheihman, Galina; Dobrow, Mark J

    2017-04-01

    Quality improvement (QI) is becoming an important focal point for health systems. There is increasing interest among health system stakeholders to learn from and share experiences on the use of QI methods and approaches in their work. Yet there are few easily accessible, online repositories dedicated to documenting QI activity. We conducted a scoping review of publicly available, web-based QI repositories to (i) identify current approaches to sharing information on QI practices; (ii) categorise these approaches based on hosting, scope and size, content acquisition and eligibility, content format and search, and evaluation and engagement characteristics; and (iii) review evaluations of the design, usefulness and impact of their online QI practice repositories. The search strategy consisted of traditional database and grey literature searches, as well as expert consultation, with the ultimate aim of identifying and describing QI repositories of practices undertaken in a healthcare context. We identified 13 QI repositories and found substantial variation across the five categories. The QI repositories used different terminology (eg, practices vs case studies) and approaches to content acquisition, and varied in terms of primary areas of focus. All provided some means for organising content according to categories or themes and most provided at least rudimentary keyword search functionality. Notably, none of the QI repositories included evaluations of their impact. With growing interest in sharing and spreading best practices and increasing reliance on QI as a key contributor to health system performance, the role of QI repositories is likely to expand. Designing future QI repositories based on knowledge of the range and type of features available is an important starting point for improving their usefulness and impact. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  14. Semantic Web repositories for genomics data using the eXframe platform.

    PubMed

    Merrill, Emily; Corlosquet, Stéphane; Ciccarese, Paolo; Clark, Tim; Das, Sudeshna

    2014-01-01

    With the advent of inexpensive assay technologies, there has been an unprecedented growth in genomics data as well as the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse, or integration with other knowledge bases, very difficult. To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second-generation model now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in SPARQL Protocol and RDF Query Language (SPARQL) endpoint. Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate them with heterogeneous resources, and make them interoperable with the vast Semantic Web of biomedical knowledge.
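
    As a rough illustration of the kind of standardized query such a repository enables, the sketch below issues a SPARQL request with the SPARQLWrapper library; the endpoint URL, class URI and predicates are hypothetical placeholders, not the actual eXframe vocabulary.

    ```python
    # Minimal SPARQL query against a hypothetical genomics-experiment endpoint.
    from SPARQLWrapper import SPARQLWrapper, JSON

    ENDPOINT = "https://example.org/exframe/sparql"  # hypothetical endpoint URL

    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setQuery("""
        PREFIX dcterms: <http://purl.org/dc/terms/>
        SELECT ?experiment ?title WHERE {
            ?experiment a <http://example.org/Experiment> ;   # hypothetical class
                        dcterms:title ?title .
        } LIMIT 10
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for row in results["results"]["bindings"]:
        print(row["experiment"]["value"], "-", row["title"]["value"])
    ```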

  15. NUTRITION w/Repository

    NASA Image and Video Library

    2009-06-06

    ISS020-E-007577 (6 June 2009) --- European Space Agency astronaut Frank De Winne, Expedition 20 flight engineer, returns a dewar tray to the Minus Eighty Laboratory Freezer for ISS (MELFI) after inserting biological samples into the trays in the Kibo laboratory of the International Space Station. Samples were taken as part of the Nutritional Status Assessment (Nutrition) with Repository experiment, a NASA study of human physiologic changes during long-duration spaceflight.

  16. Clustering Molecular Dynamics Trajectories for Optimizing Docking Experiments

    PubMed Central

    De Paris, Renata; Quevedo, Christian V.; Ruiz, Duncan D.; Norberto de Souza, Osmar; Barros, Rodrigo C.

    2015-01-01

    Molecular dynamics simulations of protein receptors have become an attractive tool for rational drug discovery. However, the high computational cost of employing molecular dynamics trajectories in virtual screening of large repositories threatens the feasibility of this task. Computational intelligence techniques have been applied in this context, with the ultimate goal of reducing the overall computational cost so the task can become feasible. Particularly, clustering algorithms have been widely used as a means to reduce the dimensionality of molecular dynamics trajectories. In this paper, we develop a novel methodology for clustering entire trajectories using structural features from the substrate-binding cavity of the receptor in order to optimize docking experiments on a cloud-based environment. The resulting partition was selected based on three clustering validity criteria, and it was further validated by analyzing the interactions between 20 ligands and a fully flexible receptor (FFR) model containing a 20 ns molecular dynamics simulation trajectory. Our proposed methodology shows that taking into account features of the substrate-binding cavity as input for the k-means algorithm is a promising technique for accurately selecting ensembles of representative structures tailored to a specific ligand. PMID:25873944
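
    A minimal sketch of the general approach (not the authors' exact pipeline) is shown below: snapshots described by cavity features are clustered with k-means over a range of k, a clustering validity criterion (here the silhouette score) selects the partition, and one representative structure is picked per cluster. The feature matrix is random placeholder data standing in for cavity descriptors.

    ```python
    # Cluster MD snapshots by cavity features and pick representatives.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    rng = np.random.default_rng(0)
    features = rng.normal(size=(500, 12))   # 500 snapshots x 12 cavity descriptors

    best_k, best_score, best_labels = None, -1.0, None
    for k in range(2, 11):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
        score = silhouette_score(features, labels)
        if score > best_score:
            best_k, best_score, best_labels = k, score, labels

    # One representative snapshot per cluster: the member closest to the centroid.
    representatives = []
    for c in range(best_k):
        members = np.where(best_labels == c)[0]
        centroid = features[members].mean(axis=0)
        representatives.append(int(members[np.argmin(
            np.linalg.norm(features[members] - centroid, axis=1))]))

    print(f"k={best_k}, silhouette={best_score:.2f}, representatives={representatives}")
    ```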

  17. Modelling geochemical and microbial consumption of dissolved oxygen after backfilling a high level radioactive waste repository.

    PubMed

    Yang, Changbing; Samper, Javier; Molinero, Jorge; Bonilla, Mercedes

    2007-08-15

    Dissolved oxygen (DO) left in the voids of buffer and backfill materials of a deep geological high-level radioactive waste (HLW) repository could cause canister corrosion. Available data from laboratory and in situ experiments indicate that microbes play a substantial role in controlling redox conditions near a HLW repository. This paper presents the application of a coupled hydro-bio-geochemical model to evaluate geochemical and microbial consumption of DO in bentonite porewater after backfilling of a HLW repository designed according to the Swedish reference concept. In addition to geochemical reactions, the model accounts for dissolved organic carbon (DOC) respiration and methane oxidation. Parameters for microbial processes were derived from calibration of the REX in situ experiment carried out at the Äspö underground laboratory. The role of geochemical and microbial processes in consuming DO is evaluated for several scenarios. Numerical results show that both geochemical and microbial processes are relevant for DO consumption. However, the time needed to consume the DO trapped in the bentonite buffer decreases dramatically from several hundred years, when only geochemical processes are considered, to a few weeks when both geochemical reactions and microbially-mediated DOC respiration and methane oxidation are taken into account simultaneously.
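
    The centuries-versus-weeks contrast can be illustrated with a toy first-order depletion calculation; the rate constants below are hypothetical and chosen only to reproduce the order-of-magnitude difference reported above, not values from the calibrated hydro-bio-geochemical model.

    ```python
    # Toy first-order depletion of dissolved oxygen, C(t) = C0 * exp(-k t).
    # Rate constants are illustrative placeholders, not calibrated values.
    import math

    C0, C_TARGET = 8.0, 0.008    # mg/L: initial DO and a near-depleted level
    K_GEOCHEM = 2e-5             # 1/day, slow abiotic mineral oxidation (assumed)
    K_MICROBIAL = 0.3            # 1/day, DOC respiration + methane oxidation (assumed)

    def depletion_time_days(k_total: float) -> float:
        """Time for C(t) to fall from C0 to C_TARGET under first-order decay."""
        return math.log(C0 / C_TARGET) / k_total

    print(f"geochemical only: {depletion_time_days(K_GEOCHEM) / 365:.0f} years")
    print(f"with microbes   : {depletion_time_days(K_GEOCHEM + K_MICROBIAL):.0f} days")
    ```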

  18. Calculating the quality of public high-throughput sequencing data to obtain a suitable subset for reanalysis from the Sequence Read Archive

    PubMed Central

    Nakazato, Takeru; Bono, Hidemasa

    2017-01-01

    It is important for public data repositories to promote the reuse of archived data. In the growing field of omics science, however, the increasing number of submissions of high-throughput sequencing (HTSeq) data to public repositories prevents users from choosing a suitable data set from among the large number of search results. Repository users need to be able to set a threshold to reduce the number of results to obtain a suitable subset of high-quality data for reanalysis. We calculated the quality of sequencing data archived in a public data repository, the Sequence Read Archive (SRA), by using the quality control software FastQC. We obtained quality values for 1 171 313 experiments, which can be used to evaluate the suitability of data for reuse. We also visualized the data distribution in SRA by integrating the quality information and metadata of experiments and samples. We provide quality information for all of the archived sequencing data, which enables users to obtain sequencing data of sufficient quality for reanalyses. The calculated quality data are available to the public in various formats. Our data also provide an example of enhancing the reuse of public data by adding metadata to published research data by a third party. PMID:28449062
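
    In practice, a user would apply a quality threshold to the published per-experiment quality table before selecting data for reanalysis. The sketch below shows one way to do this with pandas; the file name and column names are hypothetical stand-ins for the released formats.

    ```python
    # Filter SRA experiments by a per-experiment quality score (hypothetical layout).
    import pandas as pd

    quality = pd.read_csv("sra_quality.tsv", sep="\t")   # hypothetical file name
    QUALITY_THRESHOLD = 30.0                             # Phred-scaled cut-off (assumed)

    suitable = quality[quality["mean_per_base_quality"] >= QUALITY_THRESHOLD]
    print(f"{len(suitable)} of {len(quality)} experiments pass the threshold")
    suitable["accession"].to_csv("reanalysis_candidates.txt", index=False, header=False)
    ```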

  19. Analysis of water flow paths: methodology and example calculations for a potential geological repository in Sweden.

    PubMed

    Werner, Kent; Bosson, Emma; Berglund, Sten

    2006-12-01

    Safety assessment related to the siting of a geological repository for spent nuclear fuel deep in the bedrock requires identification of potential flow paths and the associated travel times for radionuclides originating at repository depth. Using the Laxemar candidate site in Sweden as a case study, this paper describes modeling methodology, data integration, and the resulting water flow models, focusing on the Quaternary deposits and the upper 150 m of the bedrock. Example simulations identify flow paths to groundwater discharge areas and flow paths in the surface system. The majority of the simulated groundwater flow paths end up in the main surface waters and along the coastline, even though the particles used to trace the flow paths are introduced with a uniform spatial distribution at a relatively shallow depth. The calculated groundwater travel time, determining the time available for decay and retention of radionuclides, is on average longer to the coastal bays than to other biosphere objects at the site. Further, it is demonstrated how GIS-based modeling can be used to limit the number of surface flow paths that need to be characterized for safety assessment. Based on the results, the paper discusses an approach for coupling the present models to a model for groundwater flow in the deep bedrock.
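
    Conceptually, the travel times come from particle tracking: particles released near the surface are advected through the simulated velocity field until they reach a discharge area, and the elapsed time is recorded. The sketch below illustrates the idea on a synthetic two-dimensional field; it is not the site-specific Laxemar model.

    ```python
    # Particle tracking through a synthetic steady 2-D velocity field (m/day).
    import numpy as np

    def velocity(x, y):
        """Hypothetical field: flow accelerates toward the 'coast' at x = 1000 m."""
        return np.array([0.04 + 5e-5 * x, -0.02])

    rng = np.random.default_rng(1)
    starts = np.column_stack([rng.uniform(0, 200, 50), rng.uniform(0, 500, 50)])

    dt, x_coast = 1.0, 1000.0        # time step (day), discharge boundary (m)
    travel_times = []
    for start in starts:
        pos, t = start.astype(float), 0.0
        while pos[0] < x_coast and t < 1e6:   # safety cap on iterations
            pos += velocity(*pos) * dt        # explicit Euler advection
            t += dt
        travel_times.append(t)

    print(f"median travel time: {np.median(travel_times) / 365:.1f} years")
    ```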

  20. gemcWeb: A Cloud Based Nuclear Physics Simulation Software

    NASA Astrophysics Data System (ADS)

    Markelon, Sam

    2017-09-01

    gemcWeb allows users to run nuclear physics simulations from the web. Because it is completely device agnostic, scientists can run simulations from anywhere with an Internet connection. With a full user system, gemcWeb allows users to revisit and revise their projects, and share configurations and results with collaborators. gemcWeb is based on the simulation software gemc, which is based on the standard Geant4 toolkit. gemcWeb requires no C++, gemc, or Geant4 knowledge. A simple but powerful GUI allows users to configure their project from geometries and configurations stored on the deployment server. Simulations are then run on the server, with results being posted to the user and then securely stored. Python based and open-source, the main version of gemcWeb is hosted internally at Jefferson National Laboratory and used by the CLAS12 and Electron-Ion Collider Project groups. However, as the software is open-source and hosted as a GitHub repository, an instance can be deployed on the open web or on any institution's intranet. An instance can be configured to host experiments specific to an institution, and the code base can be modified by any individual or group. Special thanks to: Maurizio Ungaro, PhD, creator of gemc; Markus Diefenthaler, PhD, advisor; and Kyungseon Joo, PhD, advisor.

  1. 10 CFR 60.44 - Changes, tests, and experiments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Changes, tests, and experiments. 60.44 Section 60.44... REPOSITORIES Licenses License Issuance and Amendment § 60.44 Changes, tests, and experiments. (a)(1) Following... experiments not described in the application, without prior Commission approval, provided the change, test, or...

  2. 10 CFR 60.44 - Changes, tests, and experiments.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Changes, tests, and experiments. 60.44 Section 60.44... REPOSITORIES Licenses License Issuance and Amendment § 60.44 Changes, tests, and experiments. (a)(1) Following... experiments not described in the application, without prior Commission approval, provided the change, test, or...

  3. 10 CFR 60.44 - Changes, tests, and experiments.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Changes, tests, and experiments. 60.44 Section 60.44... REPOSITORIES Licenses License Issuance and Amendment § 60.44 Changes, tests, and experiments. (a)(1) Following... experiments not described in the application, without prior Commission approval, provided the change, test, or...

  4. 10 CFR 60.44 - Changes, tests, and experiments.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Changes, tests, and experiments. 60.44 Section 60.44... REPOSITORIES Licenses License Issuance and Amendment § 60.44 Changes, tests, and experiments. (a)(1) Following... experiments not described in the application, without prior Commission approval, provided the change, test, or...

  5. Operational Tsunami Modelling with TsunAWI for the German-Indonesian Tsunami Early Warning System: Recent Developments

    NASA Astrophysics Data System (ADS)

    Rakowsky, N.; Harig, S.; Androsov, A.; Fuchs, A.; Immerz, A.; Schröter, J.; Hiller, W.

    2012-04-01

    Starting in 2005, the GITEWS project (German-Indonesian Tsunami Early Warning System) established from scratch a fully operational tsunami warning system at BMKG in Jakarta. Numerical simulations of prototypic tsunami scenarios play a decisive role in a priori risk assessment for coastal regions and in the early warning process itself. Repositories with currently 3470 regional tsunami scenarios for GITEWS and 1780 Indian Ocean wide scenarios in support of Indonesia as a Regional Tsunami Service Provider (RTSP) were computed with the non-linear shallow water model TsunAWI. It is based on a finite element discretisation, employs unstructured grids with high resolution along the coast and includes inundation. This contribution gives an overview of the model itself, the enhancement of the model physics, and the experience gained during the process of establishing an operational code suited for thousands of model runs. Technical aspects like computation time, disk space needed for each scenario in the repository, or post-processing techniques have a much larger impact than they had in the beginning when TsunAWI started as a research code. Of course, careful testing on artificial benchmarks and real events remains essential, but furthermore, quality control for the large number of scenarios becomes an important issue.
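
    For readers unfamiliar with the underlying physics, the sketch below time-steps a linearized one-dimensional shallow-water system on a staggered finite-difference grid. It is a didactic stand-in for TsunAWI's non-linear finite-element scheme, showing only how a surface perturbation propagates at roughly sqrt(g*H).

    ```python
    # Linearized 1-D shallow-water equations, forward-backward staggered scheme.
    import numpy as np

    g, H = 9.81, 4000.0                  # gravity (m/s^2), uniform ocean depth (m)
    nx, dx = 400, 5000.0                 # number of cells, grid spacing (m)
    dt = 0.5 * dx / np.sqrt(g * H)       # CFL-limited time step (s)

    x = np.arange(nx) * dx
    eta = np.exp(-((x - 1.0e6) / 5.0e4) ** 2)   # initial Gaussian hump (m)
    u = np.zeros(nx + 1)                        # velocities on cell faces (m/s)

    for _ in range(500):
        # momentum: du/dt = -g * d(eta)/dx on interior faces (walls at both ends)
        u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx
        # continuity: d(eta)/dt = -H * du/dx
        eta -= dt * H * (u[1:] - u[:-1]) / dx

    print(f"wave speed ~ {np.sqrt(g * H):.0f} m/s, max eta after 500 steps: {eta.max():.2f} m")
    ```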

  6. NUTRITION w/Repository

    NASA Image and Video Library

    2009-06-06

    ISS020-E-007603 (7 June 2009) --- European Space Agency astronaut Frank De Winne, Expedition 20 flight engineer, removes a dewar tray from the Minus Eighty Laboratory Freezer for ISS (MELFI) in order to insert biological samples into the trays in the Kibo laboratory of the International Space Station. Samples were taken as part of the Nutritional Status Assessment (Nutrition) with Repository experiment, a NASA study of human physiologic changes during long-duration spaceflight.

  7. Using neural networks in software repositories

    NASA Technical Reports Server (NTRS)

    Eichmann, David (Editor); Srinivas, Kankanahalli; Boetticher, G.

    1992-01-01

    The first topic is an exploration of the use of neural network techniques to improve the effectiveness of retrieval in software repositories. The second topic relates to a series of experiments conducted to evaluate the feasibility of using adaptive neural networks as a means of deriving (or more specifically, learning) measures on software. Taken together, these two efforts illuminate a very promising mechanism supporting software infrastructures - one based upon a flexible and responsive technology.

  8. Coupled THMC models for bentonite in clay repository for nuclear waste

    NASA Astrophysics Data System (ADS)

    Zheng, L.; Rutqvist, J.; Birkholzer, J. T.; Li, Y.; Anguiano, H. H.

    2015-12-01

    Illitization, the transformation of smectite to illite, could compromise some beneficial features of an engineered barrier system (EBS) that is composed primarily of bentonite and of the clay host rock. It is a major factor in establishing the maximum design temperature of the repositories, because illitization is believed to be greatly enhanced at temperatures higher than 100 °C, which would significantly lower the sorption and swelling capacity of bentonite and clay rock. However, existing experimental and modeling studies on the occurrence of illitization and related performance impacts are not conclusive, in part because the relevant couplings between the thermal, hydrological, chemical, and mechanical (THMC) processes have not been fully represented in the models. Here we present fully coupled THMC simulations of a generic nuclear waste repository in a clay formation with a bentonite-backfilled EBS. Two scenarios were simulated for comparison: a case in which the temperature in the bentonite near the waste canister can reach about 200 °C and a case in which the temperature in the bentonite near the waste canister peaks at about 100 °C. The model simulations demonstrate that illitization is in general more significant at higher temperatures. We also compared the chemical changes and the resulting swelling stress change for two types of bentonite: Kunigel-VI and FEBEX bentonite. Higher temperatures also lead to much higher stress in the near field, caused by thermal pressurization and vapor pressure buildup in the EBS bentonite and clay host rock. Chemical changes lead to a reduction in swelling stress, which is more pronounced for Kunigel-VI bentonite than for FEBEX bentonite.

  9. Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John

    2014-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.
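
    The test-first loop described here is language-agnostic; the minimal example below illustrates it in Python as a generic stand-in (Trick itself is a C/C++ toolkit), with tests that a continuous integration system would run on every check-in. The helper function and its tests are hypothetical.

    ```python
    # A tiny test-driven example: the tests define the behaviour of a helper first.
    import unittest

    def interpolate(table, x):
        """Hypothetical helper: linear interpolation in a sorted (x, y) table."""
        for (x0, y0), (x1, y1) in zip(table, table[1:]):
            if x0 <= x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        raise ValueError("x outside table range")

    class InterpolateTest(unittest.TestCase):
        def test_midpoint(self):
            self.assertAlmostEqual(interpolate([(0.0, 0.0), (2.0, 4.0)], 1.0), 2.0)

        def test_out_of_range_raises(self):
            with self.assertRaises(ValueError):
                interpolate([(0.0, 0.0), (2.0, 4.0)], 5.0)

    if __name__ == "__main__":
        unittest.main()
    ```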

  10. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.
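
    Once the workflow has emitted an SBML file, it can be inspected or handed to a simulator with standard SBML tooling. The sketch below uses the python-libsbml bindings to read a model and list its species; the file name is a hypothetical placeholder for the workflow output.

    ```python
    # Inspect an SBML model produced at the end of the workflow (file name assumed).
    import libsbml

    doc = libsbml.readSBML("design.xml")     # hypothetical output of the workflow
    if doc.getNumErrors() > 0:
        doc.printErrors()

    model = doc.getModel()
    if model is None:
        raise SystemExit("no model found; check the SBML file")

    print(f"{model.getNumSpecies()} species, {model.getNumReactions()} reactions")
    for i in range(model.getNumSpecies()):
        print("  ", model.getSpecies(i).getId())
    ```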

  11. Long-term hydrodynamic response induced by past climatic and geomorphologic forcing: The case of the Paris basin, France

    NASA Astrophysics Data System (ADS)

    Jost, A.; Violette, S.; Gonçalvès, J.; Ledoux, E.; Guyomard, Y.; Guillocheau, F.; Kageyama, M.; Ramstein, G.; Suc, J.-P.

    In the framework of safe underground storage of radioactive waste in low-permeability layers, it is essential to evaluate the mobility of deep groundwaters over timescales of several million years. On these timescales, the environmental evolution of a repository should depend upon a range of natural processes that are primarily driven by climate and geomorphologic variations. In this paper, the response of the Paris basin groundwater system to variations in its hydrodynamic boundary conditions induced by past climate and geodynamic changes over the last five million years is investigated. A three-dimensional transient model of the Paris basin aquifer/aquitard system was developed using the code NEWSAM (Ecole des Mines de Paris, ENSMP). The geometry and hydrodynamic parameters of the model originate from a basin model, NEWBAS (ENSMP), built to simulate the geological history of the basin. Geomorphologic evolution is deduced from digital elevation model analysis, which allows estimation of river-valley incision and Alpine uplift. Climate forcing results from palaeoclimate modelling experiments using the LMDz atmospheric general circulation model (Institut Pierre Simon Laplace) with a refined spatial resolution, for the present, the Last Glacial Maximum (21 ka) and the Middle Pliocene Warmth (˜3 Ma). The water balance is computed by the distributed hydrological model MODSUR (ENSMP). Results on the simulated evolution of piezometric heads in the system in response to the altered boundary conditions are presented, in particular in the vicinity of ANDRA’s Bure potential repository site within the Callovo-Oxfordian argillaceous layer. For the present, the comparison of head patterns between steady-state and time-dependent simulations shows little difference for aquifer layers close to the surface, but suggests a transient state of the current system in the main aquitards of the basin and in the deep aquifers, characterized by abnormally low fluid potentials. The dependence of the boundary-induced transient effects on the hydraulic diffusivity is illustrated by means of a sensitivity study.

  12. Protein Simulation Data in the Relational Model.

    PubMed

    Simms, Andrew M; Daggett, Valerie

    2012-10-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server.
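
    To make the idea of a dimensional design concrete, the sketch below lays out a tiny star-style schema with one dimension table and one fact table and runs an aggregate query. It uses sqlite3 purely as a stand-in for the SQL Server warehouse described here, and the table and column names are illustrative, not the authors' schema.

    ```python
    # Toy star-style schema for per-frame simulation properties (names assumed).
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE dim_simulation (
            sim_id        INTEGER PRIMARY KEY,
            protein       TEXT NOT NULL,
            temperature_k REAL NOT NULL
        );
        CREATE TABLE fact_frame_property (
            sim_id             INTEGER REFERENCES dim_simulation(sim_id),
            frame              INTEGER NOT NULL,
            rmsd_nm            REAL,
            radius_gyration_nm REAL
        );
    """)
    conn.execute("INSERT INTO dim_simulation VALUES (1, '1ENH', 298.0)")
    conn.executemany(
        "INSERT INTO fact_frame_property VALUES (1, ?, ?, ?)",
        [(i, 0.10 + 0.001 * i, 1.1) for i in range(100)],
    )

    # Average RMSD per protein across all stored frames.
    row = conn.execute(
        "SELECT s.protein, AVG(f.rmsd_nm) FROM fact_frame_property f "
        "JOIN dim_simulation s USING (sim_id) GROUP BY s.protein"
    ).fetchone()
    print(row)
    ```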

  13. Protein Simulation Data in the Relational Model

    PubMed Central

    Simms, Andrew M.; Daggett, Valerie

    2011-01-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost—significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server. PMID:23204646

  14. The Use of Underground Research Laboratories to Support Repository Development Programs. A Roadmap for the Underground Research Facilities Network.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacKinnon, Robert J.

    2015-10-26

    Under the auspices of the International Atomic Energy Agency (IAEA), nationally developed underground research laboratories (URLs) and associated research institutions are being offered for use by other nations. These facilities form an Underground Research Facilities (URF) Network for training in and demonstration of waste disposal technologies and the sharing of knowledge and experience related to geologic repository development, research, and engineering. In order to achieve its objectives, the URF Network regularly sponsors workshops and training events related to the knowledge base that is transferable between existing URL programs and to nations with an interest in developing a new URL. This report describes the role of URLs in the context of a general timeline for repository development. This description includes identification of key phases and activities that contribute to repository development as a repository program evolves from an early research and development phase to later phases such as construction, operations, and closure. This information is cast in the form of a matrix with the entries in this matrix forming the basis of the URF Network roadmap that will be used to identify and plan future workshops and training events.

  15. Semantic Web repositories for genomics data using the eXframe platform

    PubMed Central

    2014-01-01

    Background With the advent of inexpensive assay technologies, there has been an unprecedented growth in genomics data as well as the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse, or integration with other knowledge bases, very difficult. Methods To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second-generation model now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in SPARQL Protocol and RDF Query Language (SPARQL) endpoint. Conclusions Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate them with heterogeneous resources and make them interoperable with the vast Semantic Web of biomedical knowledge. PMID:25093072

  16. Evaluation Methodologies for Information Management Systems; Building Digital Tobacco Industry Document Libraries at the University of California, San Francisco Library/Center for Knowledge Management; Experiments with the IFLA Functional Requirements for Bibliographic Records (FRBR); Coming to Term: Designing the Texas Email Repository Model.

    ERIC Educational Resources Information Center

    Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia

    2002-01-01

    Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…

  17. Calculating the quality of public high-throughput sequencing data to obtain a suitable subset for reanalysis from the Sequence Read Archive.

    PubMed

    Ohta, Tazro; Nakazato, Takeru; Bono, Hidemasa

    2017-06-01

    It is important for public data repositories to promote the reuse of archived data. In the growing field of omics science, however, the increasing number of submissions of high-throughput sequencing (HTSeq) data to public repositories prevents users from choosing a suitable data set from among the large number of search results. Repository users need to be able to set a threshold to reduce the number of results to obtain a suitable subset of high-quality data for reanalysis. We calculated the quality of sequencing data archived in a public data repository, the Sequence Read Archive (SRA), by using the quality control software FastQC. We obtained quality values for 1 171 313 experiments, which can be used to evaluate the suitability of data for reuse. We also visualized the data distribution in SRA by integrating the quality information and metadata of experiments and samples. We provide quality information for all of the archived sequencing data, which enables users to obtain sequencing data of sufficient quality for reanalyses. The calculated quality data are available to the public in various formats. Our data also provide an example of enhancing the reuse of public data by adding metadata to published research data by a third party. © The Authors 2017. Published by Oxford University Press.

  18. Construction of a nasopharyngeal carcinoma 2D/MS repository with Open Source XML database--Xindice.

    PubMed

    Li, Feng; Li, Maoyu; Xiao, Zhiqiang; Zhang, Pengfei; Li, Jianling; Chen, Zhuchu

    2006-01-11

    Many proteomics initiatives require integration of all information, with uniform criteria, from collection of samples and data display to publication of experimental results. The integration and exchange of these data of different formats and structures poses a great challenge. XML technology shows promise for handling this task due to its simplicity and flexibility. Nasopharyngeal carcinoma (NPC) is one of the most common cancers in southern China and Southeast Asia, and it has marked geographic and racial differences in incidence. Although there are some cancer proteome databases now, there is still no NPC proteome database. The raw NPC proteome experiment data were captured into one XML document with the Human Proteome Markup Language (HUP-ML) editor and imported into the native XML database Xindice. The 2D/MS repository of the NPC proteome was constructed with Apache, PHP and Xindice to provide access to the database via the Internet. On our website, two methods, keyword query and click query, are provided to access the entries of the NPC proteome database. Our 2D/MS repository can be used to share the raw NPC proteomics data that are generated from gel-based proteomics experiments. The database, as well as the PHP source code for constructing users' own proteome repositories, can be accessed at http://www.xyproteomics.org/.

  19. Metadata management and semantics in microarray repositories.

    PubMed

    Kocabaş, F; Can, T; Baykal, N

    2011-12-01

    The number of microarray and other high-throughput experiments in primary repositories keeps increasing, as do the size and complexity of the results, in response to biomedical investigations. Initiatives have been started on standardization of content, object model, exchange format and ontology. However, there are backlogs and an inability to exchange data between microarray repositories, which indicate that there is a great need for a standard format and data management. We have introduced a metadata framework that includes a metadata card and semantic nets that make experimental results visible, understandable and usable. These are encoded in syntax encoding schemes and represented in RDF (Resource Description Framework), can be integrated with other metadata cards and semantic nets, and can be exchanged, shared and queried. We demonstrated the performance and potential benefits through a case study on a selected microarray repository. We concluded that the backlogs can be reduced and that exchange of information and asking of knowledge discovery questions can become possible with the use of this metadata framework.
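
    The sketch below shows what encoding a metadata card as RDF might look like using rdflib; the vocabulary namespace, properties and experiment identifier are hypothetical placeholders rather than the framework's actual schema.

    ```python
    # Encode a minimal "metadata card" as RDF triples (vocabulary assumed).
    from rdflib import Graph, Literal, Namespace, RDF, URIRef

    CARD = Namespace("http://example.org/metadata-card#")          # hypothetical
    experiment = URIRef("http://example.org/experiment/GSE0000")   # placeholder ID

    g = Graph()
    g.add((experiment, RDF.type, CARD.MicroarrayExperiment))
    g.add((experiment, CARD.organism, Literal("Homo sapiens")))
    g.add((experiment, CARD.platform, Literal("Affymetrix HG-U133A")))
    g.add((experiment, CARD.tissue, Literal("peripheral blood")))

    print(g.serialize(format="turtle"))
    ```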

  20. Enriching text with images and colored light

    NASA Astrophysics Data System (ADS)

    Sekulovski, Dragan; Geleijnse, Gijs; Kater, Bram; Korst, Jan; Pauws, Steffen; Clout, Ramon

    2008-01-01

    We present an unsupervised method to enrich textual applications with relevant images and colors. The images are collected by querying large image repositories, and subsequently the colors are computed using image processing. A prototype system based on this method is presented where the method is applied to song lyrics. In combination with a lyrics synchronization algorithm, the system produces a rich multimedia experience. In order to identify terms within the text that may be associated with images and colors, we select noun phrases using a part-of-speech tagger. Large image repositories are queried with these terms. Per term, representative colors are extracted using the collected images. To this end, we use either a histogram-based or a mean shift-based algorithm. The representative color extraction uses the non-uniform distribution of the colors found in the large repositories. The images that are ranked best by the search engine are displayed on a screen, while the extracted representative colors are rendered on controllable lighting devices in the living room. We evaluate our method by comparing the computed colors to standard color representations of a set of English color terms. A second evaluation focuses on the distance in color between a queried term in English and its translation in a foreign language. Based on results from three sets of terms, a measure of the suitability of a term for color extraction based on KL divergence is proposed. Finally, we compare the performance of the algorithm using either the automatically indexed repository of Google Images or the manually annotated Flickr.com. Based on the results of these experiments, we conclude that using the presented method we can compute the relevant color for a term using a large image repository and image processing.
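
    A minimal histogram-based variant of the representative-colour idea is sketched below: pixels from the retrieved images are quantised into coarse RGB bins and the centre of the most populated bin is returned. The image paths are placeholders, and this simplification omits the non-uniform colour weighting and the mean-shift alternative described above.

    ```python
    # Histogram-based dominant colour over a set of images (paths assumed).
    import numpy as np
    from PIL import Image

    def representative_color(paths, bins=8):
        """Return the centre of the most populated coarse RGB bin as (R, G, B)."""
        hist = np.zeros((bins, bins, bins))
        for path in paths:
            rgb = np.asarray(Image.open(path).convert("RGB")).reshape(-1, 3)
            idx = rgb.astype(int) * bins // 256        # quantise each channel
            np.add.at(hist, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
        r, g, b = np.unravel_index(np.argmax(hist), hist.shape)
        scale = 256 // bins
        return (int(r) * scale + scale // 2,
                int(g) * scale + scale // 2,
                int(b) * scale + scale // 2)

    # Example call: representative_color(["sunset_1.jpg", "sunset_2.jpg"])
    ```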

  1. The electrochemistry of carbon steel in simulated concrete pore water in Boom Clay repository environments

    NASA Astrophysics Data System (ADS)

    MacDonald, D. D.; Saleh, A.; Lee, S. K.; Azizi, O.; Rosas-Camacho, O.; Al-Marzooqi, A.; Taylor, M.

    2011-04-01

    The prediction of corrosion damage of canisters to experimentally inaccessible times is vitally important in assessing various concepts for the disposal of High Level Nuclear Waste. Such prediction can only be made using deterministic models, whose predictions are constrained by the time-invariant natural laws. In this paper, we describe the measurement of experimental electrochemical data that will allow the prediction of damage to the carbon steel overpack of the super container in Belgium's proposed Boom Clay repository by using the Point Defect Model (PDM). PDM parameter values are obtained by optimizing the model on experimental, wide-band electrochemical impedance spectroscopy data.

  2. Illitization within bentonite engineered barrier system in clay repositories for nuclear waste and its effect on the swelling stress: a coupled THMC modeling study

    NASA Astrophysics Data System (ADS)

    Zheng, L.; Rutqvist, J.; Birkholzer, J. T.; Liu, H. H.

    2014-12-01

    Geological repositories for disposal of high-level nuclear waste generally rely on a multi-barrier system to isolate radioactive waste from the biosphere. An engineered barrier system (EBS), which in many design concepts comprises a bentonite backfill, is widely used. Clay formations have been considered as a host rock throughout the world. Illitization, the transformation of smectite to illite, could compromise some beneficial features of EBS bentonite and clay host rock, such as sorption and swelling capacity. It is the major factor in establishing the maximum design temperature of the repositories, because it is believed that illitization could be greatly enhanced at temperatures higher than 100 °C. However, existing experimental and modeling studies on the occurrence of illitization and related performance impacts are not conclusive, in part because the relevant couplings between the thermal, hydrological, chemical, and mechanical (THMC) processes have not been fully represented in the models. Here we present a fully coupled THMC simulation study of a generic nuclear waste repository in a clay formation with a bentonite-backfilled EBS. Two scenarios were simulated for comparison: a case in which the temperature in the bentonite near the waste canister can reach about 200 °C and a case in which the temperature in the bentonite near the waste canister peaks at about 100 °C. The model simulations demonstrate that illitization is in general more significant at higher temperatures. However, the quantity of illitization is affected by many chemical factors and therefore varies a great deal. The most important chemical factors are the concentration of K in the pore water as well as the abundance and dissolution rate of K-feldspar. For the particular case and bentonite properties studied, the reduction in swelling stress as a result of chemical changes varies from 2% up to 70%, depending on chemical and temperature conditions and key mechanical parameters. The modeling work is illustrative of the relative importance of different processes occurring in EBS bentonite and clay host rock at temperatures above 100 °C, and could be of greater use when site-specific data are available.

  3. The VLAB OER Experience: Modeling Potential-Adopter Student Acceptance

    ERIC Educational Resources Information Center

    Raman, Raghu; Achuthan, Krishnashree; Nedungadi, Prema; Diwakar, Shyam; Bose, Ranjan

    2014-01-01

    Virtual Labs (VLAB) is a multi-institutional Open Educational Resources (OER) initiative, exclusively focused on lab experiments for engineering education. This project envisages building a large OER repository, containing over 1650 virtual experiments mapped to the engineering curriculum. The introduction of VLAB is a paradigm shift in an…

  4. Mountain-Scale Coupled Processes (TH/THC/THM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Dixon

    The purpose of this Model Report is to document the development of the Mountain-Scale Thermal-Hydrological (TH), Thermal-Hydrological-Chemical (THC), and Thermal-Hydrological-Mechanical (THM) Models and evaluate the effects of coupled TH/THC/THM processes on mountain-scale UZ flow at Yucca Mountain, Nevada. This Model Report was planned in ''Technical Work Plan (TWP) for: Performance Assessment Unsaturated Zone'' (BSC 2002 [160819], Section 1.12.7), and was developed in accordance with AP-SIII.10Q, Models. In this Model Report, any reference to ''repository'' means the nuclear waste repository at Yucca Mountain, and any reference to ''drifts'' means the emplacement drifts at the repository horizon. This Model Report provides the necessary framework to test conceptual hypotheses for analyzing mountain-scale hydrological/chemical/mechanical changes and predict flow behavior in response to heat release by radioactive decay from the nuclear waste repository at the Yucca Mountain site. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH Model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH Model captures mountain-scale three dimensional (3-D) flow effects, including lateral diversion at the PTn/TSw interface and mountain-scale flow patterns. The Mountain-Scale THC Model evaluates TH effects on water and gas chemistry, mineral dissolution/precipitation, and the resulting impact to UZ hydrological properties, flow and transport. The THM Model addresses changes in permeability due to mechanical and thermal disturbances in stratigraphic units above and below the repository host rock. The Mountain-Scale THM Model focuses on evaluating the changes in 3-D UZ flow fields arising out of thermal stress and rock deformation during and after the thermal periods.

  5. Tool Integration and Environment Architectures

    DTIC Science & Technology

    1991-05-01

    include the Interactive Development Environment (IDE) Software Through Pictures (STP), Sabre-C and FrameMaker coalition, and the Verdix Ada Development...System (VADS) APSE, which includes the VADS compiler and choices of CADRE Teamwork or STP and FrameMaker or Interleaf. The key characteristic of...remote procedure execution to achieve a simulation of a homogeneous repository (i.e., a simulation that the data in a FrameMaker document resides in one

  6. A standard-enabled workflow for synthetic biology.

    PubMed

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce various types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  7. A comparative study of discrete fracture network and equivalent continuum models for simulating flow and transport in the far field of a hypothetical nuclear waste repository in crystalline host rock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadgu, Teklu; Karra, Satish; Kalinina, Elena

    One of the major challenges of simulating flow and transport in the far field of a geologic repository in crystalline host rock is related to reproducing the properties of the fracture network over the large volume of rock with sparse fracture characterization data. Various approaches have been developed to simulate flow and transport through the fractured rock. The approaches can be broadly divided into Discrete Fracture Network (DFN) and Equivalent Continuum Model (ECM). The DFN explicitly represents individual fractures, while the ECM uses fracture properties to determine equivalent continuum parameters. In this paper, we compare DFN and ECM in terms of upscaled observed transport properties through generic fracture networks. The major effort was directed at making the DFN and ECM approaches similar in their conceptual representations. This allows for separating differences related to the interpretation of the test conditions and parameters from the differences between the DFN and ECM approaches. The two models are compared using a benchmark test problem that is constructed to represent the far field (1 × 1 × 1 km³) of a hypothetical repository in fractured crystalline rock. The test problem setting uses generic fracture properties that can be expected in crystalline rocks. The models are compared in terms of: 1) the effective permeability of the domain, and 2) nonreactive solute breakthrough curves through the domain. The principal differences between the models are mesh size, network connectivity, matrix diffusion and anisotropy. We demonstrate how these differences affect the flow and transport. Finally, we identify the factors that should be taken into consideration when selecting an approach most suitable for the site-specific conditions.

  8. A comparative study of discrete fracture network and equivalent continuum models for simulating flow and transport in the far field of a hypothetical nuclear waste repository in crystalline host rock

    DOE PAGES

    Hadgu, Teklu; Karra, Satish; Kalinina, Elena; ...

    2017-07-28

    One of the major challenges of simulating flow and transport in the far field of a geologic repository in crystalline host rock is related to reproducing the properties of the fracture network over the large volume of rock with sparse fracture characterization data. Various approaches have been developed to simulate flow and transport through the fractured rock. The approaches can be broadly divided into Discrete Fracture Network (DFN) and Equivalent Continuum Model (ECM). The DFN explicitly represents individual fractures, while the ECM uses fracture properties to determine equivalent continuum parameters. In this paper, we compare DFN and ECM in terms of upscaled observed transport properties through generic fracture networks. The major effort was directed at making the DFN and ECM approaches similar in their conceptual representations. This allows for separating differences related to the interpretation of the test conditions and parameters from the differences between the DFN and ECM approaches. The two models are compared using a benchmark test problem that is constructed to represent the far field (1 × 1 × 1 km³) of a hypothetical repository in fractured crystalline rock. The test problem setting uses generic fracture properties that can be expected in crystalline rocks. The models are compared in terms of: 1) the effective permeability of the domain, and 2) nonreactive solute breakthrough curves through the domain. The principal differences between the models are mesh size, network connectivity, matrix diffusion and anisotropy. We demonstrate how these differences affect the flow and transport. Finally, we identify the factors that should be taken into consideration when selecting an approach most suitable for the site-specific conditions.

  9. A comparative study of discrete fracture network and equivalent continuum models for simulating flow and transport in the far field of a hypothetical nuclear waste repository in crystalline host rock

    NASA Astrophysics Data System (ADS)

    Hadgu, Teklu; Karra, Satish; Kalinina, Elena; Makedonska, Nataliia; Hyman, Jeffrey D.; Klise, Katherine; Viswanathan, Hari S.; Wang, Yifeng

    2017-10-01

    One of the major challenges of simulating flow and transport in the far field of a geologic repository in crystalline host rock is related to reproducing the properties of the fracture network over the large volume of rock with sparse fracture characterization data. Various approaches have been developed to simulate flow and transport through the fractured rock. The approaches can be broadly divided into Discrete Fracture Network (DFN) and Equivalent Continuum Model (ECM). The DFN explicitly represents individual fractures, while the ECM uses fracture properties to determine equivalent continuum parameters. We compare DFN and ECM in terms of upscaled observed transport properties through generic fracture networks. The major effort was directed at making the DFN and ECM approaches similar in their conceptual representations. This allows for separating differences related to the interpretation of the test conditions and parameters from the differences between the DFN and ECM approaches. The two models are compared using a benchmark test problem that is constructed to represent the far field (1 × 1 × 1 km³) of a hypothetical repository in fractured crystalline rock. The test problem setting uses generic fracture properties that can be expected in crystalline rocks. The models are compared in terms of: 1) the effective permeability of the domain, and 2) nonreactive solute breakthrough curves through the domain. The principal differences between the models are mesh size, network connectivity, matrix diffusion and anisotropy. We demonstrate how these differences affect the flow and transport. We identify the factors that should be taken into consideration when selecting an approach most suitable for the site-specific conditions.
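
    As a simple illustration of the kind of upscaling involved when moving from a DFN to an ECM representation, the sketch below applies the standard cubic-law, parallel-plate simplification to convert a fracture aperture and spacing into an equivalent continuum permeability; this is a textbook approximation, not the specific workflow of the paper.

    ```python
    # Cubic-law upscaling of parallel fractures to an equivalent continuum.
    def fracture_permeability(aperture_m: float) -> float:
        """Permeability of a single parallel-plate fracture, k = b^2 / 12."""
        return aperture_m ** 2 / 12.0

    def equivalent_continuum_permeability(aperture_m: float, spacing_m: float) -> float:
        """Parallel fractures of aperture b spaced s apart: k_eq = b^3 / (12 s)."""
        return aperture_m ** 3 / (12.0 * spacing_m)

    if __name__ == "__main__":
        b, s = 1e-4, 10.0   # 100-micron aperture, 10 m spacing (illustrative values)
        print(f"single fracture     : {fracture_permeability(b):.2e} m^2")
        print(f"equivalent continuum: {equivalent_continuum_permeability(b, s):.2e} m^2")
    ```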

  10. Flexible workflow sharing and execution services for e-scientists

    NASA Astrophysics Data System (ADS)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time, the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows remain a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides, based on the platform, to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: a database where workflows and meta-data about workflows can be stored. The database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: a web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: a desktop environment that provides access capabilities similar to the SHIWA Portal; however, it runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already integrated with the execution engine of the SHIWA Portal. Other engines can be added when required. Through the SHIWA Portal one can define and run simulations on the SHIWA Virtual Organisation, an e-infrastructure that gathers computing and data resources from various DCIs, including the European Grid Infrastructure. Via third-party workflow engines, the Portal supports the most widely used academic workflow engines, and it can be extended with other engines on demand. Such extensions translate between workflow languages and facilitate the nesting of workflows into larger workflows, even when those are written in different languages and require different interpreters for execution. Through the workflow repository and the portal, individual scientists and scientific collaborations can share and offer workflows for reuse and execution. Given the integrated nature of the SHIWA Simulation Platform, the shared workflows can be executed online, without installing any special client environment or downloading workflows. The FP7 "Building a European Research Community through Interoperable Workflows and Data" (ER-flow) project disseminates the achievements of the SHIWA project and uses these achievements to build workflow user communities across Europe. ER-flow provides application support to research communities within and beyond the project consortium to develop, share and run workflows with the SHIWA Simulation Platform.

  11. Pretest reference calculation for the 18-W/m² Mockup for Defense High-Level Waste (WIPP Room A in situ experiment)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, H.S.; Stone, C.M.

    A pretest reference calculation for the 18-W/m² Mockup for Defense High-Level Waste (DHLW) or Room A experiment is presented in this report. The mockup is one of several large scale in situ experiments currently under construction near Carlsbad, New Mexico, at the site of the Waste Isolation Pilot Plant (WIPP). The 18-W/m² test is an in situ experiment developed to simulate closely the Reference Repository Conditions (RRC) for DHLW in salt. The test consists of three long, parallel rooms (A1, A2, A3) which are heated by canister heaters placed in the floor of each room. These heaters produce thermal loading which simulates an areal heat output of 18 W/m² for Room A2, which is the focus of the experiment. This load will be supplied for a period of three years. Rooms A1, A2, and A3 are heavily instrumented for monitoring both temperature increases due to the thermal loading and deformations due to creep of the salt. Data from the experiment are not available at the present time, but the measurements for Room A2 will eventually be compared to the results for Room A2 presented here to assess and improve thermal and mechanical modeling capabilities for the WIPP. The thermal/structural model used here represents the state-of-the-art at the present time. A large number of plots are included since an appropriate result is presented for every Room A2 gauge location. 55 refs., 70 figs., 4 tabs.

  12. The Full Scale Seal Experiment - A Seal Industrial Prototype for Cigeo - 13106

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lebon, P.; Bosgiraud, J.M.; Foin, R.

    2013-07-01

    The Full Scale Seal (FSS) Experiment is one of various experiments implemented by Andra, within the frame of the Cigeo (the French Deep Geological Repository) Project development, to demonstrate the technical construction feasibility and performance of the seals to be constructed at the time of progressive closure of the Repository components (shafts, ramps, drifts, disposal vaults). FSS is built inside a drift model fabricated on the surface for the purpose. Prior to the full-scale (1:1) seal construction test, various design tasks are scheduled. They include the engineering work on the drift model to make it fit the experimental needs, on the various work sequences anticipated for the swelling clay core emplacement and the concrete containment plugs construction, on the specialized handling tools (and installation equipment) manufactured and delivered for the purpose, and of course on the various swelling clay materials and low pH (below 11) concrete formulations developed for the application. The engineering of the 'seal-as-built' commissioning means (tools and methodology) must also be dealt with. The FSS construction experiment is a technological demonstrator; it is therefore not focused on phenomenological monitoring (and, by consequence, on performance and behaviour forecasting). As such, no hydration (forced or natural) is planned. However, the FSS implementation (in particular via the construction and commissioning activities carried out) is a key milestone in view of strengthening phenomenological extrapolation in time and scale. The FSS experiment also allows for qualifying the commissioning methods of a real sealing system in the Repository, as built, at the time of industrial operations. (authors)

  13. Use of groundwater lifetime expectancy for the performance assessment of a deep geologic waste repository: 1. Theory, illustrations, and implications

    NASA Astrophysics Data System (ADS)

    Cornaton, F. J.; Park, Y.-J.; Normani, S. D.; Sudicky, E. A.; Sykes, J. F.

    2008-04-01

    Long-term solutions for the disposal of toxic wastes usually involve isolation of the wastes in a deep subsurface geologic environment. In the case of spent nuclear fuel, if radionuclide leakage occurs from the engineered barrier, the geological medium represents the ultimate barrier that is relied upon to ensure safety. Consequently, an evaluation of radionuclide travel times from a repository to the biosphere is critically important in a performance assessment analysis. In this study, we develop a travel time framework based on the concept of groundwater lifetime expectancy as a safety indicator. Lifetime expectancy characterizes the time that radionuclides will spend in the subsurface after their release from the repository and prior to discharging into the biosphere. The probability density function of lifetime expectancy is computed throughout the host rock by solving the backward-in-time solute transport adjoint equation subject to a properly posed set of boundary conditions. It can then be used to define optimal repository locations. The risk associated with selected sites can be evaluated by simulating an appropriate contaminant release history. The utility of the method is illustrated by means of analytical and numerical examples, which focus on the effect of fracture networks on the uncertainty of evaluated lifetime expectancy.
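
    The abstract does not reproduce the governing equation, but the backward-in-time formulation it refers to can be sketched as follows; the notation (porosity φ, Darcy flux q, dispersion tensor D, lifetime-expectancy density g_E) is assumed here rather than taken from the paper, and steady, divergence-free flow is presumed.

```latex
% Sketch only: a backward-in-time (adjoint) advection-dispersion equation of the
% kind used to compute the lifetime-expectancy PDF g_E(x,t); all symbols are
% assumptions, not quoted from the paper.
\[
  \phi\,\frac{\partial g_E}{\partial t}
  \;=\; \nabla\!\cdot\!\bigl(\phi\,\mathbf{D}\,\nabla g_E\bigr)
  \;+\; \mathbf{q}\cdot\nabla g_E ,
  \qquad
  \int_0^{\infty} g_E(\mathbf{x},t)\,\mathrm{d}t \;=\; 1 ,
\]
% with g_E prescribed on discharge (outflow) boundaries and a zero dispersive
% flux imposed on the remaining boundaries.
```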

  14. eXframe: reusable framework for storage, analysis and visualization of genomics experiments

    PubMed Central

    2011-01-01

    Background Genome-wide experiments are routinely conducted to measure gene expression, DNA-protein interactions and epigenetic status. Structured metadata for these experiments is imperative for a complete understanding of experimental conditions, to enable consistent data processing and to allow retrieval, comparison, and integration of experimental results. Even though several repositories have been developed for genomics data, only a few provide annotation of samples and assays using controlled vocabularies. Moreover, many of them are tailored for a single type of technology or measurement and do not support the integration of multiple data types. Results We have developed eXframe - a reusable web-based framework for genomics experiments that provides 1) the ability to publish structured data compliant with accepted standards 2) support for multiple data types including microarrays and next generation sequencing 3) query, analysis and visualization integration tools (enabled by consistent processing of the raw data and annotation of samples) and is available as open-source software. We present two case studies where this software is currently being used to build repositories of genomics experiments - one contains data from hematopoietic stem cells and another from Parkinson's disease patients. Conclusion The web-based framework eXframe offers structured annotation of experiments as well as uniform processing and storage of molecular data from microarray and next generation sequencing platforms. The framework allows users to query and integrate information across species, technologies, measurement types and experimental conditions. Our framework is reusable and freely modifiable - other groups or institutions can deploy their own custom web-based repositories based on this software. It is interoperable with the most important data formats in this domain. We hope that other groups will not only use eXframe, but also contribute their own useful modifications. PMID:22103807

  15. Development of DKB ETL module in case of data conversion

    NASA Astrophysics Data System (ADS)

    Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.

    2018-05-01

    Modern scientific experiments involve producing huge volumes of data, which requires new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a considerable amount of additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy nuclear physics (HENP) is a complex issue that requires unconventional solutions. One of the tasks is to integrate metadata from different repositories into a central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. The Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, aims to provide fast and easy access to significant information about the LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of pipelines arranging data flow from data sources to the main DKB storage. The data transformation process represented by a single pipeline can be considered as a sequence of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of the DKB.
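
    As an illustration of the pipeline idea described above, the following sketch chains independent transformation steps into a single data flow; the module names, record fields and storage stub are hypothetical and do not reproduce the actual DKB code.

```python
"""Minimal sketch (hypothetical module and function names, not the DKB
implementation) of an ETL pipeline built from independent transformation
steps, in the spirit of the data-integration subsystem described above."""

from typing import Any, Callable, Dict, Iterable

Record = Dict[str, Any]
Step = Callable[[Iterable[Record]], Iterable[Record]]


def aggregate(records: Iterable[Record]) -> Iterable[Record]:
    """Stage 1: gather raw metadata records from a source repository."""
    for rec in records:
        yield dict(rec)  # copy so later steps never mutate the source


def transform(records: Iterable[Record]) -> Iterable[Record]:
    """Stage 2: map source fields onto an assumed target data model."""
    for rec in records:
        yield {"dataset": rec.get("name"), "size_gb": rec.get("bytes", 0) / 1e9}


def load(records: Iterable[Record]) -> Iterable[Record]:
    """Stage 3: hand standardized records to the central storage (stubbed)."""
    for rec in records:
        print("storing", rec)  # placeholder for the real storage call
        yield rec


def run_pipeline(source: Iterable[Record], steps: Iterable[Step]) -> list:
    """Chain the steps so records stream from the source to the main storage."""
    stream = source
    for step in steps:
        stream = step(stream)
    return list(stream)


if __name__ == "__main__":
    raw = [{"name": "example_metadata_record", "bytes": 2_500_000_000}]
    run_pipeline(raw, [aggregate, transform, load])
```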

  16. Generation and stability of bentonite colloids at the bentonite/granite interface of a deep geological radioactive waste repository.

    PubMed

    Missana, Tiziana; Alonso, Ursula; Turrero, Maria Jesús

    2003-03-01

    The possible mechanisms of colloid generation at the near-field/far-field interface of a radioactive waste repository have been investigated by means of novel column experiments simulating the granite/bentonite boundary, under both dynamic and quasi-static water flow conditions. It has been shown that solid particles and colloids can be detached from the bulk and mobilised by the water flow. The higher the flow rate, the higher the concentration of particles found in the water, consistent with an erosion process. However, gel formation and the intrinsic tactoid structure of the clay play an important role in submicron particle generation even in the compacted clay and in a confined system. In fact, once a bentonite gel is formed in the regions where the clay is in contact with water, clay colloids can be generated even under quasi-static flow conditions. The potential relevance of these colloids to radionuclide transport has been studied by evaluating their stability in different chemical environments. The coagulation kinetics of natural bentonite colloids was experimentally studied as a function of ionic strength and pH, by means of time-resolved light scattering techniques. It has been shown that these colloids are very stable in low-salinity (approximately 1 × 10⁻³ M) and alkaline (pH ≥ 8) waters. Copyright 2002 Elsevier Science B.V.
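
    For readers less familiar with coagulation-kinetics terminology, the quantities typically extracted from such time-resolved light-scattering measurements can be summarized by the standard Smoluchowski relations below; these are textbook expressions added for orientation, not equations quoted from the paper.

```latex
% Illustrative only (standard colloid kinetics): early-stage Smoluchowski
% aggregation and the stability ratio W comparing fast (diffusion-limited)
% and slow (reaction-limited) coagulation.
\[
  \left.\frac{\mathrm{d}n}{\mathrm{d}t}\right|_{t \to 0} = -\,k_{11}\,n_0^{2},
  \qquad
  W = \frac{k_{11}^{\mathrm{fast}}}{k_{11}^{\mathrm{slow}}},
\]
% n: particle number concentration, n_0: its initial value, k_11: dimer-
% formation rate constant. A large W at low ionic strength and high pH
% corresponds to the high colloid stability reported above.
```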

  17. Reductive precipitation of neptunium on iron surfaces under anaerobic conditions

    NASA Astrophysics Data System (ADS)

    Yang, H.; Cui, D.; Grolimund, D.; Rondinella, V. V.; Brütsch, R.; Amme, M.; Kutahyali, C.; Wiss, A. T.; Puranen, A.; Spahiu, K.

    2017-12-01

    Reductive precipitation of the radiotoxic nuclide 237Np from nuclear waste on the surface of iron canister material under simulated deep repository conditions was investigated. Pristine polished as well as pre-corroded iron specimens were reacted with a deoxygenated solution containing 10-100 μM Np(V), with 10 mM NaCl and 2 mM NaHCO3 as background electrolytes. The reactivity of each of the two systems was investigated by analyzing the temporal evolution of the Np concentration in the reservoir. It was observed that pre-oxidized iron specimens with a 40 μm Fe3O4 corrosion layer are considerably more reactive with respect to the reduction and immobilization of aqueous Np(V) than pristine polished Fe(0) surfaces. The 237Np immobilized by the reactive iron surfaces was characterized by scanning electron microscopy as well as synchrotron-based micro-X-ray fluorescence and X-ray absorption spectroscopy. At the end of the experiments, a 5-8 μm thick Np-rich layer was observed to have formed on top of the Fe3O4 corrosion layer on the iron specimens. The findings from this work are significant in the context of performance assessments of deep geologic repositories using iron as high level radioactive waste (HLW) canister material and are also of relevance to removing pollutants from contaminated soil or groundwater aquifer systems.

  18. Towards data warehousing and mining of protein unfolding simulation data.

    PubMed

    Berrar, Daniel; Stahl, Frederic; Silva, Candida; Rodrigues, J Rui; Brito, Rui M M; Dubitzky, Werner

    2005-10-01

    The prediction of protein structure and the precise understanding of protein folding and unfolding processes remain among the greatest challenges in structural biology and bioinformatics. Computer simulations based on molecular dynamics (MD) are at the forefront of the effort to gain a deeper understanding of these complex processes. Currently, these MD simulations are usually on the order of tens of nanoseconds, generate a large amount of conformational data and are computationally expensive. More and more groups run such simulations and generate a myriad of data, which raises new challenges in managing and analyzing these data. Because of the vast range of proteins researchers want to study and simulate, the computational effort needed to generate data, the large data volumes involved, and the different types of analyses scientists need to perform, it is desirable to provide a public repository allowing researchers to pool and share protein unfolding data. To adequately organize, manage, and analyze the data generated by unfolding simulation studies, we designed a data warehouse system that is embedded in a grid environment to facilitate the seamless sharing of available computer resources and thus enable many groups to share complex molecular dynamics simulations on a more regular basis. To gain insight into the conformational fluctuations and stability of the monomeric forms of the amyloidogenic protein transthyretin (TTR), molecular dynamics unfolding simulations of the monomer of human TTR have been conducted. Trajectory data and meta-data of the wild-type (WT) protein and the highly amyloidogenic variant L55P-TTR represent the test case for the data warehouse. Web and grid services, especially pre-defined data mining services that can run on or 'near' the data repository of the data warehouse, are likely to play a pivotal role in the analysis of molecular dynamics unfolding data.

  19. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged in which different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of a particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  20. Modelling of processes occurring in deep geological repository - Development of new modules in the GoldSim environment

    NASA Astrophysics Data System (ADS)

    Vopálka, D.; Lukin, D.; Vokál, A.

    2006-01-01

    Three new modules modelling the processes that occur in a deep geological repository have been prepared in the GoldSim computer code environment (using its Transport Module). These modules help to understand the role of selected parameters in the near-field region of the final repository and to prepare a custom, more complex model of the repository behaviour. The source term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer controlled by an output boundary condition formulated with respect to the rate of water flow in the rock. The corrosion module describes corrosion of canisters made of carbon steel and transport of corrosion products in the near-field region. This module computes balance equations between dissolving species and species transported by diffusion and/or advection from the surface of a solid material. The diffusion module, which also includes a non-linear form of the interaction isotherm, can be used for the evaluation of small-scale diffusion experiments.
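
    The balance solved by such a source-term module can be sketched as follows; the notation is assumed for illustration and does not reproduce the authors' GoldSim implementation.

```latex
% Sketch of a source-term balance with first-order matrix degradation, decay
% and ingrowth, and a solubility cap (all symbols assumed here):
% M: remaining fuel-matrix inventory, k_deg: degradation rate, lambda_i: decay
% constant of nuclide i, f_i: its fraction in the matrix, V_w: water volume in
% the canister, c_i^sol: elemental solubility limit.
\[
  \frac{\mathrm{d}M}{\mathrm{d}t} = -\,k_{\mathrm{deg}}\,M,
  \qquad
  \frac{\mathrm{d}N_i}{\mathrm{d}t}
    = -\lambda_i N_i + \lambda_{i-1} N_{i-1} + f_i\,k_{\mathrm{deg}}\,M,
  \qquad
  c_i = \min\!\left(\frac{N_i^{\mathrm{aq}}}{V_w},\; c_i^{\mathrm{sol}}\right),
\]
% where c_i is the dissolved concentration available for diffusive migration
% into the surrounding bentonite layer.
```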

  1. High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung

    A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

  2. High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

    DOE PAGES

    Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung; ...

    2016-11-01

    A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

  3. Analytical model for screening potential CO2 repositories

    USGS Publications Warehouse

    Okwen, R.T.; Stewart, M.T.; Cunningham, J.A.

    2011-01-01

    Assessing potential repositories for geologic sequestration of carbon dioxide using numerical models can be complicated, costly, and time-consuming, especially when faced with the challenge of selecting a repository from a multitude of potential repositories. This paper presents a set of simple analytical equations (model), based on the work of previous researchers, that could be used to evaluate the suitability of candidate repositories for subsurface sequestration of carbon dioxide. We considered the injection of carbon dioxide at a constant rate into a confined saline aquifer via a fully perforated vertical injection well. The validity of the analytical model was assessed via comparison with the TOUGH2 numerical model. The metrics used in comparing the two models include (1) spatial variations in formation pressure and (2) vertically integrated brine saturation profile. The analytical model and TOUGH2 show excellent agreement in their results when similar input conditions and assumptions are applied in both. The analytical model neglects capillary pressure and the pressure dependence of fluid properties. However, simulations in TOUGH2 indicate that little error is introduced by these simplifications. Sensitivity studies indicate that the agreement between the analytical model and TOUGH2 depends strongly on (1) the residual brine saturation, (2) the difference in density between carbon dioxide and resident brine (buoyancy), and (3) the relationship between relative permeability and brine saturation. The results achieved suggest that the analytical model is valid when the relationship between relative permeability and brine saturation is linear or quasi-linear and when the irreducible saturation of brine is zero or very small. © 2011 Springer Science+Business Media B.V.
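
    The paper's equations are not reproduced in the abstract; as an indication of the type of closed-form screening expression involved, a single-phase, Cooper-Jacob-style estimate of the pressure buildup around a constant-rate injector in a confined aquifer is sketched below. The symbols are assumed here, and the published model additionally accounts for the two-phase CO2/brine region.

```latex
% Illustrative single-phase approximation (not the two-phase equations of the
% paper): pressure buildup at radius r and time t for constant-rate injection
% into a confined, homogeneous aquifer.
\[
  \Delta p(r,t) \;\approx\; \frac{Q\,\mu}{4\pi\,k\,b}\,
  \ln\!\left(\frac{2.25\,k\,t}{\phi\,\mu\,c_t\,r^{2}}\right),
\]
% Q: volumetric injection rate, mu: fluid viscosity, k: permeability,
% b: aquifer thickness, phi: porosity, c_t: total compressibility.
```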

  4. Shaping Solutions from Learnings in PAIs: A Blueprint

    ERIC Educational Resources Information Center

    Dosanjh, Nawtej; Jha, Pushkar P.

    2016-01-01

    Purpose: The paper outlines a portal that facilitates learning through sharing of experiences. This flow is between experience sharers and solution seekers in the domain of poverty alleviation interventions (PAIs). Practitioners working on PAIs are often confined to searching from within "lessons learned" repositories and also from…

  5. Provenance Storage, Querying, and Visualization in PBase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kianmajd, Parisa; Ludascher, Bertram; Missier, Paolo

    2015-01-01

    We present PBase, a repository for scientific workflows and their corresponding provenance information that facilitates the sharing of experiments among the scientific community. PBase is interoperable since it uses ProvONE, a standard provenance model for scientific workflows. Workflows and traces are stored in RDF, and with the support of SPARQL and the tree cover encoding, the repository provides a scalable infrastructure for querying the provenance data. Furthermore, through its user interface, it is possible to: visualize workflows and execution traces; visualize reachability relations within these traces; issue SPARQL queries; and visualize query results.
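
    The kind of query such a repository answers over its RDF-encoded traces can be illustrated with a short rdflib sketch; the file name, namespace URI and property names below are placeholders rather than the actual PBase/ProvONE schema.

```python
"""Sketch of querying an RDF-encoded workflow trace with SPARQL via rdflib.
The trace file, namespace URI and property names are hypothetical placeholders,
not the real PBase/ProvONE vocabulary."""

from rdflib import Graph

g = Graph()
g.parse("trace.ttl", format="turtle")  # hypothetical exported provenance trace

QUERY = """
PREFIX ex: <http://example.org/provone#>      # placeholder namespace
SELECT ?execution ?program
WHERE {
    ?execution ex:wasAssociatedWith ?program .  # placeholder property name
}
"""

# Print which program each recorded execution was associated with.
for row in g.query(QUERY):
    print(row.execution, row.program)
```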

  6. Making proteomics data accessible and reusable: Current state of proteomics databases and repositories

    PubMed Central

    Perez-Riverol, Yasset; Alpi, Emanuele; Wang, Rui; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2015-01-01

    Compared to other data-intensive disciplines such as genomics, public deposition and storage of MS-based proteomics data are still less developed due to, among other reasons, the inherent complexity of the data and the variety of data types and experimental workflows. In order to address this need, several public repositories for MS proteomics experiments have been developed, each with different purposes in mind. The most established resources are the Global Proteome Machine Database (GPMDB), PeptideAtlas, and the PRIDE database. Additionally, there are other useful (in many cases recently developed) resources such as ProteomicsDB, Mass Spectrometry Interactive Virtual Environment (MassIVE), Chorus, MaxQB, PeptideAtlas SRM Experiment Library (PASSEL), Model Organism Protein Expression Database (MOPED), and the Human Proteinpedia. In addition, the ProteomeXchange consortium has been recently developed to enable better integration of public repositories and the coordinated sharing of proteomics information, maximizing its benefit to the scientific community. Here, we will review each of the major proteomics resources independently and some tools that enable the integration, mining and reuse of the data. We will also discuss some of the major challenges and current pitfalls in the integration and sharing of the data. PMID:25158685

  7. Bio-repository of post-clinical test samples at the national cancer center hospital (NCCH) in Tokyo.

    PubMed

    Furuta, Koh; Yokozawa, Karin; Takada, Takako; Kato, Hoichi

    2009-08-01

    We established the Bio-repository at the National Cancer Center Hospital in October 2002. The main purpose of this article is to show the importance and usefulness of a bio-repository of post-clinical test samples, not only for translational cancer research but also for routine clinical oncology, by introducing our experience of setting up such a facility. Our basic concept of a post-clinical test sample is not that of left-over waste, but rather of frozen evidence of a patient's pathological condition at a particular point in time. We can decode most, if not all, of the laboratory data from a post-clinical test sample. As a result, the bio-repository is able to provide not only the samples but potentially all related laboratory data upon request. The areas of sample coverage are the following: sera after routine blood tests; sera after cross-match tests for transfusion; serum or plasma submitted by the physician at a clinically important time point for the patient; and samples collected by individual investigators. The formats of stored samples are plasma or serum, dried blood spot (DBS) and buffy coat. So far, 150,218 plasma or serum samples, 35,253 DBS and 536 buffy coats have been registered in our bio-repository system. We have arranged to provide samples to various concerned parties under strict legal and ethical agreements. Although the number of utilized samples was initially limited, inquiries for sample utilization are now increasing steadily from both research and clinical sources. Further efforts to increase the benefits of the repository are intended.

  8. Modelling of the reactive transport for rock salt-brine in geological repository systems based on improved thermodynamic database (Invited)

    NASA Astrophysics Data System (ADS)

    Müller, W.; Alkan, H.; Xie, M.; Moog, H.; Sonnenthal, E. L.

    2009-12-01

    The release and migration of toxic contaminants from the disposed wastes is one of the main issues in the long-term safety assessment of geological repositories. In the engineered and geological barriers around nuclear waste emplacements, chemical interactions between the components of the system may considerably affect the isolation properties. Because these chemical processes change the transport properties in the near and far field of a nuclear repository, transport modelling should also take the chemistry into account. Reactive transport modelling consists of two main components: a code that interactively combines the possible chemical reactions with thermo-hydrogeological processes, and a thermodynamic database supplying the parameters required to calculate the chemical reactions. In the last decade many thermo-hydrogeological codes were upgraded to include the modelling of chemical processes. TOUGHREACT is one of these codes. It is an extension of the well-known simulator TOUGH2 for modelling geoprocesses. The code was developed by LBNL (Lawrence Berkeley National Laboratory, Univ. of California) for the simulation of multi-phase transport of gas and liquid in porous media, including heat transfer. After the release of its first version in 1998, this code has been applied and improved many times in conjunction with considerations for nuclear waste emplacement. A recent version has been extended to calculate ion activities in concentrated salt solutions by applying the Pitzer model. In TOUGHREACT, the incorporated equation-of-state module ECO2N is applied as the EOS module for non-isothermal multiphase flow in a fluid system of H2O-NaCl-CO2. The partitioning of H2O and CO2 between liquid and gas phases is modelled as a function of temperature, pressure, and salinity. This module is applicable to waste repositories that are expected to generate CO2 or that originally contain CO2 in the fluid system. The enhanced TOUGHREACT uses an EQ3/6-formatted database for both Pitzer ion-interaction parameters and thermodynamic equilibrium constants. The reliability of the parameters is as important as the accuracy of the modelling tool. For this purpose the project THEREDA (www.thereda.de) was set up. The project aims at a comprehensive and internally consistent thermodynamic reference database for geochemical modelling of near- and far-field processes occurring in repositories for radioactive wastes in various host rock formations. In the framework of the project, all data necessary to perform thermodynamic equilibrium calculations at elevated temperature in the system of oceanic salts are under revision, and it is expected that related data will be available for download by 2010-03. In this paper the geochemical issues that can play an essential role in the transport of radioactive contaminants within and around waste repositories are discussed. Some generic calculations are given to illustrate the geochemical interactions and their probable effects on the transport properties around HLW emplacements and on CO2-generating and/or CO2-containing repository systems.
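
    For context, the Pitzer ion-interaction model mentioned above expresses the excess Gibbs energy of the solution as a virial expansion in the molalities; the general structure is recalled below (standard formulation, not a formula quoted from the paper), and the interaction coefficients appearing in it are the kind of parameters compiled in THEREDA.

```latex
% General structure of the Pitzer ion-interaction model (standard form):
% w_w is the mass of water, m_i molalities, I the ionic strength.
\[
  \frac{G^{\mathrm{ex}}}{w_w\,R\,T}
  \;=\; f(I)
  \;+\; \sum_i \sum_j \lambda_{ij}(I)\, m_i\, m_j
  \;+\; \sum_i \sum_j \sum_k \mu_{ijk}\, m_i\, m_j\, m_k ,
\]
% where f(I) is a Debye-Hueckel term, the lambda_ij(I) are ionic-strength-
% dependent binary interaction parameters and the mu_ijk are ternary parameters.
```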

  9. International Approaches for Nuclear Waste Disposal in Geological Formations: Report on Fifth Worldwide Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faybishenko, Boris; Birkholzer, Jens; Persoff, Peter

    2016-09-01

    The goal of the Fifth Worldwide Review is to document the evolution in the state of the art of approaches for nuclear waste disposal in geological formations since the Fourth Worldwide Review, released in 2006. The ten years since the previous Worldwide Review have seen major developments in a number of nations throughout the world pursuing geological disposal programs, both in preparing and in reviewing safety cases for the operational and long-term safety of proposed and operating repositories. The countries that are approaching implementation of geological disposal will increasingly focus on the feasibility of safely constructing and operating their repositories in the short and long term on the basis of existing regulations. The WWR-5 will also address a number of specific technical issues in safety case development, along with the interplay among stakeholder concerns, technical feasibility, engineering design issues, and operational and post-closure safety. Preparation and publication of the Fifth Worldwide Review on nuclear waste disposal facilitates assessing the lessons learned and developing future cooperation between the countries. The Report provides scientific and technical experience on preparing for and developing the scientific and technical bases for nuclear waste disposal in deep geologic repositories in terms of requirements, societal expectations and the adequacy of cases for long-term repository safety. The Chapters include potential issues that may arise as repository programs mature, and identify techniques that demonstrate the safety cases and aid in promoting and gaining societal confidence. The report will also be used to exchange experience with other fields of industry and technology in which concepts similar to the design and safety cases are applied, as well as to facilitate public perception and understanding of the safety of the disposal approaches relative to risks that may increase over long time frames in the absence of a successful implementation of final dispositioning.

  10. Basaltic Dike Propagation at Yucca Mountain, Nevada, USA

    NASA Astrophysics Data System (ADS)

    Gaffney, E. S.; Damjanac, B.; Warpinski, N. R.

    2004-12-01

    We describe simulations of the propagation of basaltic dikes using a 2-dimensional, incompressible hydrofracture code including the effects of the free surface with specific application to potential interactions of rising magma with a nuclear waste repository at Yucca Mountain, Nevada. As the leading edge of the dike approaches the free surface, confinement at the crack tip is reduced and the tip accelerates relative to the magma front. In the absence of either excess confining stress or excess gas pressure in the tip cavity, this leads to an increase of crack-tip velocity by more than an order of magnitude. By casting the results in nondimensional form, they can be applied to a wide variety of intrusive situations. When applied to an alkali basalt intrusion at the proposed high-level nuclear waste repository at Yucca Mountain, the results provide for a description of the subsurface phenomena. For magma rising at 1 m/s and dikes wider than about 0.5 m, the tip of the fissure would already have breached the surface by the time magma arrived at the nominal 300-m repository depth. An approximation of the effect of magma expansion on dike propagation is used to show that removing the restriction of an incompressible magma would result in even greater crack-tip acceleration as the dike approached the surface. A second analysis with a distinct element code indicates that a dike could penetrate the repository even during the first 2000 years after closure during which time heating from radioactive decay of waste would raise the minimum horizontal compressive stress above the vertical stress for about 80 m above and below the repository horizon. Rather than sill formation, the analysis indicates that increased pressure and dike width below the repository cause the crack tip to penetrate the horizon, but much more slowly than under in situ stress conditions. The analysis did not address the effects of either anisotropic joints or heat loss on this result.

  11. Geoengineering properties of potential repository units at Yucca Mountain, southern Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tillerson, J.R.; Nimick, F.B.

    1984-12-01

    The Nevada Nuclear Waste Storage Investigations (NNWSI) Project is currently evaluating volcanic tuffs at the Yucca Mountain site, located on and adjacent to the Nevada Test Site, for possible use as a host rock for a radioactive waste repository. The behavior of tuff as an engineering material must be understood to design, license, construct, and operate a repository. Geoengineering evaluations and measurements are being made to develop confidence in both the analysis techniques for thermal, mechanical, and hydrothermal effects and the supporting data base of rock properties. The analysis techniques and the data base are currently used for repository design, waste package design, and performance assessment analyses. This report documents the data base of geoengineering properties used in the analyses that aided the selection of the waste emplacement horizon and in analyses synopsized in the Environmental Assessment Report prepared for the Yucca Mountain site. The strategy used for the development of the data base relies primarily on data obtained in laboratory tests that are then confirmed in field tests. Average thermal and mechanical properties (and their anticipated variations) are presented. Based upon these data, analyses completed to date, and previous excavation experience in tuff, it is anticipated that existing mining technology can be used to develop stable underground openings and that repository operations can be carried out safely.

  12. Repository Drift Backfilling Demonstrator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Londe, I.; Dubois, J.Ph.; Bauer, C.

    2008-07-01

    The 'Backfilling Demonstrator' is one of the technological demonstrators developed by ANDRA in the framework of the feasibility studies for a geological repository for high-level long-lived waste (HL-LL waste) within a clay formation. The demonstrator concerns the standard and supporting backfills as defined in Andra's 2005 design. The standard backfill is intended to fill up almost all drifts of the underground repository in order to limit any deformation of the rock after the degradation of the drift lining. The supporting backfill only concerns a small portion of the volume to be backfilled, in order to counter the swelling pressure of the swelling clay contained in the sealing structures. The first objective of the demonstrator was to show the possibility of manufacturing a satisfactory backfill, in spite of the exiguity of the underground structures, and of reusing as much as possible the argillite muck. For the purpose of this experiment, the argillite muck was collected on Andra's work site for the implementation of an underground research laboratory. Still ongoing, the second objective is to follow up the long-term evolution of the backfill. Approximately 200 m³ of compacted backfill material have been placed in a large concrete tube simulating a repository drift. The standard backfill was manufactured exclusively with argillite. The supporting backfill was made from a mixture of argillite and sand. Operations were carried out mostly at Richwiller, close to Mulhouse, France. The objectives of the demonstrator were met: an application method was tested and proven satisfactory. The resulting dry densities are relatively high, although the moduli of deformation do not always reach the set goal. The selected objective for the demonstrator was a dry density corresponding to a relatively high compaction level (95% of the standard Proctor optimum [SPO]), for both pure argillite and the argillite-sand mixture. The plate-percussion compaction technique was used and proved satisfactory. The measured dry densities are higher than the 95%-SPO objective. The implementation rates remain very low due to the experimental conditions involved. The metal supply mode would need to be revised before any industrial application is contemplated. The Demonstrator Program started in August 2004 and is followed up today over the long term. With that objective in mind, sensors and a water-saturation system have been installed. (author)

  13. Long-Term Modeling of Coupled Processes in a Generic Salt Repository for Heat-Generating Nuclear Waste: Analysis of the Impacts of Halite Solubility Constraints

    NASA Astrophysics Data System (ADS)

    Blanco Martin, L.; Rutqvist, J.; Battistelli, A.; Birkholzer, J. T.

    2015-12-01

    Rock salt is a potential medium for the underground disposal of nuclear waste because it has several assets, such as its ability to creep and heal fractures and its water and gas tightness in the undisturbed state. In this research, we focus on disposal of heat-generating nuclear waste and we consider a generic salt repository with in-drift emplacement of waste packages and crushed salt backfill. As the natural salt creeps, the crushed salt backfill gets progressively compacted and an engineered barrier system is subsequently created [1]. The safety requirements for such a repository impose that long time scales be considered, during which the integrity of the natural and engineered barriers have to be demonstrated. In order to evaluate this long-term integrity, we perform numerical modeling based on state-of-the-art knowledge. Here, we analyze the impacts of halite dissolution and precipitation within the backfill and the host rock. For this purpose, we use an enhanced equation-of-state module of TOUGH2 that properly includes temperature-dependent solubility constraints [2]. We perform coupled thermal-hydraulic-mechanical modeling and we investigate the influence of the mentioned impacts. The TOUGH-FLAC simulator, adapted for large strains and creep, is used [3]. In order to quantify the importance of salt dissolution and precipitation on the effective porosity, permeability, pore pressure, temperature and stress field, we compare numerical results that include or disregard fluids of variable salinity. The sensitivity of the results to some parameters, such as the initial saturation within the backfill, is also addressed. References: [1] Bechthold, W. et al. Backfilling and Sealing of Underground Repositories for Radioactive Waste in Salt (BAMBUS II Project). Report EUR20621 EN: European Atomic Energy Community, 2004. [2] Battistelli A. Improving the treatment of saline brines in EWASG for the simulation of hydrothermal systems. Proceedings, TOUGH Symposium 2012, Lawrence Berkeley National Laboratory, Berkeley, California, Sept. 17-19, 2012. [3] Blanco-Martín L, Rutqvist J, Birkholzer JT. Long-term modelling of the thermal-hydraulic-mechanical response of a generic salt repository for heat generating nuclear waste. Eng Geol 2015;193:198-211. doi:10.1016/j.enggeo.2015.04.014.

  14. Performance assessments of nuclear waste repositories--A dialogue on their value and limitations

    USGS Publications Warehouse

    Ewing, Rodney C.; Tierney, Martin S.; Konikow, Leonard F.; Rechard, Rob P.

    1999-01-01

    Performance Assessment (PA) is the use of mathematical models to simulate the long-term behavior of engineered and geologic barriers in a nuclear waste repository; methods of uncertainty analysis are used to assess the effects of parametric and conceptual uncertainties associated with the model system upon the uncertainty in outcomes of the simulation. PA is required by the U.S. Environmental Protection Agency as part of its certification process for geologic repositories for nuclear waste. This paper is a dialogue to explore the value and limitations of PA. Two “skeptics” acknowledge the utility of PA in organizing the scientific investigations that are necessary for confident siting and licensing of a repository; however, they maintain that the PA process, at least as it is currently implemented, is an essentially unscientific process with shortcomings that may provide results of limited use in evaluating actual effects on public health and safety. Conceptual uncertainties in a PA analysis can be so great that results can be confidently applied only over short time ranges, the antithesis of the purpose behind long-term, geologic disposal. Two “proponents” of PA agree that performance assessment is unscientific, but only in the sense that PA is an engineering analysis that uses existing scientific knowledge to support public policy decisions, rather than an investigation intended to increase fundamental knowledge of nature; PA has different goals and constraints than a typical scientific study. The “proponents” describe an ideal, six-step process for conducting generalized PA, here called probabilistic systems analysis (PSA); they note that virtually all scientific content of a PA is introduced during the model-building steps of a PSA, and they contend that a PA based on simple but scientifically acceptable mathematical models can provide useful and objective input to regulatory decision makers. The value of the results of any PA must lie between these two views and will depend on the level of knowledge of the site, the degree to which models capture actual physical and chemical processes, the time over which extrapolations are made, and the proper evaluation of health risks attending implementation of the repository. The challenge is in evaluating whether the quality of the PA matches the needs of decision makers charged with protecting the health and safety of the public.

  15. The Fukushima Daiichi Accident Study Information Portal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shawn St. Germain; Curtis Smith; David Schwieder

    This paper presents a description of the Fukushima Daiichi Accident Study Information Portal. The Information Portal was created by the Idaho National Laboratory as part of a joint NRC and DOE project to assess the severe accident modeling capability of the MELCOR analysis code. The Fukushima Daiichi Accident Study Information Portal was created to collect, store, retrieve and validate information and data for use in reconstructing the Fukushima Daiichi accident. In addition to supporting the MELCOR simulations, the Portal will be the main DOE repository for all data, studies and reports related to the accident at the Fukushima Daiichi nuclear power station. The data is stored in a secured (password-protected and encrypted) repository that is searchable and accessible to researchers at diverse locations.

  16. myExperiment: a repository and social network for the sharing of bioinformatics workflows

    PubMed Central

    Goble, Carole A.; Bhagat, Jiten; Aleksejevs, Sergejs; Cruickshank, Don; Michaelides, Danius; Newman, David; Borkum, Mark; Bechhofer, Sean; Roos, Marco; Li, Peter; De Roure, David

    2010-01-01

    myExperiment (http://www.myexperiment.org) is an online research environment that supports the social sharing of bioinformatics workflows. These workflows are procedures consisting of a series of computational tasks using web services, which may be performed on data from its retrieval, integration and analysis, to the visualization of the results. As a public repository of workflows, myExperiment allows anybody to discover those that are relevant to their research, which can then be reused and repurposed to their specific requirements. Conversely, developers can submit their workflows to myExperiment and enable them to be shared in a secure manner. Since its release in 2007, myExperiment currently has over 3500 registered users and contains more than 1000 workflows. The social aspect to the sharing of these workflows is facilitated by registered users forming virtual communities bound together by a common interest or research project. Contributors of workflows can build their reputation within these communities by receiving feedback and credit from individuals who reuse their work. Further documentation about myExperiment including its REST web service is available from http://wiki.myexperiment.org. Feedback and requests for support can be sent to bugs@myexperiment.org. PMID:20501605
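
    A minimal way to exercise the REST service mentioned above is sketched here; the endpoint path and XML element names are assumptions based on the public documentation referenced in the abstract and may differ from the current API.

```python
"""Sketch of listing public workflows through the myExperiment REST service.
The endpoint path ("/workflows.xml") and element/attribute names are assumed
from the public documentation and may have changed since publication."""

import xml.etree.ElementTree as ET

import requests

resp = requests.get("http://www.myexperiment.org/workflows.xml", timeout=30)
resp.raise_for_status()

root = ET.fromstring(resp.content)
for wf in root.findall("workflow"):  # assumed element name
    print(wf.get("uri"), wf.text)
```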

  17. Assessment of the long-term durability of concrete in radioactive waste repositories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, A.; Goult, D.J.; Hearne, J.A.

    1986-01-01

    A preliminary assessment of the long-term durability of concrete in a repository sited in clay is presented. The assessment is based on recorded experience with concrete structures and on both field and laboratory studies. It is also supported by results from the examination of a concrete sample that had been buried in clay for 43 years. The engineering lifetime of a 1 m thick reinforced concrete slab with one face in contact with clay, and the way in which pH in the repository as a whole is likely to vary with time, have both been estimated from available data. The estimates indicate that engineering lifetimes of about 10³ years are expected (provided that sulfate-resisting cement is used) and that the pH is likely to remain above 10.5 for about 10⁶ years.

  18. Use of Groundwater Lifetime Expectancy for the Performance Assessment of Deep Geologic Radioactive Waste Repositories.

    NASA Astrophysics Data System (ADS)

    Cornaton, F.; Park, Y.; Normani, S.; Sudicky, E.; Sykes, J.

    2005-12-01

    Long-term solutions for the disposal of toxic wastes usually involve isolation of the wastes in a deep subsurface geologic environment. In the case of spent nuclear fuel, the safety of the host repository depends on two main barriers: the engineered barrier and the natural geological barrier. If radionuclide leakage occurs from the engineered barrier, the geological medium represents the ultimate barrier that is relied upon to ensure safety. Consequently, an evaluation of radionuclide travel times from the repository to the biosphere is critically important in a performance assessment analysis. In this study, we develop a travel time framework based on the concept of groundwater lifetime expectancy as a safety indicator. Lifetime expectancy characterizes the time radionuclides will spend in the subsurface after their release from the repository and prior to discharging into the biosphere. The probability density function of lifetime expectancy is computed throughout the host rock by solving the backward-in-time solute transport equation subject to a properly posed set of boundary conditions. It can then be used to define optimal repository locations. In a second step, the risk associated with selected sites can be evaluated by simulating an appropriate contaminant release history. The proposed methodology is applied in the context of a typical Canadian Shield environment. Based on a statistically generated three-dimensional network of fracture zones embedded in the granitic host rock, the sensitivity and uncertainty of lifetime expectancy to the hydraulic and dispersive properties of the fracture network, including the impact of conditioning via their surface expressions, are computed in order to demonstrate the utility of the methodology.

  19. MODPATH-LGR; documentation of a computer program for particle tracking in shared-node locally refined grids by using MODFLOW-LGR

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.

    2011-01-01

    The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for downloading from the World Wide Web from a U.S. Geological Survey software repository. The repository is accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.
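
    For orientation, MODPATH-family codes advance particles cell by cell with Pollock's semi-analytical scheme; the standard one-dimensional relations are recalled below (textbook form of the method, not text from the report), and the cited documentation should be consulted for the shared-node locally refined grid extensions.

```latex
% Pollock's semi-analytical particle tracking within a cell (standard method):
% face velocities v_x1, v_x2, cell width Delta x, particle position x_p.
\[
  v_x(x) \;=\; v_{x1} + A_x\,(x - x_1),
  \qquad
  A_x \;=\; \frac{v_{x2} - v_{x1}}{\Delta x},
  \qquad
  \Delta t_x \;=\; \frac{1}{A_x}\,\ln\!\frac{v_{x2}}{v_x(x_p)} ,
\]
% the particle exits through the face with the smallest positive exit time
% among the coordinate directions, and the segments are chained cell by cell.
```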

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Some of the major technical questions associated with the burial of radioactive high-level wastes in geologic formations are related to the thermal environments generated by the waste and the impact of this dissipated heat on the surrounding environment. The design of a high-level waste storage facility must be such that the temperature variations that occur do not adversely affect operating personnel and equipment. The objective of this investigation was to assist OWI by determining the thermal environment that would be experienced by personnel and equipment in a waste storage facility in salt. Particular emphasis was placed on determining the maximum floor and air temperatures with and without ventilation in the first 30 years after waste emplacement. The assumed facility design differs somewhat from those previously analyzed and reported, but many of the previous parametric surveys are useful for comparison. In this investigation a number of 2-dimensional and 3-dimensional simulations of the heat flow in a repository have been performed with the HEATING5 and TRUMP heat transfer codes. The representative repository constructs used in the simulations are described, as well as the computational models and computer codes. Results of the simulations are presented and discussed. Comparisons are made between the recent results and those from previous analyses. Finally, a summary of study limitations, comparisons, and conclusions is given.
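
    The conduction problem solved by codes such as HEATING5 and TRUMP can be written compactly as below; the exponentially decaying heat source is an idealization introduced here for illustration (the actual waste heat output is a sum of several decaying terms), and the symbols are assumed rather than taken from the report.

```latex
% Sketch of the transient conduction problem with a decaying heat source
% (illustrative form, not the report's model): rho*c_p volumetric heat
% capacity, k thermal conductivity, q_0 initial volumetric heat generation.
\[
  \rho\,c_p\,\frac{\partial T}{\partial t}
  \;=\; \nabla\!\cdot\!\bigl(k\,\nabla T\bigr) \;+\; q(t),
  \qquad
  q(t) \;=\; q_0\,e^{-\lambda t}.
\]
```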

  1. Push and pull models to manage patient consent and licensing of multimedia resources in digital repositories for case-based reasoning.

    PubMed

    Kononowicz, Andrzej A; Zary, Nabil; Davies, David; Heid, Jörn; Woodham, Luke; Hege, Inga

    2011-01-01

    Patient consents for the distribution of multimedia constitute a significant element of case-based repositories in medicine. A technical challenge is posed by the right of patients to withdraw permission to disseminate their images or videos. A technical mechanism for spreading information about changes in multimedia usage licenses is therefore sought. The authors gained their experience by developing and managing a large (>340 cases) repository of virtual patients within the European project eViP. The solution for dissemination of license status should reuse and extend existing metadata standards in medical education. Two methods, PUSH and PULL, are described, differing in the moment of update and in the division of responsibilities between the parties in the learning-object exchange process. The authors recommend usage of the PUSH scenario because it is better adapted to legal requirements in many countries. It needs to be stressed that the solution is based on mutual trust between the exchange partners and is therefore most appropriate for use in educational alliances and consortia. It is hoped that the proposed models for exchanging consent and licensing information will become a crucial part of the technical frameworks for building case-based repositories.
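
    The contrast between the two strategies can be illustrated with a short sketch; the classes, method names and media identifiers below are hypothetical and do not reproduce the eViP implementation.

```python
"""Illustrative sketch (hypothetical classes and method names, not the eViP
implementation) of the two consent-propagation strategies described above:
PUSH, where the source repository notifies consumers when consent is
withdrawn, and PULL, where consumers re-check the licence status at use time."""


class SourceRepository:
    def __init__(self):
        self.licenses = {}    # media id -> "granted" | "withdrawn"
        self.subscribers = [] # consumer repositories to notify (PUSH)

    def withdraw_consent(self, media_id):
        self.licenses[media_id] = "withdrawn"
        for consumer in self.subscribers:  # PUSH: immediate notification
            consumer.on_license_change(media_id, "withdrawn")

    def license_status(self, media_id):
        return self.licenses.get(media_id, "granted")


class ConsumerRepository:
    def __init__(self, source):
        self.source = source
        self.cache = {}

    def on_license_change(self, media_id, status):
        # PUSH callback: the source tells us about the change.
        self.cache[media_id] = status

    def can_display(self, media_id, pull=False):
        if pull:  # PULL: ask the source at the moment of use
            self.cache[media_id] = self.source.license_status(media_id)
        return self.cache.get(media_id, "granted") == "granted"


if __name__ == "__main__":
    src = SourceRepository()
    consumer = ConsumerRepository(src)
    src.subscribers.append(consumer)
    src.withdraw_consent("case-42/video-1")          # hypothetical media id
    print(consumer.can_display("case-42/video-1"))   # False, learned via PUSH
```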

  2. SNOMED CT module-driven clinical archetype management.

    PubMed

    Allones, J L; Taboada, M; Martinez, D; Lozano, R; Sobrido, M J

    2013-06-01

    To explore semantic search to improve management and user navigation in clinical archetype repositories. In order to support semantic searches across archetypes, an automated method based on SNOMED CT modularization is implemented to transform clinical archetypes into SNOMED CT extracts. Concurrently, query terms are converted into SNOMED CT concepts using the search engine Lucene. Retrieval is then carried out by matching query concepts with the corresponding SNOMED CT segments. A test collection of 16 clinical archetypes, comprising over 250 terms, and a subset of 55 clinical terms from two medical dictionaries, MediLexicon and MedlinePlus, were used to test our method. The keyword-based service supported by the OpenEHR repository offered us a benchmark to evaluate the enhancement of performance. In total, our approach reached 97.4% precision and 69.1% recall, providing a substantial improvement in recall (more than 70%) compared to the benchmark. Exploiting medical domain knowledge from ontologies such as SNOMED CT may overcome some limitations of keyword-based systems and thus improve the search experience of repository users. An automated approach based on ontology segmentation is an efficient and feasible way of supporting modeling, management and user navigation in clinical archetype repositories. Copyright © 2013 Elsevier Inc. All rights reserved.
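
    For readers comparing these figures with the keyword-based benchmark, the standard definitions are recalled below (not taken from the paper); 97.4% precision with 69.1% recall means that nearly all archetypes returned were relevant, while roughly seven in ten of the relevant archetypes were retrieved.

```latex
% Standard retrieval metrics (TP: true positives, FP: false positives,
% FN: false negatives), added here only as a reminder.
\[
  \mathrm{precision} \;=\; \frac{TP}{TP + FP},
  \qquad
  \mathrm{recall} \;=\; \frac{TP}{TP + FN}.
\]
```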

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit

    The modeling efforts in support of the field test planning conducted at LBNL leverage recent developments of tools for modeling coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. These are modeling capabilities that will be suitable for assisting in the design of the field experiment, especially related to multiphase flow processes coupled with mechanical deformations at high temperature. In this report, we first examine previous generic repository modeling results, focusing on the first 20 years, to investigate the expected evolution of the different processes that could be monitored in a full-scale heater experiment; we then present new results from ongoing modeling of the Thermal Simulation for Drift Emplacement (TSDE) experiment, a heater experiment on the in-drift emplacement concept at the Asse Mine, Germany, and provide an update on the ongoing model developments for modeling brine migration. LBNL also supported field test planning activities via contributions to and technical review of framework documents and test plans, as well as participation in workshops associated with field test planning.

  4. A large scale GIS geodatabase of soil parameters supporting the modeling of conservation practice alternatives in the United States

    USDA-ARS?s Scientific Manuscript database

    Water quality modeling requires across-scale support of combined digital soil elements and simulation parameters. This paper presents the unprecedented development of a large spatial scale (1:250,000) ArcGIS geodatabase coverage designed as a functional repository of soil-parameters for modeling an...

  5. Ligandbook: an online repository for small and drug-like molecule force field parameters.

    PubMed

    Domanski, Jan; Beckstein, Oliver; Iorga, Bogdan I

    2017-06-01

    Ligandbook is a public database and archive for force field parameters of small and drug-like molecules. It is a repository for parameter sets that are part of published work but are not easily available to the community otherwise. Parameter sets can be downloaded and immediately used in molecular dynamics simulations. The sets of parameters are versioned with full histories and carry unique identifiers to facilitate reproducible research. Text-based search on rich metadata and chemical substructure search allow precise identification of desired compounds or functional groups. Ligandbook enables the rapid set-up of reproducible molecular dynamics simulations of ligands and protein-ligand complexes. Ligandbook is available online at https://ligandbook.org and supports all modern browsers. Parameters can be searched and downloaded without registration, including access through a programmatic RESTful API. Deposition of files requires free user registration. Ligandbook is implemented in the PHP Symfony2 framework with TCL scripts using the CACTVS toolkit. oliver.beckstein@asu.edu or bogdan.iorga@cnrs.fr; contact@ligandbook.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.

  6. Influence of transitional volcanic strata on lateral diversion at Yucca Mountain, Nevada

    USGS Publications Warehouse

    Flint, Lorraine E.; Flint, Alan L.; Selker, John S.

    2003-01-01

    Natural hydraulic barriers that have been identified as possible means of laterally diverting water, and thereby reducing deep percolation through the waste storage area, exist at Yucca Mountain, Nevada, a potential high-level nuclear waste repository. Historical development of the conceptual model of lateral diversion has been limited by available field data, but numerical investigations presented the possibility of significant lateral diversion due to the presence of a thin, porous rock layer, the Paintbrush nonwelded tuffs. Analytical assessments of the influence of transitional changes in properties suggest that minimal lateral diversion is likely at Yucca Mountain. Numerical models, to this point, have not accounted for the gradual transition of properties or the existence of multiple layers that could inadvertently influence the simulation of lateral diversion as an artifact of numerical model discretization. Analyses were made of subsurface matric potential measurements, and comparisons were made of surface infiltration estimates with deeper percolation flux calculations based on chloride mass balance and on simulations of measured temperature profiles. These analyses suggest that insignificant lateral diversion has occurred above the repository horizon and that water generally moves vertically through the Paintbrush nonwelded tuffs.
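
    For reference, the chloride-mass-balance estimate of percolation flux mentioned above is commonly written as a simple steady-state balance (one common form; the notation is ours, not taken from the paper):

        q_{perc} \approx \frac{P \, C_{Cl,\mathrm{precip}}}{C_{Cl,\mathrm{pore\ water}}}

    where P is the long-term average precipitation (corrected for run-on and run-off), C_{Cl,precip} the effective chloride concentration delivered by precipitation and dry fallout, and C_{Cl,pore water} the chloride concentration measured in the pore water; lower pore-water chloride therefore implies a higher percolation flux.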

  7. Analogues to features and processes of a high-level radioactive waste repository proposed for Yucca Mountain, Nevada

    USGS Publications Warehouse

    Simmons, Ardyth M.; Stuckless, John S.; with a Foreword by Abraham Van Luik, U.S. Department of Energy

    2010-01-01

    Natural analogues are defined for this report as naturally occurring or anthropogenic systems in which processes similar to those expected to occur in a nuclear waste repository are thought to have taken place over time periods of decades to millennia and on spatial scales as much as tens of kilometers. Analogues provide an important temporal and spatial dimension that cannot be tested by laboratory or field-scale experiments. Analogues provide one of the multiple lines of evidence intended to increase confidence in the safe geologic disposal of high-level radioactive waste. Although the work in this report was completed specifically for Yucca Mountain, Nevada, as the proposed geologic repository for high-level radioactive waste under the U.S. Nuclear Waste Policy Act, the applicability of the science, analyses, and interpretations is not limited to a specific site. Natural and anthropogenic analogues have provided and can continue to provide value in understanding features and processes of importance across a wide variety of topics in addressing the challenges of geologic isolation of radioactive waste and also as a contribution to scientific investigations unrelated to waste disposal. Isolation of radioactive waste at a mined geologic repository would be through a combination of natural features and engineered barriers. In this report we examine analogues to many of the various components of the Yucca Mountain system, including the preservation of materials in unsaturated environments, flow of water through unsaturated volcanic tuff, seepage into repository drifts, repository drift stability, stability and alteration of waste forms and components of the engineered barrier system, and transport of radionuclides through unsaturated and saturated rock zones.

  8. PGOPHER in the Classroom and the Laboratory

    NASA Astrophysics Data System (ADS)

    Western, Colin

    2015-06-01

    PGOPHER is a general purpose program for simulating and fitting rotational, vibrational and electronic spectra. As it uses a graphical user interface, the basic operation is sufficiently straightforward to make it suitable for use in undergraduate practicals and computer based classes. This talk will present two experiments that have been in regular use by Bristol undergraduates for some years, based on the analysis of infra-red spectra of cigarette smoke and, for more advanced students, visible and near ultra-violet spectra of a nitrogen discharge and a hydrocarbon flame. For all of these the rotational structure is analysed and used to explore ideas of bonding. The talk will discuss the requirements for the apparatus and the support required. Ideas for other possible experiments and computer based exercises will also be presented, including a group exercise. The PGOPHER program is open source, and is available for Microsoft Windows, Apple Mac and Linux. It can be freely downloaded from the supporting website http://pgopher.chm.bris.ac.uk. The program does not require any installation process, so can be run on students' own machines or easily set up on classroom or laboratory computers. PGOPHER, a Program for Simulating Rotational, Vibrational and Electronic Structure, C. M. Western, University of Bristol, http://pgopher.chm.bris.ac.uk; PGOPHER version 8.0, C M Western, 2014, University of Bristol Research Data Repository, doi:10.5523/bris.huflggvpcuc1zvliqed497r2
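
    The rotational analysis referred to above typically rests on the standard rigid-rotor relations, quoted here for orientation rather than taken from the talk:

        E(J) = B\,J(J+1), \qquad B = \frac{h}{8\pi^{2} c I}

    so that adjacent rotational lines are spaced by roughly 2B; fitting B with a program such as PGOPHER yields the moment of inertia I and hence bond lengths, which is how the measured spectra connect to ideas of bonding.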

  9. Adsorption of gluconate and uranyl on C-S-H phases: Combination of wet chemistry experiments and molecular dynamics simulations for the binary systems

    NASA Astrophysics Data System (ADS)

    Androniuk, Iuliia; Landesman, Catherine; Henocq, Pierre; Kalinichev, Andrey G.

    2017-06-01

    As a first step in developing better molecular scale understanding of the effects of organic additives on the adsorption and mobility of radionuclides in cement under conditions of geological nuclear waste repositories, two complementary approaches, wet chemistry experiments and molecular dynamics (MD) computer simulations, were applied to study the sorption behaviour of two simple model systems: gluconate and uranyl on calcium silicate hydrate phases (C-S-H) - the principal mineral component of hardened cement paste (HCP). Experimental data on sorption and desorption kinetics and isotherms of adsorption for gluconate/C-S-H and U(VI)/C-S-H binary systems were collected and quantitatively analysed for C-S-H samples synthesised with various Ca/Si ratios (0.83, 1.0, 1.4) corresponding to various stages of HCP aging and degradation. Gluconate labelled with the ¹⁴C isotope was used in order to improve the sensitivity of the analytical detection technique (LSC) at particularly low concentrations (10⁻⁸-10⁻⁵ mol/L). There is a noticeable effect of Ca/Si ratio on the gluconate sorption on C-S-H, with stronger sorption at higher Ca/Si ratios. Sorption of organic anions on C-S-H is mediated by the presence of Ca²⁺ at the interface and strongly depends on the surface charge and Ca²⁺ concentration. In parallel, classical MD simulations of the same model systems were performed in order to identify specific surface sorption sites most actively involved in the sorption of gluconate and uranyl on C-S-H and to clarify molecular mechanisms of adsorption.

  10. Determination of In-situ Porosity and Investigation of Diffusion Processes at the Grimsel Test Site, Switzerland.

    NASA Astrophysics Data System (ADS)

    Biggin, C.; Ota, K.; Siittari-Kauppi, M.; Moeri, A.

    2004-12-01

    In the context of a repository for radioactive waste, 'matrix diffusion' is used to describe the process by which solute, flowing in distinct flow paths, penetrates the surrounding rock matrix. Diffusion into the matrix occurs in a connected system of pores or microfractures. Matrix diffusion provides a mechanism for greatly enlarging the area of rock surface in contact with advecting radionuclides, from that of the flow path surfaces (and infills), to a much larger portion of the bulk rock, and increases the global pore volume which can retard radionuclides. In terms of a repository safety assessment, demonstration of a significant depth of diffusion-accessible pore space may result in a substantial delay in the calculated release of any escaping radionuclides to the environment and a dramatic reduction in the resulting concentration released into the biosphere. For the last decade, Nagra has investigated in situ matrix diffusion at the Grimsel Test Site (GTS) in the Swiss Alps. The in situ investigations offer two distinct advantages over those performed in the lab, namely: 1. Lab-based determination of porosity and diffusivity can lead to an overestimation of matrix diffusion due to stress relief when the rock is sampled (which would overestimate the retardation in the geosphere); 2. Lab-based analysis usually examines small (cm scale) samples and cannot therefore account for any matrix heterogeneity over the hundreds or thousands of metres of a typical flow path. The in situ investigations described began with the Connected Porosity project, wherein a specially developed acrylic resin was injected into the rock matrix to fill the pore space and determine the depth of connected porosity. The resin was polymerised in situ and the entire rock mass removed by overcoring. The results indicated that lab-based porosity measurements may be two to three times higher than those obtained in situ. While the depth of accessible matrix from a water-conducting feature assumed in repository performance assessments is generally 1 to 10 cm, the results from the GTS in situ experiment suggested depths of several metres could be more appropriate. More recently, the Pore Space Geometry (PSG) experiment at the GTS has used a C-14 doped acrylic resin, combined with state-of-the-art digital beta autoradiography and fluorescence detection, to examine a larger area of rock for determination of porosity and the degree of connected pore space. Analysis is currently ongoing and the key findings will be reported in this paper. Starting at the GTS in 2005, the Long-term Diffusion (LTD) project will investigate such processes over spatial and temporal scales more relevant to a repository than traditional lab-based experiments. In the framework of this experiment, long-term (10 to 50 years) in situ diffusion experiments and resin injection experiments are planned to verify current models for matrix diffusion as a radionuclide retardation process. This paper will discuss the findings of the first two experiments and their significance to repository safety assessments before discussing the strategy for the future in relation to the LTD project.
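
    As a rough guide to the depths discussed above, the penetration of a solute into the matrix by one-dimensional Fickian diffusion scales as (an order-of-magnitude relation with illustrative numbers chosen by us, not results from the GTS programme):

        L_p \sim \sqrt{D_a\,t}, \qquad D_a = \frac{D_e}{\varepsilon R}

    where D_e is the effective diffusion coefficient, ε the diffusion-accessible porosity and R the retardation factor. For example, an apparent diffusivity D_a of 10⁻¹¹ m²/s acting over 10 years (about 3×10⁸ s) gives L_p on the order of 5 cm, which illustrates why decade-scale in situ experiments such as LTD are needed to probe the metre-scale accessible depths suggested by the resin work.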

  11. Pretest characterization of WIPP experimental waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, J.; Davis, H.; Drez, P.E.

    The Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, is an underground repository designed for the storage and disposal of transuranic (TRU) wastes from US Department of Energy (DOE) facilities across the country. The Performance Assessment (PA) studies for WIPP address compliance of the repository with applicable regulations, and include full-scale experiments to be performed at the WIPP site. These experiments are the bin-scale and alcove tests to be conducted by Sandia National Laboratories (SNL). Prior to conducting these experiments, the waste to be used in these tests needs to be characterized to provide data on the initial conditions for these experiments. This characterization is referred to as the Pretest Characterization of WIPP Experimental Waste, and is also expected to provide input to other programmatic efforts related to waste characterization. The purpose of this paper is to describe the pretest waste characterization activities currently in progress for the WIPP bin-scale waste, and to discuss the program plan and specific analytical protocols being developed for this characterization. The relationship between different programs and documents related to waste characterization efforts is also highlighted in this paper.

  12. Effects of microbial processes on gas generation under expected WIPP repository conditions: Annual report through 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Francis, A.J.; Gillow, J.B.

    1993-09-01

    Microbial processes involved in gas generation from degradation of the organic constituents of transuranic waste under conditions expected at the Waste Isolation Pilot Plant (WIPP) repository are being investigated at Brookhaven National Laboratory. These laboratory studies are part of the Sandia National Laboratories -- WIPP Gas Generation Program. Gas generation due to microbial degradation of representative cellulosic waste was investigated in short-term (< 6 months) and long-term (> 6 months) experiments by incubating representative paper (filter paper, paper towels, and tissue) in WIPP brine under initially aerobic (air) and anaerobic (nitrogen) conditions. Samples from the WIPP surficial environment and underground workings harbor gas-producing halophilic microorganisms, the activities of which were studied in short-term experiments. The microorganisms metabolized a variety of organic compounds including cellulose under aerobic, anaerobic, and denitrifying conditions. In long-term experiments, the effects of added nutrients (trace amounts of ammonium nitrate, phosphate, and yeast extract), of no added nutrients, and of nutrients plus excess nitrate on gas production from cellulose degradation were investigated.

  13. Geologic uncertainty in a regulatory environment: An example from the potential Yucca Mountain nuclear waste repository site

    NASA Astrophysics Data System (ADS)

    Rautman, C. A.; Treadway, A. H.

    1991-11-01

    Regulatory geologists are concerned with predicting the performance of sites proposed for waste disposal or for remediation of existing pollution problems. Geologic modeling of these sites requires large-scale expansion of knowledge obtained from very limited sampling. This expansion introduces considerable uncertainty into the geologic models of rock properties that are required for modeling the predicted performance of the site. One method for assessing this uncertainty is through nonparametric geostatistical simulation. Simulation can produce a series of equiprobable models of a rock property of interest. Each model honors measured values at sampled locations, and each can be constructed to emulate both the univariate histogram and the spatial covariance structure of the measured data. Computing a performance model for a number of geologic simulations allows evaluation of the effects of geologic uncertainty. A site may be judged acceptable if the number of failures to meet a particular performance criterion produced by these computations is sufficiently low. A site that produces too many failures may be either unacceptable or simply inadequately described. The simulation approach to addressing geologic uncertainty is being applied to the potential high-level nuclear waste repository site at Yucca Mountain, Nevada, U.S.A. Preliminary geologic models of unsaturated permeability have been created that reproduce observed statistical properties reasonably well. A spread of unsaturated groundwater travel times has been computed that reflects the variability of those geologic models. Regions within the simulated models exhibiting the greatest variability among multiple runs are candidates for obtaining the greatest reduction in uncertainty through additional site characterization.
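
    The failure-counting logic described above can be sketched as follows. This is a simplified illustration only: the study uses conditional, nonparametric (indicator) geostatistical simulation honoring borehole data, whereas the stand-in below draws unconditional lognormal permeability realizations and applies a hypothetical travel-time criterion.

        # Simplified Monte Carlo illustration: generate equiprobable property
        # realizations, evaluate a performance measure on each, and count failures.
        import numpy as np

        rng = np.random.default_rng(42)
        n_realizations, n_cells = 200, 50
        thickness = 10.0            # m per layer (assumed)
        gradient = 0.1              # unit hydraulic gradient (assumed)
        criterion_years = 1000.0    # hypothetical travel-time criterion

        failures = 0
        for _ in range(n_realizations):
            # One equiprobable permeability realization (m/s); lognormal stand-in
            k = rng.lognormal(mean=np.log(1e-8), sigma=1.0, size=n_cells)
            velocity = k * gradient                       # Darcy-type proxy
            travel_time_s = np.sum(thickness / velocity)  # layers in series
            if travel_time_s / 3.15e7 < criterion_years:
                failures += 1

        print(f"{failures}/{n_realizations} realizations fail the criterion")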

  14. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle II Software Group

    2017-10-01

    Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping the software stack coherent and of high quality, so that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
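
    The kind of automated quality gate described above can be sketched as a small driver script. This is not the Belle II buildbot configuration; the paths, tool selection, and options are assumptions chosen for illustration.

        # Run a few quality checks in sequence and report which ones failed.
        import subprocess
        import sys

        CHECKS = [
            ("cppcheck", ["cppcheck", "--enable=warning,style", "--error-exitcode=1", "src/"]),
            ("unit tests", ["python", "-m", "pytest", "tests/"]),
        ]

        failed = []
        for name, cmd in CHECKS:
            print(f"running {name}: {' '.join(cmd)}")
            result = subprocess.run(cmd)
            if result.returncode != 0:
                failed.append(name)

        if failed:
            print("FAILED checks:", ", ".join(failed))
            sys.exit(1)
        print("all checks passed")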

  15. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  16. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries.

    PubMed

    Wu, Jemma X; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P

    2016-07-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
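
    The multiple-testing correction referred to above can be illustrated with a generic Benjamini-Hochberg example on simulated p-values; this is not the authors' SwathXtend workflow, just the statistical step it describes.

        # FDR control across many protein-level comparisons.
        import numpy as np
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(0)
        # Hypothetical p-values: most proteins unchanged, a few truly different
        p_values = np.concatenate([rng.uniform(0, 1, 950), rng.uniform(0, 1e-4, 50)])

        reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
        print(f"{reject.sum()} of {len(p_values)} proteins significant after FDR control")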

  17. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries*

    PubMed Central

    Wu, Jemma X.; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P.

    2016-01-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. PMID:27161445

  18. Frictional Properties of Opalinus Clay: Implications for Nuclear Waste Storage

    NASA Astrophysics Data System (ADS)

    Orellana, L. F.; Scuderi, M. M.; Collettini, C.; Violay, M.

    2018-01-01

    The kaolinite-bearing Opalinus Clay (OPA) is the host rock proposed in Switzerland for disposal of radioactive waste. However, the presence of tectonic faults intersecting the OPA formation puts the long-term safety performance of the underground repository into question due to the possibility of earthquakes triggered by fault instability. In this paper, we study the frictional properties of the OPA shale. To do that, we have carried out biaxial direct shear experiments under conditions typical of nuclear waste storage. We have performed velocity steps (1-300 μm/s) and slide-hold-slide tests (1-3,000 s) on simulated fault gouge at different normal stresses (4-30 MPa). To establish the deformation mechanisms, we have analyzed the microstructures of the sheared samples through scanning electron microscopy. Our results show that the peak (μpeak) and steady-state (μss) friction coefficients range from 0.21 to 0.52 and 0.14 to 0.39, respectively, thus suggesting that OPA fault gouges are weak. The velocity dependence of friction indicates a velocity-strengthening regime, with a friction rate parameter (a - b) that decreases with normal stress. Finally, the zero healing values imply a lack of restrengthening during interseismic periods. Taken together, if an OPA fault reactivates, our experimental evidence favors aseismic slip behavior, making the nucleation of earthquakes difficult, and long-term weakness, resulting in stable fault creep over geological times. Based on the results, our study confirms the seismic safety of the OPA formation for a nuclear waste repository.
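
    For context, the reported (a - b) values sit in the standard rate-and-state framework, in which the steady-state friction coefficient depends on sliding velocity as (a textbook relation, not a formula from the paper):

        \mu_{ss}(V) = \mu_{0} + (a - b)\,\ln\!\left(\frac{V}{V_{0}}\right)

    so that (a - b) > 0, as measured here, means friction rises with slip rate (velocity strengthening) and perturbations tend to decay into stable creep rather than nucleate earthquakes.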

  19. A Global Repository for Planet-Sized Experiments and Observations

    NASA Technical Reports Server (NTRS)

    Williams, Dean; Balaji, V.; Cinquini, Luca; Denvil, Sebastien; Duffy, Daniel; Evans, Ben; Ferraro, Robert D.; Hansen, Rose; Lautenschlager, Michael; Trenham, Claire

    2016-01-01

    Working across U.S. federal agencies, international agencies, and multiple worldwide data centers, and spanning seven international network organizations, the Earth System Grid Federation (ESGF) allows users to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a system of geographically distributed peer nodes that are independently administered yet united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP) output used by the Intergovernmental Panel on Climate Change assessment reports. Data served by ESGF not only include model output (i.e., CMIP simulation runs) but also observational data from satellites and instruments, reanalyses, and generated images. Metadata summarize basic information about the data for fast and easy data discovery.

  20. Making proteomics data accessible and reusable: current state of proteomics databases and repositories.

    PubMed

    Perez-Riverol, Yasset; Alpi, Emanuele; Wang, Rui; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2015-03-01

    Compared to other data-intensive disciplines such as genomics, public deposition and storage of MS-based proteomics data are still less developed due to, among other reasons, the inherent complexity of the data and the variety of data types and experimental workflows. In order to address this need, several public repositories for MS proteomics experiments have been developed, each with different purposes in mind. The most established resources are the Global Proteome Machine Database (GPMDB), PeptideAtlas, and the PRIDE database. Additionally, there are other useful (in many cases recently developed) resources such as ProteomicsDB, Mass Spectrometry Interactive Virtual Environment (MassIVE), Chorus, MaxQB, PeptideAtlas SRM Experiment Library (PASSEL), Model Organism Protein Expression Database (MOPED), and the Human Proteinpedia. In addition, the ProteomeXchange consortium has been recently developed to enable better integration of public repositories and the coordinated sharing of proteomics information, maximizing its benefit to the scientific community. Here, we will review each of the major proteomics resources independently and some tools that enable the integration, mining and reuse of the data. We will also discuss some of the major challenges and current pitfalls in the integration and sharing of the data. © 2014 The Authors. PROTEOMICS published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Reactive transport simulations of alternative flow pathways in the ambient unsaturated zone at Yucca Mountain, Nevada

    NASA Astrophysics Data System (ADS)

    Browning, L.; Murphy, W.; Manepally, C.; Fedors, R.

    2003-04-01

    Uncertainties in simulated ambient system unsaturated zone flow could have a significant impact on performance evaluations of the proposed nuclear waste repository at Yucca Mountain, Nevada. In addition to determining variations in the quantity of water available to corrode engineered materials and transport radionuclides, model assumptions regarding flow pathways may significantly affect estimates of groundwater chemistry. The manner and extent to which groundwater compositions evolve along a flow pathway are determined mainly by thermohydrologic conditions, the types of reactive materials encountered, and the interaction times with those materials. Simulated groundwater compositions can thus vary significantly depending on whether or not the flow model includes lateral diversion of infiltrating waters, or preferential flow pathways in variably-saturated materials. To assist a regulatory review of a potential license application for a geologic repository for high-level waste, we developed a reactive transport model for the ambient hydrogeochemical system at Yucca Mountain. The model simulates two-phase, nonisothermal, advective and diffusive flow and transport through one-dimensional matrix and fracture continua (dual permeability) containing ten kinetically reactive hydrostratigraphic layers in the vicinity of the SD-9 borehole at Yucca Mountain. In this presentation, we describe how the model was used to evaluate alternative ambient unsaturated zone flow pathways proposed by the U.S. Department of Energy. This abstract is an independent product of the CNWRA and does not necessarily reflect the views or regulatory position of the NRC.

  2. Control of stacking loads in final waste disposal according to the borehole technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feuser, W.; Barnert, E.; Vijgen, H.

    1996-12-01

    The semihydrostatic model has been developed in order to assess the mechanical loads acting on heat-generating ILW(Q) and HTGR fuel element waste packages to be emplaced in vertical boreholes according to the borehole technique in underground rock salt formations. For the experimental validation of the theory, laboratory test stands reduced in scale are set up to simulate the bottom section of a repository borehole. A comparison of the measurement results with the data computed by the model, a correlation between the test stand results, and a systematic determination of material-typical crushed salt parameters in a separate research project will serve to derive a set of characteristic equations enabling a description of real conditions in a future repository.

  3. Evaluation of the predictive capability of coupled thermo-hydro-mechanical models for a heated bentonite/clay system (HE-E) in the Mont Terri Rock Laboratory

    DOE PAGES

    Garitte, B.; Shao, H.; Wang, X. R.; ...

    2017-01-09

    Process understanding and parameter identification using numerical methods based on experimental findings are key aspects of the international cooperative project DECOVALEX. Comparing the predictions from numerical models against experimental results increases confidence in the site selection and site evaluation process for a radioactive waste repository in deep geological formations. In the present phase of the project, DECOVALEX-2015, eight research teams have developed and applied models for simulating the in-situ heater experiment HE-E in the Opalinus Clay in the Mont Terri Rock Laboratory in Switzerland. The modelling task was divided into two study stages, related to prediction and interpretation of the experiment. A blind prediction of the HE-E experiment was performed using calibrated parameter values for the Opalinus Clay, derived from the modelling of another in-situ experiment (HE-D), and from the modelling of laboratory column experiments on MX80 granular bentonite and a sand/bentonite mixture. After publication of the experimental data, additional coupling functions were analysed and considered in the different models. Moreover, parameter values were varied to interpret the measured temperature, relative humidity and pore pressure evolution. The analysis of the predictive and interpretative results reveals the current state of understanding and predictability of coupled THM behaviours associated with geologic nuclear waste disposal in clay formations.

  4. Evaluation of the predictive capability of coupled thermo-hydro-mechanical models for a heated bentonite/clay system (HE-E) in the Mont Terri Rock Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garitte, B.; Shao, H.; Wang, X. R.

    Process understanding and parameter identification using numerical methods based on experimental findings are key aspects of the international cooperative project DECOVALEX. Comparing the predictions from numerical models against experimental results increases confidence in the site selection and site evaluation process for a radioactive waste repository in deep geological formations. In the present phase of the project, DECOVALEX-2015, eight research teams have developed and applied models for simulating the in-situ heater experiment HE-E in the Opalinus Clay in the Mont Terri Rock Laboratory in Switzerland. The modelling task was divided into two study stages, related to prediction and interpretation of the experiment. A blind prediction of the HE-E experiment was performed using calibrated parameter values for the Opalinus Clay, derived from the modelling of another in-situ experiment (HE-D), and from the modelling of laboratory column experiments on MX80 granular bentonite and a sand/bentonite mixture. After publication of the experimental data, additional coupling functions were analysed and considered in the different models. Moreover, parameter values were varied to interpret the measured temperature, relative humidity and pore pressure evolution. The analysis of the predictive and interpretative results reveals the current state of understanding and predictability of coupled THM behaviours associated with geologic nuclear waste disposal in clay formations.

  5. Excavation Induced Hydraulic Response of Opalinus Clay - Investigations of the FE-Experiment at the Mont Terri URL in Switzerland

    NASA Astrophysics Data System (ADS)

    Vogt, T.; Müller, H. R.; Garitte, B.; Sakaki, T.; Vietor, T.

    2013-12-01

    The Full-Scale Emplacement (FE) Experiment at the Mont Terri underground research laboratory in Switzerland is a full-scale heater test in a clay-rich formation (Opalinus Clay). Based on the Swiss disposal concept, it simulates the construction, emplacement, backfilling, and post-closure thermo-hydro-mechanical (THM) evolution of a spent fuel / vitrified high-level waste (SF / HLW) repository tunnel in a realistic manner. The main aim of this experiment is to investigate SF / HLW repository-induced THM coupled effects mainly in the host rock but also in the engineered barrier system (EBS), which consists of bentonite pellets and blocks. A further aim is to gather experience with full-scale tunnel construction and associated hydro-mechanical (HM) processes in the host rock. The entire experiment implementation (in a 50 m long gallery with approx. 3 m diameter) as well as the post-closure THM evolution will be monitored using a network of several hundred sensors (state-of-the-art sensors and measurement systems as well as fiber-optic sensors). The sensors are distributed in the host rock's near- and far-field, the tunnel lining, the EBS, and on the heaters. Heater emplacement and backfilling have not started yet; therefore, only the host rock instrumentation is installed at the moment and is currently generating data. We will present the instrumentation concept and rationale as well as the first monitoring results of the excavation and ventilation phase. In particular, we investigated the excavation-induced hydraulic response of the host rock. To this end, the spatiotemporal evolution of porewater-pressure time series was analyzed to get a better understanding of HM coupled processes during and after the excavation phase as well as the impact of anisotropic geomechanical and hydraulic properties of the clay-rich formation on its hydraulic behavior. Excavation-related investigations were complemented by inclinometer data used to characterize the non-elastic and time-dependent deformations. In addition, we evaluated the effect of drainage and suction processes during the ventilation phase on the pressure distribution in the host rock. Based on our results, the conceptual models of HM processes and hydraulic behavior of clay-rich formations during excavation and ventilation phases could be improved.

  6. Data Sharing and the Development of the Cleveland Clinic Statistical Education Dataset Repository

    ERIC Educational Resources Information Center

    Nowacki, Amy S.

    2013-01-01

    Examples are highly sought by both students and teachers. This is particularly true as many statistical instructors aim to engage their students and increase active participation. While simulated datasets are functional, they lack real perspective and the intricacies of actual data. In order to obtain real datasets, the principal investigator of a…

  7. Ciênsação: Gaining a Feeling for Sciences

    ERIC Educational Resources Information Center

    de Oliveira, Marcos Henrique Abreu; Fischer, Robert

    2017-01-01

    Ciênsação, an open online repository for hands-on experiments, has been developed to convince teachers in Latin America that science is best experienced first hand. Permitting students to experiment autonomously in small groups can be a challenging endeavour for educators in these countries. We analyse the reasons that cause hesitation of teachers…

  8. Monitoring water content in Opalinus Clay within the FE-Experiment: Test application of dielectric water content sensors

    NASA Astrophysics Data System (ADS)

    Sakaki, T.; Vogt, T.; Komatsu, M.; Müller, H. R.

    2013-12-01

    The spatiotemporal variation of water content in the near-field rock around repository tunnels for radioactive waste in clay formations is one of the essential quantities to be monitored for safety assessment in many waste disposal programs. Reliable measurements of water content are important not only for the understanding and prediction of coupled hydraulic-mechanical processes that occur during the tunnel construction and ventilation phases, but also for the understanding of coupled thermal-hydraulic-mechanical (THM) processes that take place in the host rock during the post-closure phase of a repository tunnel for spent fuel and high-level radioactive waste (SF/HLW). The host rock of the Swiss disposal concept for SF/HLW is the Opalinus Clay formation (approximately 175 million years old). To better understand the THM effects in a full-scale heater-engineered barrier-rock system in Opalinus Clay, a full-scale heater test, namely the Full-Scale Emplacement (FE) experiment, was initiated in 2010 at the Mont Terri underground rock laboratory in north-western Switzerland. The experiment is designed to simulate the THM evolution of a SF/HLW repository tunnel based on the Swiss disposal concept in a realistic manner during the construction, emplacement, backfilling, and post-closure phases. The entire experiment implementation (in a 50 m long gallery with approx. 3 m diameter) as well as the post-closure THM evolution will be monitored using a network of several hundred sensors. The sensors will be distributed in the host rock, the tunnel lining, the engineered barrier, which consists of bentonite pellets and blocks, and on the heaters. The excavation is completed and the tunnel is currently being ventilated. Measuring water content in partially saturated, clay-rich, high-salinity rock with a deformable grain skeleton is challenging. Therefore, we use the ventilation phase (before backfilling and heating) to examine the applicability of commercial water content sensors and to design custom-made TDR sensors. The focus of this study is mainly on dielectric-based commercial water content sensors. Unlike in the soils for which the sensors were originally designed, significantly more attention is required to install the sensors properly on rock (i.e., to ensure good contact between the sensor and the rock). The results will be used to select and design the instrumentation set-up for monitoring water content during the heating phase, where sensors have to withstand harsh conditions (high salinity, high temperature, high pressures, high clay content and long-term monitoring up to 10 years). The sensor tests are also beneficial in that the water content data generated during these tests provide insights into drainage processes after tunnel construction and seasonal water content variations in the near-field rock around the test gallery. We will present results from the tests and measurements performed during the first year.
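
    For dielectric sensors of this kind, the measured apparent permittivity is commonly converted to volumetric water content with the empirical Topp et al. (1980) calibration, sketched below. As the abstract stresses, a saline, clay-rich rock such as Opalinus Clay will likely require a site-specific calibration, so this is only a default starting point.

        # Topp et al. (1980) calibration: volumetric water content from the
        # apparent dielectric permittivity measured by a TDR or capacitance sensor.
        def topp_water_content(eps_a):
            """Return volumetric water content (m3/m3) from apparent permittivity."""
            return (-5.3e-2
                    + 2.92e-2 * eps_a
                    - 5.5e-4 * eps_a ** 2
                    + 4.3e-6 * eps_a ** 3)

        for eps in (5.0, 10.0, 20.0):
            print(eps, round(topp_water_content(eps), 3))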

  9. XRD Analysis of Cement Paste Samples Exposed to the Simulated Environment of a Deep Repository - 12239

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira, Eduardo G.A.; Marumo, Julio T.; Vicente, Roberto

    2012-07-01

    Portland cement materials are widely used as engineered barriers in repositories for radioactive waste. The capacity of such barriers to prevent the disposed radionuclides from entering the biosphere in the long term depends on the service life of those materials. Thus, the performance assessment of structural materials under the environmental conditions prevailing in the environs of repositories is a matter of interest. The durability of cement paste foreseen as backfill in a deep borehole for disposal of disused sealed radioactive sources is investigated in the development of the repository concept. Results are intended to be part of the body of evidence in the safety case of the proposed disposal technology. This paper presents the results of X-Ray Diffraction (XRD) analysis of cement paste exposed to varying temperatures and simulated groundwater after samples received the radiation dose that the cement paste will accumulate until complete decay of the radioactive sources. The XRD analysis of cement paste samples performed in this work revealed some differences among cement paste specimens that were subjected to different treatments. The cluster analysis of the results was able to group tested samples according to the applied treatments. Mineralogical differences, however, are tenuous and, apart from ettringite, are hardly observed. The absence of ettringite in all seven specimens kept in dry storage at high temperature can hardly be attributed to natural variations in the composition of the hydrated cement paste, because ettringite was observed in all tested specimens except those seven. Therefore this absence is certainly the result of the treatments and can be explained by the decomposition of ettringite. Although the decomposition temperature is about 110-120 °C, ettringite may initially decompose to meta-ettringite, an amorphous compound, above 50 °C in the absence of water. No influence of irradiation on the mineralogical composition was observed, whether the treatment was analyzed individually or in possible synergy with other treatments. However, the radiation dose to which the specimens were exposed is only a fraction of the dose that the cement paste will accumulate until complete decay of some sources. Therefore, in the short term, the conditions deemed to prevail in the repository environment may not influence the properties of cement paste at detectable levels. Under the conditions presented in this work, it is not possible to predict the long-term evolution of these properties. (authors)

  10. Role of natural analogs in performance assessment of nuclear waste repositories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sagar, B.; Wittmeyer, G.W.

    1995-09-01

    Mathematical models of the flow of water and transport of radionuclides in porous media will be used to assess the ability of deep geologic repositories to safely contain nuclear waste. These models must, in some sense, be validated to ensure that they adequately describe the physical processes occurring within the repository and its geologic setting. Inasmuch as the spatial and temporal scales over which these models must be applied in performance assessment are very large, validation of these models against laboratory and small-scale field experiments may be considered inadequate. Natural analogs may provide validation data that are representative of physico-chemical processes that occur over spatial and temporal scales as large as or larger than those relevant to repository design. The authors discuss the manner in which natural analog data may be used to increase confidence in performance assessment models and conclude that, while these data may be suitable for testing the basic laws governing flow and transport, there is insufficient control of boundary and initial conditions and forcing functions to permit quantitative validation of complex, spatially distributed flow and transport models. The authors also express their opinion that collecting adequate data from natural analogs will require devoting far greater resources to them than at present.

  11. iT2DMS: a Standard-Based Diabetic Disease Data Repository and its Pilot Experiment on Diabetic Retinopathy Phenotyping and Examination Results Integration.

    PubMed

    Wu, Huiqun; Wei, Yufang; Shang, Yujuan; Shi, Wei; Wang, Lei; Li, Jingjing; Sang, Aimin; Shi, Lili; Jiang, Kui; Dong, Jiancheng

    2018-06-06

    Type 2 diabetes mellitus (T2DM) is a common chronic disease, and the fragmented data collected through separate vendors make continuous management of DM patients difficult. The lack of standardization of these fragmented data also hampers further phenotyping based on them. Traditional T2DM data repositories only support data collection from T2DM patients, lack phenotyping ability, and rely on standalone database designs, limiting the secondary use of these valuable data. To address these issues, we proposed a novel, standards-based T2DM data repository framework. This repository can integrate data from various sources and serve as a standardized record for further data transfer and integration. Phenotyping was conducted according to clinical guidelines using a KNIME workflow. To evaluate the phenotyping performance of the proposed system, data were collected from the local community by healthcare providers and then processed by the phenotyping algorithms. The results indicated that the proposed system could detect DR cases with an average accuracy of about 82.8% and showed promise for addressing data fragmentation. The proposed system has integration and phenotyping capabilities that could support diabetes research in future studies.
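
    The guideline-driven phenotyping step can be pictured as a set of rules applied to structured examination results, along the following lines; the field names and rules here are hypothetical placeholders and are not the iT2DMS criteria or the KNIME implementation.

        # Toy rule-based diabetic retinopathy (DR) phenotyping on structured records.
        def has_dr_phenotype(record):
            """record: dict of structured examination results for one patient."""
            findings = set(record.get("fundus_findings", []))
            dr_signs = {"microaneurysm", "retinal_hemorrhage", "hard_exudate",
                        "neovascularization"}
            return bool(findings & dr_signs)

        patients = [
            {"id": "p001", "fundus_findings": ["microaneurysm", "hard_exudate"]},
            {"id": "p002", "fundus_findings": []},
        ]
        for p in patients:
            print(p["id"], "DR-positive" if has_dr_phenotype(p) else "DR-negative")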

  12. Thermal–hydraulic–mechanical modeling of a large-scale heater test to investigate rock salt and crushed salt behavior under repository conditions for heat-generating nuclear waste

    DOE PAGES

    Blanco-Martín, Laura; Wolters, Ralf; Rutqvist, Jonny; ...

    2016-04-28

    The Thermal Simulation for Drift Emplacement heater test is modeled with two simulators for coupled thermal-hydraulic-mechanical processes. Results from the two simulators are in very good agreement. The comparison between measurements and numerical results is also very satisfactory, regarding temperature, drift closure and rock deformation. Concerning backfill compaction, a parameter calibration through inverse modeling was performed due to insufficient data on crushed salt reconsolidation, particularly at high temperatures. We conclude that the two simulators investigated have the capabilities to reproduce the data available, which increases confidence in their use to reliably investigate disposal of heat-generating nuclear waste in saliferous geosystems.
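
    The parameter calibration through inverse modeling mentioned above amounts to minimizing the misfit between simulated and measured quantities. The sketch below does this for a deliberately simple, hypothetical drift-closure model with synthetic data; the actual work calibrates crushed-salt reconsolidation parameters inside coupled THM simulators.

        # Calibrate two parameters of a toy exponential closure model against
        # hypothetical drift-closure measurements via nonlinear least squares.
        import numpy as np
        from scipy.optimize import least_squares

        t_years = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
        closure_obs = np.array([0.04, 0.07, 0.11, 0.16, 0.21])   # m, hypothetical

        def simulate_closure(params, t):
            c_max, rate = params
            return c_max * (1.0 - np.exp(-rate * t))

        def residuals(params):
            return simulate_closure(params, t_years) - closure_obs

        fit = least_squares(residuals, x0=[0.3, 0.2], bounds=([0.0, 0.0], [1.0, 5.0]))
        print("calibrated c_max, rate:", fit.x)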

  13. Thermal–hydraulic–mechanical modeling of a large-scale heater test to investigate rock salt and crushed salt behavior under repository conditions for heat-generating nuclear waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanco-Martín, Laura; Wolters, Ralf; Rutqvist, Jonny

    The Thermal Simulation for Drift Emplacement heater test is modeled with two simulators for coupled thermal-hydraulic-mechanical processes. Results from the two simulators are in very good agreement. The comparison between measurements and numerical results is also very satisfactory, regarding temperature, drift closure and rock deformation. Concerning backfill compaction, a parameter calibration through inverse modeling was performed due to insufficient data on crushed salt reconsolidation, particularly at high temperatures. We conclude that the two simulators investigated have the capabilities to reproduce the data available, which increases confidence in their use to reliably investigate disposal of heat-generating nuclear waste in saliferous geosystems.

  14. MetaboLights: An Open-Access Database Repository for Metabolomics Data.

    PubMed

    Kale, Namrata S; Haug, Kenneth; Conesa, Pablo; Jayseelan, Kalaivani; Moreno, Pablo; Rocca-Serra, Philippe; Nainala, Venkata Chandrasekhar; Spicer, Rachel A; Williams, Mark; Li, Xuefei; Salek, Reza M; Griffin, Julian L; Steinbeck, Christoph

    2016-03-24

    MetaboLights is the first general purpose, open-access database repository for cross-platform and cross-species metabolomics research at the European Bioinformatics Institute (EMBL-EBI). Based upon the open-source ISA framework, MetaboLights provides Metabolomics Standard Initiative (MSI) compliant metadata and raw experimental data associated with metabolomics experiments. Users can upload their study datasets into the MetaboLights Repository. These studies are then automatically assigned a stable and unique identifier (e.g., MTBLS1) that can be used for publication reference. The MetaboLights Reference Layer associates metabolites with metabolomics studies in the archive and is extensively annotated with data fields such as structural and chemical information, NMR and MS spectra, target species, metabolic pathways, and reactions. The database is manually curated with no specific release schedules. MetaboLights is also recommended by journals for metabolomics data deposition. This unit provides a guide to using MetaboLights, downloading experimental data, and depositing metabolomics datasets using user-friendly submission tools. Copyright © 2016 John Wiley & Sons, Inc.

  15. Dynameomics: data-driven methods and models for utilizing large-scale protein structure repositories for improving fragment-based loop prediction.

    PubMed

    Rysavy, Steven J; Beck, David A C; Daggett, Valerie

    2014-11-01

    Protein function is intimately linked to protein structure and dynamics, yet experimentally determined structures frequently omit regions within a protein due to indeterminate data, which is often due to protein dynamics. We propose that atomistic molecular dynamics simulations provide a diverse sampling of biologically relevant structures for these missing segments (and beyond) to improve structural modeling and structure prediction. Here we make use of the Dynameomics data warehouse, which contains simulations of representatives of essentially all known protein folds. We developed novel computational methods to efficiently identify, rank and retrieve small peptide structures, or fragments, from this database. We also created a novel data model to analyze and compare large repositories of structural data, such as contained within the Protein Data Bank and the Dynameomics data warehouse. Our evaluation compares these structural repositories for improving loop predictions and analyzes the utility of our methods and models. Using a standard set of loop structures, containing 510 loops, 30 for each loop length from 4 to 20 residues, we find that the inclusion of Dynameomics structures in fragment-based methods improves the quality of the loop predictions without being dependent on sequence homology. Depending on loop length, ∼25-75% of the best predictions came from the Dynameomics set, resulting in lower main chain root-mean-square deviations for all fragment lengths using the combined fragment library. We also provide specific cases where Dynameomics fragments provide better predictions for NMR loop structures than fragments from crystal structures. Online access to these fragment libraries is available at http://www.dynameomics.org/fragments. © 2014 The Protein Society.
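
    The main-chain root-mean-square deviations quoted above are computed after optimal superposition of predicted and reference loop coordinates; a generic Kabsch-based implementation (with random placeholder coordinates, not code from Dynameomics) looks like this:

        # Optimal-superposition RMSD between two (N, 3) coordinate arrays (Kabsch).
        import numpy as np

        def rmsd_after_superposition(P, Q):
            P = P - P.mean(axis=0)
            Q = Q - Q.mean(axis=0)
            # Rotation minimizing RMSD via SVD of the covariance matrix P.T @ Q
            U, S, Vt = np.linalg.svd(P.T @ Q)
            d = np.sign(np.linalg.det(U @ Vt))
            R = U @ np.diag([1.0, 1.0, d]) @ Vt
            diff = P @ R - Q
            return np.sqrt((diff ** 2).sum() / len(P))

        rng = np.random.default_rng(1)
        loop_pred = rng.normal(size=(8, 3))
        loop_true = loop_pred + rng.normal(scale=0.2, size=(8, 3))
        print(round(rmsd_after_superposition(loop_pred, loop_true), 3))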

  16. Dynameomics: Data-driven methods and models for utilizing large-scale protein structure repositories for improving fragment-based loop prediction

    PubMed Central

    Rysavy, Steven J; Beck, David AC; Daggett, Valerie

    2014-01-01

    Protein function is intimately linked to protein structure and dynamics, yet experimentally determined structures frequently omit regions within a protein due to indeterminate data, which is often due to protein dynamics. We propose that atomistic molecular dynamics simulations provide a diverse sampling of biologically relevant structures for these missing segments (and beyond) to improve structural modeling and structure prediction. Here we make use of the Dynameomics data warehouse, which contains simulations of representatives of essentially all known protein folds. We developed novel computational methods to efficiently identify, rank and retrieve small peptide structures, or fragments, from this database. We also created a novel data model to analyze and compare large repositories of structural data, such as contained within the Protein Data Bank and the Dynameomics data warehouse. Our evaluation compares these structural repositories for improving loop predictions and analyzes the utility of our methods and models. Using a standard set of loop structures, containing 510 loops, 30 for each loop length from 4 to 20 residues, we find that the inclusion of Dynameomics structures in fragment-based methods improves the quality of the loop predictions without being dependent on sequence homology. Depending on loop length, ∼25–75% of the best predictions came from the Dynameomics set, resulting in lower main chain root-mean-square deviations for all fragment lengths using the combined fragment library. We also provide specific cases where Dynameomics fragments provide better predictions for NMR loop structures than fragments from crystal structures. Online access to these fragment libraries is available at http://www.dynameomics.org/fragments. PMID:25142412

  17. Modeling radionuclide migration from underground nuclear explosions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harp, Dylan Robert; Stauffer, Philip H.; Viswanathan, Hari S.

    2017-03-06

    The travel time of radionuclide gases to the ground surface in fractured rock depends on many complex factors. Numerical simulators are the most complete repositories of knowledge of the complex processes governing radionuclide gas migration to the ground surface, allowing us to verify conceptualizations of physical processes against observations and to forecast radionuclide gas travel times to the ground surface and isotopic ratios.

  18. ArrayWiki: an enabling technology for sharing public microarray data repositories and meta-analyses

    PubMed Central

    Stokes, Todd H; Torrance, JT; Li, Henry; Wang, May D

    2008-01-01

    Background A survey of microarray databases reveals that most of the repository contents and data models are heterogeneous (i.e., data obtained from different chip manufacturers), and that the repositories provide only basic biological keywords linking to PubMed. As a result, it is difficult to find datasets using research context or analysis parameter information beyond a few keywords. For example, to reduce the "curse-of-dimension" problem in microarray analysis, the number of samples is often increased by merging array data from different datasets. Knowing chip data parameters such as pre-processing steps (e.g., normalization, artefact removal, etc.), and knowing any previous biological validation of the dataset, is essential due to the heterogeneity of the data. However, most of the microarray repositories do not have meta-data information in the first place, and do not have a mechanism to add or insert this information. Thus, there is a critical need to create "intelligent" microarray repositories that (1) enable update of meta-data with the raw array data, and (2) provide standardized archiving protocols to minimize bias from the raw data sources. Results To address the problems discussed, we have developed a community-maintained system called ArrayWiki that unites disparate meta-data of microarray meta-experiments from multiple primary sources with four key features. First, ArrayWiki provides a user-friendly knowledge management interface in addition to a programmable interface using standards developed by Wikipedia. Second, ArrayWiki includes automated quality control processes (caCORRECT) and novel visualization methods (BioPNG, Gel Plots), which provide extra information about data quality unavailable in other microarray repositories. Third, it provides a user-curation capability through the familiar Wiki interface. Fourth, ArrayWiki provides users with simple text-based searches across all experiment meta-data, and exposes data to search engine crawlers (Semantic Agents) such as Google to further enhance data discovery. Conclusions Microarray data and meta information in ArrayWiki are distributed and visualized using a novel and compact data storage format, BioPNG. Also, they are open to the research community for curation, modification, and contribution. By making a small investment of time to learn the syntax and structure common to all sites running MediaWiki software, domain scientists and practitioners can all contribute to make better use of microarray technologies in research and medical practices. ArrayWiki is available at . PMID:18541053

  19. Energy Dissipation in Calico Hills Tuff due to Pore Collapse

    NASA Astrophysics Data System (ADS)

    Lockner, D. A.; Morrow, C. A.

    2008-12-01

    Laboratory tests indicate that the weakest portions of the Calico Hills tuff formation are at or near yield stress under in situ conditions and that the energy expended during incremental loading can be more than 90 percent irrecoverable. The Calico Hills tuff underlies the Yucca Mountain waste repository site at a depth of 400 to 500 m within the unsaturated zone. The formation is highly variable in the degree of both vitrification and zeolitization. Since 1980, a number of boreholes penetrated this formation to provide site characterization for the YM repository. In the past, standard strength measurements were conducted on core samples from the drillholes. However, a significant sampling bias occurred in that tests were preferentially conducted on highly vitrified, higher-strength samples. In fact, the most recent holes were drilled with a dry coring technique that would pulverize the weakest layers, leaving none of this material for testing. We have re-examined Calico Hills samples preserved at the YM Core Facility and selected the least vitrified examples (some cores exceeded 50 percent porosity) for mechanical testing. Three basic tests were performed: (i) hydrostatic crushing tests (to 350 MPa), (ii) standard triaxial deformation tests at constant effective confining pressure (to 70 MPa), and (iii) plane strain tests with initial conditions similar to in situ stresses. In all cases, constant pore pressure of 10 MPa was maintained using argon gas as a pore fluid and pore volume loss was monitored during deformation. The strongest samples typically failed along discrete fractures in agreement with standard Mohr-Coulomb failure. The weaker, high porosity samples, however, would fail by pure pore collapse or by a combined shear-induced compaction mechanism similar to failure mechanisms described for porous sandstones and carbonates. In the plane-strain experiments, energy dissipation due to pore collapse was determined for eventual input into dynamic wave calculations. These calculations will simulate ground accelerations at the YM repository due to propagation of high-amplitude compressional waves generated by scenario earthquakes. As an example, in one typical test on a sample with 43 percent starting porosity, an axial stress increase of 25 MPa resulted from 6 percent shortening and energy dissipation (due to grain crushing and pore collapse) of approximately 1.5×10^6 J/m^3. Under proper conditions, this dissipation mechanism could represent a significant absorption of radiated seismic energy and the possible shielding of the repository from extreme ground shaking.
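
    The reported dissipation figure can be checked with a simple rectangular stress-strain estimate; the sketch below reproduces the order of magnitude using only the values quoted in the abstract (the rectangular approximation is an assumption, not the authors' integration of the measured stress-strain curve).

        # Order-of-magnitude check of the reported dissipation density.
        delta_sigma = 25e6   # axial stress increase reported in the test, Pa
        delta_eps = 0.06     # axial shortening, dimensionless
        w_dissipated = delta_sigma * delta_eps     # J/m^3, rectangular estimate
        print(f"{w_dissipated:.1e} J/m^3")         # ~1.5e6 J/m^3, as reported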

  20. Particle tracking approach for transport in three-dimensional discrete fracture networks: Particle tracking in 3-D DFNs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makedonska, Nataliia; Painter, Scott L.; Bui, Quan M.

    The discrete fracture network (DFN) model is a method to mimic discrete pathways for fluid flow through a fractured low-permeable rock mass, and may be combined with particle tracking simulations to address solute transport. However, experience has shown that it is challenging to obtain accurate transport results in three-dimensional DFNs because of the high computational burden and difficulty in constructing a high-quality unstructured computational mesh on simulated fractures. We present a new particle tracking capability, which is adapted to control volume (Voronoi polygons) flow solutions on unstructured grids (Delaunay triangulations) on three-dimensional DFNs. The locally mass-conserving finite-volume approach eliminates mass balance-related problems during particle tracking. The scalar fluxes calculated for each control volume face by the flow solver are used to reconstruct a Darcy velocity at each control volume centroid. The groundwater velocities can then be continuously interpolated to any point in the domain of interest. The control volumes at fracture intersections are split into four pieces, and the velocity is reconstructed independently on each piece, which results in multiple groundwater velocities at the intersection, one for each fracture on each side of the intersection line. This technique enables detailed particle transport representation through a complex DFN structure. Verified for small DFNs, the new simulation capability enables numerical experiments on advective transport in large DFNs to be performed. As a result, we demonstrate this particle transport approach on a DFN model using parameters similar to those of crystalline rock at a proposed geologic repository for spent nuclear fuel in Forsmark, Sweden.
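
    The velocity-reconstruction step can be pictured as a small least-squares problem per control volume: the face fluxes provided by the flow solver constrain the components of a single cell-centred Darcy velocity. The sketch below is a generic illustration of that idea under assumed face geometry, not the authors' implementation.

        import numpy as np

        def reconstruct_cell_velocity(normals, fluxes, areas):
            """Least-squares Darcy velocity at a control-volume centroid.

            normals : (F, 3) outward unit normals of the F faces
            fluxes  : (F,)   volumetric flow rates through the faces [m^3/s]
            areas   : (F,)   face areas [m^2]
            """
            normals = np.asarray(normals, dtype=float)
            face_velocities = np.asarray(fluxes, dtype=float) / np.asarray(areas, dtype=float)
            # Solve  normals @ v ~= flux/area  in the least-squares sense.
            v, *_ = np.linalg.lstsq(normals, face_velocities, rcond=None)
            return v

        # Example: a unit-cube cell carrying a uniform velocity (1, 0.5, 0) m/s.
        n = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0], [0, -1, 0], [0, 0, 1], [0, 0, -1]], float)
        v_true = np.array([1.0, 0.5, 0.0])
        q = n @ v_true * 1.0                                  # flux = (v . n) * area, area = 1 m^2
        print(reconstruct_cell_velocity(n, q, np.ones(6)))    # -> [1.0, 0.5, 0.0]

    In a DFN the same reconstruction is performed within each fracture plane, and separately on each of the four sub-volumes at an intersection, which is what yields the multiple velocities described above.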

  1. Particle tracking approach for transport in three-dimensional discrete fracture networks: Particle tracking in 3-D DFNs

    DOE PAGES

    Makedonska, Nataliia; Painter, Scott L.; Bui, Quan M.; ...

    2015-09-16

    The discrete fracture network (DFN) model is a method to mimic discrete pathways for fluid flow through a fractured low-permeable rock mass, and may be combined with particle tracking simulations to address solute transport. However, experience has shown that it is challenging to obtain accurate transport results in three-dimensional DFNs because of the high computational burden and difficulty in constructing a high-quality unstructured computational mesh on simulated fractures. We present a new particle tracking capability, which is adapted to control volume (Voronoi polygons) flow solutions on unstructured grids (Delaunay triangulations) on three-dimensional DFNs. The locally mass-conserving finite-volume approach eliminates mass balance-related problems during particle tracking. The scalar fluxes calculated for each control volume face by the flow solver are used to reconstruct a Darcy velocity at each control volume centroid. The groundwater velocities can then be continuously interpolated to any point in the domain of interest. The control volumes at fracture intersections are split into four pieces, and the velocity is reconstructed independently on each piece, which results in multiple groundwater velocities at the intersection, one for each fracture on each side of the intersection line. This technique enables detailed particle transport representation through a complex DFN structure. Verified for small DFNs, the new simulation capability enables numerical experiments on advective transport in large DFNs to be performed. As a result, we demonstrate this particle transport approach on a DFN model using parameters similar to those of crystalline rock at a proposed geologic repository for spent nuclear fuel in Forsmark, Sweden.

  2. Evaluation of geotechnical monitoring data from the ESF North Ramp Starter Tunnel, April 1994 to June 1995. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    This report presents the results of instrumentation measurements and observations made during construction of the North Ramp Starter Tunnel (NRST) of the Exploratory Studies Facility (ESF). The information in this report was developed as part of the Design Verification Study, Section 8.3.1.15.1.8 of the Yucca Mountain Site Characterization Plan (DOE 1988). The ESF is being constructed by the US Department of Energy (DOE) to evaluate the feasibility of locating a potential high-level nuclear waste repository on lands within and adjacent to the Nevada Test Site (NTS), Nye County, Nevada. The Design Verification Studies are performed to collect information during construction of the ESF that will be useful for design and construction of the potential repository. Four experiments make up the Design Verification Study: Evaluation of Mining Methods, Monitoring Drift Stability, Monitoring of Ground Support Systems, and The Air Quality and Ventilation Experiment. This report describes Sandia National Laboratories' (SNL) efforts in the first three of these experiments in the NRST.

  3. Preliminary safety evaluation of an aircraft impact on a near-surface radioactive waste repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lo Frano, R.; Forasassi, G.; Pugliese, G.

    2013-07-01

    The aircraft impact accident has become very significant in the design of nuclear facilities, particularly after the tragic September 2001 events, which raised public concern about the potential damaging effects that the impact of a large civilian airplane could have on safety-relevant structures. The aim of this study is therefore to preliminarily evaluate the global response and the structural effects induced by the impact of a military or commercial airplane (currently considered a 'beyond design basis' event) on a near-surface radioactive waste (RW) disposal facility. The safety evaluation was carried out according to the international safety and design guidelines and in agreement with the stress test requirements for the security track. To achieve this purpose, a layout and a scheme of a possible near-surface repository, such as the El Cabril facility, were taken into account. In order to perform a preliminary but reliable analysis of such a large-scale structure and to determine the structural effects induced by these types of impulsive loads, a realistic, yet still tractable, numerical model with suitable material characteristics was implemented by means of FEM codes. In the structural analyses carried out, the RW repository was considered a 'robust' target, owing to its thick walls and main constitutive materials (steel and reinforced concrete). In addition, to adequately represent the dynamic response of the repository under impact, relevant physical phenomena (i.e. penetration, spalling, etc.) were simulated and analysed. The preliminary assessment of the effects induced by the dynamic/impulsive loads generally allowed verification of the residual strength capability of the repository considered. The obtained preliminary results highlighted a remarkable potential to withstand the impact of a military or large commercial aircraft, even in the presence of ongoing progressive concrete failure (some penetration and spalling of the concrete wall) in the impacted area. (authors)

  4. KiMoSys: a web-based repository of experimental data for KInetic MOdels of biological SYStems

    PubMed Central

    2014-01-01

    Background The kinetic modeling of biological systems is mainly composed of three steps that proceed iteratively: model building, simulation and analysis. In the first step, it is usually required to set initial metabolite concentrations and to assign kinetic rate laws, along with estimating parameter values from kinetic data through optimization when these are not known. Although the rapid development of high-throughput methods has generated much omics data, experimentalists present only a summary of the obtained results for publication; the experimental data files are not usually submitted to any public repository, or are simply not available at all. In order to automate the steps of building kinetic models as much as possible, there is a growing requirement in the systems biology community for easily exchanging data in combination with models, which is the main motivation for the development of KiMoSys. Description KiMoSys is a user-friendly platform that includes a public data repository of published experimental data, containing concentration data of metabolites and enzymes and flux data. It was designed to ensure data management, storage and sharing for a wider systems biology community. This community repository offers a web-based interface and upload facility to turn available data into publicly accessible, centralized and structured-format data files. Moreover, it compiles and integrates available kinetic models associated with the data. KiMoSys also integrates some tools to facilitate the kinetic model construction process of large-scale metabolic networks, especially when systems biologists perform computational research. Conclusions KiMoSys is a web-based system that integrates a public data and associated model(s) repository with computational tools, providing the systems biology community with a novel application facilitating data storage and sharing, thus supporting construction of ODE-based kinetic models and collaborative research projects. The web application, implemented using the Ruby on Rails framework, is freely available for web access at http://kimosys.org, along with its full documentation. PMID:25115331
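
    To make the "ODE-based kinetic model" workflow concrete, the sketch below simulates a single Michaelis-Menten step with SciPy; the rate law, parameter values and initial concentrations are illustrative placeholders for the kind of values that would come from a repository data file and a parameter-estimation step, and are not taken from KiMoSys.

        import numpy as np
        from scipy.integrate import solve_ivp

        def michaelis_menten(t, y, vmax, km):
            """Single irreversible enzymatic step S -> P with Michaelis-Menten kinetics."""
            s, p = y
            rate = vmax * s / (km + s)
            return [-rate, rate]

        # Illustrative parameters (vmax, km) and initial concentrations in mM.
        sol = solve_ivp(michaelis_menten, (0.0, 60.0), [10.0, 0.0],
                        args=(0.5, 2.0), dense_output=True)
        print(sol.y[:, -1])   # final substrate and product concentrations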

  5. A Linked Dataset of Medical Educational Resources

    ERIC Educational Resources Information Center

    Dietze, Stefan; Taibi, Davide; Yu, Hong Qing; Dovrolis, Nikolas

    2015-01-01

    Reusable educational resources have become increasingly important for enhancing learning and teaching experiences, particularly in the medical domain, where resources are especially expensive to produce. While interoperability across educational resources metadata repositories is still limited by the heterogeneity of metadata standards and interface…

  6. 2012 best practices for repositories collection, storage, retrieval, and distribution of biological materials for research international society for biological and environmental repositories.

    PubMed

    2012-04-01

    Third Edition. Printed with permission from the International Society for Biological and Environmental Repositories (ISBER) © 2011 ISBER All Rights Reserved Editor-in-Chief Lori D. Campbell, PhD Associate Editors Fay Betsou, PhD Debra Leiolani Garcia, MPA Judith G. Giri, PhD Karen E. Pitt, PhD Rebecca S. Pugh, MS Katherine C. Sexton, MBA Amy P.N. Skubitz, PhD Stella B. Somiari, PhD Individual Contributors to the Third Edition Jonas Astrin, Susan Baker, Thomas J. Barr, Erica Benson, Mark Cada, Lori Campbell, Antonio Hugo Jose Froes Marques Campos, David Carpentieri, Omoshile Clement, Domenico Coppola, Yvonne De Souza, Paul Fearn, Kelly Feil, Debra Garcia, Judith Giri, William E. Grizzle, Kathleen Groover, Keith Harding, Edward Kaercher, Joseph Kessler, Sarah Loud, Hannah Maynor, Kevin McCluskey, Kevin Meagher, Cheryl Michels, Lisa Miranda, Judy Muller-Cohn, Rolf Muller, James O'Sullivan, Karen Pitt, Rebecca Pugh, Rivka Ravid, Katherine Sexton, Ricardo Luis A. Silva, Frank Simione, Amy Skubitz, Stella Somiari, Frans van der Horst, Gavin Welch, Andy Zaayenga 2012 Best Practices for Repositories: Collection, Storage, Retrieval and Distribution of Biological Materials for Research INTERNATIONAL SOCIETY FOR BIOLOGICAL AND ENVIRONMENTAL REPOSITORIES (ISBER) INTRODUCTION The availability of high quality biological and environmental specimens for research purposes requires the development of standardized methods for collection, long-term storage, retrieval and distribution of specimens that will enable their future use. Sharing successful strategies for accomplishing this goal is one of the driving forces for the International Society for Biological and Environmental Repositories (ISBER). For more information about ISBER see www.isber.org. ISBER's Best Practices for Repositories (Best Practices) reflect the collective experience of its members and have received broad input from other repository professionals. Throughout this document effective practices are presented for the management of specimen collections and repositories. The term "Best Practice" is used in cases where a level of operation is indicated that is above the basic recommended practice or more specifically designates the most effective practice. It is understood that repositories in certain locations or with particular financial constraints may not be able to adhere to each of the items designated as "Best Practices". Repositories fitting into either of these categories will need to decide how they might best adhere to these recommendations within their particular circumstances. While adherence to ISBER Best Practices is strictly on a voluntary basis, it is important to note that some aspects of specimen management are governed by national/federal, regional and local regulations. The reader should refer directly to regulations for their national/federal, regional and local requirements, as appropriate. ISBER has strived to include terminology appropriate to the various specimen types covered under these practices, but here too, the reader should take steps to ensure the appropriateness of the recommendations to their particular repository type prior to the implementation of any new approaches. Important terms within the document are italicized when first used in a section and defined in the glossary. The ISBER Best Practices are periodically reviewed and revised to reflect advances in research and technology.
The third edition of the Best Practices builds on the foundation established in the first and second editions which were published in 2005 and 2008, respectively.

  7. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist

    PubMed Central

    Banerjee, Debjani; Bellesia, Giovanni; Daigle, Bernie J.; Douglas, Geoffrey; Gu, Mengyuan; Gupta, Anand; Hellander, Stefan; Horuk, Chris; Nath, Dibyendu; Takkar, Aviral; Lötstedt, Per; Petzold, Linda R.

    2016-01-01

    We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy to use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity. PMID:27930676
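
    The "discrete stochastic" engines referred to above belong to the kinetic Monte Carlo family introduced by Gillespie. As a minimal, generic illustration (not StochSS code and not one of its engines), the sketch below runs the Gillespie direct method for a birth-death process whose steady-state mean is k_birth/k_death.

        import random

        def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0, seed=1):
            """Gillespie direct method for a birth-death process:
            0 -> X at rate k_birth, and X -> 0 at rate k_death * x."""
            rng = random.Random(seed)
            t, x = 0.0, x0
            times, counts = [t], [x]
            while t < t_end:
                a_birth, a_death = k_birth, k_death * x
                a_total = a_birth + a_death
                if a_total == 0.0:
                    break
                t += rng.expovariate(a_total)                      # exponential waiting time
                x += 1 if rng.random() * a_total < a_birth else -1  # pick reaction channel
                times.append(t)
                counts.append(x)
            return times, counts

        _, counts = gillespie_birth_death()
        print(counts[-1])   # fluctuates around k_birth / k_death = 100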

  8. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist

    DOE PAGES

    Drawert, Brian; Hellander, Andreas; Bales, Ben; ...

    2016-12-08

    We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy to use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We also demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity.

  9. Data-intensive science gateway for rock physicists and volcanologists.

    NASA Astrophysics Data System (ADS)

    Filgueira, Rosa; Atkinson, Malcom; Bell, Andrew; Main, Ian; Boon, Steve; Meredith, Philp; Kilburn, Christopher

    2014-05-01

    Scientists have always shared data and mathematical models of the phenomena they study. Rock physics and Volcanology, as well as other solid-Earth sciences, have increasingly used Internet communications and computational renditions of their models for this purpose over the last two decades. Here we consider how to organise rock physics and volcanology data to open up opportunities for sharing and comparing both data from experiments, observations and model runs and analytic interpretations of these data. Our hypothesis is that if we facilitate productive information sharing across those communities by using a new science gateway, it will benefit the science. The proposed science gateway should take the first steps towards making existing research practices easier and facilitate new research. It will achieve this by supporting three major functions: 1) sharing data from laboratories and observatories, experimental facilities and models; 2) sharing models of rock fracture and methods for analysing experimental data; and 3) supporting recurrent operational tasks, such as data collection and model application in real time. We report initial work in two projects (NERC EFFORT and NERC CREEP-2) and experience with an early web-accessible prototype called the EFFORT gateway, where we are implementing such information sharing services for those projects. 1. Sharing data: In the EFFORT gateway, we are working on several facilities for sharing data: *Upload data: We have designed and developed a new adaptive data transfer Java tool called FAST (Flexible Automated Streaming Transfer) to upload experimental data and metadata periodically from laboratories to our repository. *Visualisation: As data are deposited in the repository, a visualisation of the accumulated data is made available for display in the Web portal. *Metadata and catalogues: The gateway uses a repository to hold all the data and a catalogue to hold all the corresponding metadata. 2. Sharing models and methods: The EFFORT gateway uses a repository to hold all of the models and a catalogue to hold the corresponding metadata. It provides several Web facilities for uploading, accessing and testing models. *Upload and store models: Through the gateway, researchers can upload as many models to the repository as they want. *Description of models: The gateway solicits and creates metadata for every model uploaded to store in the catalogue. *Search for models: Researchers can search the catalogue for models by using prepackaged SQL queries. *Access to models: Once a researcher has selected the model(s) to be used for analysing an experiment, they can be obtained from the gateway. *Services to test and run models: Once a researcher selects a model and the experimental data to which it should be applied, the gateway submits the corresponding computational job to a high-performance computing (HPC) resource, hiding technical details. Once a job is submitted to the HPC cluster, the results are displayed in the gateway in real time, catalogued and stored in the data repository, allowing further researcher-instigated operations to retrieve, inspect and aggregate results. *Services to write models: We have designed the VarPy library, an open-source toolbox that provides a Python framework for analysing volcanology and rock physics data. It provides several functions, which allow users to define their own workflows to develop models, analyses and visualizations. 3.
Recurrent Operations: We have started to introduce some recurrent operations: *Automated data upload: FAST provides a mechanism to automate the data upload. *Periodic activation of models: The EFFORT gateway allows researchers to run different models periodically against the experimental data that are being or have been uploaded.

  10. Modeling early in situ wetting of a compacted bentonite buffer installed in low permeable crystalline bedrock

    NASA Astrophysics Data System (ADS)

    Dessirier, B.; Frampton, A.; Fransson, À.; Jarsjö, J.

    2016-08-01

    The repository concept for geological disposal of spent nuclear fuel in Sweden and Finland is planned to be constructed in sparsely fractured crystalline bedrock and with an engineered bentonite buffer to embed the waste canisters. An important stage in such a deep repository is the postclosure phase following the deposition and the backfilling operations when the initially unsaturated buffer material gets hydrated by the groundwater delivered by the natural bedrock. We use numerical simulations to interpret observations on buffer wetting gathered during an in situ campaign, the Bentonite Rock Interaction Experiment, in which unsaturated bentonite columns were introduced into deposition holes in the floor of a 417 m deep tunnel at the Äspö Hard Rock Laboratory in Sweden. Our objectives are to assess the performance of state-of-the-art flow models in reproducing the buffer wetting process and to investigate to which extent dependable predictions of buffer wetting times and saturation patterns can be made based on information collected prior to buffer insertion. This would be important for preventing insertion into unsuitable bedrock environments. Field data and modeling results indicate the development of a de-saturated zone in the rock and show that in most cases, the presence or absence of fractures and flow heterogeneity are more important factors for correct wetting predictions than the total inflow. For instance, for an equal open-hole inflow value, homogeneous inflow yields much more rapid buffer wetting than cases where fractures are represented explicitly thus creating heterogeneous inflow distributions.
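
    Unsaturated flow models of the kind evaluated here rest on a water-retention relationship between suction and liquid saturation. The sketch below evaluates a van Genuchten retention curve with bentonite-like but purely illustrative parameters (alpha, n and the residual saturation are assumptions, not the BRIE calibration), showing why a highly compacted buffer remains far from saturation even at multi-MPa suctions.

        import numpy as np

        def van_genuchten_saturation(psi, alpha=1.0e-7, n=1.6, s_res=0.01):
            """Total liquid saturation for capillary suction psi [Pa] using the
            van Genuchten retention model (illustrative parameters)."""
            m = 1.0 - 1.0 / n
            se = (1.0 + (alpha * np.abs(psi)) ** n) ** (-m)   # effective saturation
            return s_res + (1.0 - s_res) * se

        # Bentonite-like behaviour: still far from saturation at tens of MPa suction.
        for suction in (1e5, 1e6, 1e7, 1e8):
            print(f"{suction:8.0e} Pa -> S = {van_genuchten_saturation(suction):.2f}")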

  11. Corrosion behavior of Alloy 22 in heated surface test conditions in simulated Yucca Mountain Nuclear Repository environment

    NASA Astrophysics Data System (ADS)

    Badwe, Sunil

    Under nuclear repository conditions, the nuclear waste package wall surfaces will be at elevated temperatures because of the heat generated by fission reactions within the waste. It is anticipated that the ground water may contain varying levels of anions such as chloride, nitrate, and sulfate picked up from the rocks. The ground waters could seep through the rock faults and drip onto the waste packages. The dripped water will evaporate due to the heat from the nuclear waste, leaving behind a concentrated brine that eventually becomes a dry salt deposit. The multi-ionic salts in the ground water are expected to be hygroscopic in nature. The next drop of water falling at the same place, or the humidity in the repository, will transform the hygroscopic salt deposit into a more concentrated brine. This cycle will continue for years and eventually a potentially corrosive brine will be formed on the waste package surface. Hence the waste package surface goes through alternating wet-dry cycles. These conditions indicate that the concentration and pH of the environment in the repository vary considerably. Conventional corrosion tests hardly simulate these varying environmental conditions. Hence there has been a need to develop an electrochemical test that could closely simulate the anticipated repository conditions stated above. In this research, a new electrochemical method, called Heated Surface Corrosion Testing (HSCT), was devised and tested. In conventional testing the electrolyte is heated, whereas in HSCT the working electrode is heated. The present study employs a temperature of 80°C, which is one possible waste package surface temperature. The new HSCT was validated by testing stainless steel type 304. The HSCT was observed to be more aggressive than the conventional tests. Initiation of pitting of SS 304 in chloride solution (pH 3) occurred at much shorter exposure times in the HSCT condition than the exposure time required for pitting in conventional testing. The reduced time to pitting demonstrated the capability of HSCT to impose more corrosive repository conditions. The stability of the passive film of stainless alloys under the hygroscopic salt layers could be determined using this technique. Alloy 22, a nickel-base Ni-22Cr-13Mo-3W alloy, has excellent corrosion resistance in oxidizing and reducing environments. The corrosion behavior of Alloy 22 was evaluated using the newly devised HSCT method in simulated acidified water (SAW), simulated concentrated water (SCW) and in pure chloride (pH 3 and 8) environments. In this method, the concentration of the environment varied with test duration. Alloy 22 was evaluated in four different heat-treated conditions: (a) mill annealed, (b) 610°C/1 h, representing Cr depletion, (c) 650°C/100 h, representing Mo+Cr depletion, and (d) 800°C/100 h, representing Mo depletion. The corrosion rate of mill annealed Alloy 22 was not affected by the continuous increase in ionic strength of the SAW (pH 3) environment. Passivation kinetics was faster with increasing concentration of the electrolytes. The major difference between the conventional test and HSCT was the aging characteristics of the passive film of Alloy 22. Cyclic polarization was carried out on Alloy 22 using the conventional ASTM G61 method and the HSCT method for comparison. The electrochemical response of Alloy 22 was the same whether the electrolyte or the electrode was heated.
The corrosion behavior of Alloy 22 was investigated in three different aged conditions using the HSCT approach in two different electrolytes. The thermal aging conditions of the specimens introduced depletion of chromium and molybdenum near the grain boundaries/phase boundaries. Long-term exposure tests (up to 850 h) were conducted in simulated acidified water (SAW, pH 3) and simulated concentrated water (SCW, pH 8) at 80°C. Corrosion potential, corrosion current and passive current decay exponent were determined at regular intervals. The specimens aged at 610°C/1 h and 800°C/100 h showed almost identical corrosion behaviors in the SAW environment. The specimen aged at 650°C/100 h showed lower corrosion resistance in the SAW environment, indicating the effect of the Mo-depletion profile near the grain boundaries. The specimen aged at 800°C for 100 h showed lower corrosion resistance in the SCW environment because of possible dissolution of the Mo-rich precipitates. Compared to the mill annealed condition, the aged specimens showed approximately an order of magnitude higher corrosion current in the SAW environment and similar corrosion currents in the SCW environment. Results also indicate that the passivity of Alloy 22, both in mill annealed and in aged conditions, was not hampered during dry-out/rewet cycles. The presence of nitrate and other oxyanions in the SAW environment reduced the charge required to form a stable passive film on the aged Alloy 22 samples as compared to the charge passed in the pure chloride pH 3 environments. The passive film of the aged Alloy 22 specimens exposed to pure chloride solutions showed predominantly n-type semiconducting behavior and the onset of p-type semiconductivity at higher potentials. The charge carrier density of the passive film of Alloy 22 varied in the range 1.5–9.0×10^21/cm^3. The predominant charge carriers could be oxygen vacancies. An increase in the charge carrier density was observed in the specimen aged at 800°C/100 h when exposed to the pH 3 solution as compared to exposure in the pH 8 solution. In summary, Alloy 22 sustained the heated surface corrosion test without any appreciable surface attack in the simulated repository environments as well as the more corrosive chloride environments.
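
    The study reports corrosion currents rather than penetration rates; a standard way to relate the two is Faraday's law in the ASTM G102 form, sketched below. The equivalent weight and density used here are rough, illustrative values for a Ni-Cr-Mo alloy, not numbers taken from the study.

        def corrosion_rate_mm_per_year(i_corr_uA_cm2, equiv_weight, density_g_cm3):
            """Convert a corrosion current density to a penetration rate using
            Faraday's law (ASTM G102 form): CR = 3.27e-3 * i_corr * EW / rho."""
            return 3.27e-3 * i_corr_uA_cm2 * equiv_weight / density_g_cm3

        # Illustrative values only (EW and density are approximate assumptions).
        print(corrosion_rate_mm_per_year(i_corr_uA_cm2=0.1, equiv_weight=25.0,
                                         density_g_cm3=8.69))   # ~9e-4 mm/yr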

  12. Use of a Dual-Structure Constitutive Model for Predicting the Long-Term Behavior of an Expansive Clay Buffer in a Nuclear Waste Repository

    DOE PAGES

    Vilarrasa, Víctor; Rutqvist, Jonny; Blanco Martin, Laura; ...

    2015-12-31

    Expansive soils are suitable as backfill and buffer materials in engineered barrier systems to isolate heat-generating nuclear waste in deep geological formations. The canisters containing nuclear waste would be placed in tunnels excavated at a depth of several hundred meters. The expansive soil should provide enough swelling capacity to support the tunnel walls, thereby reducing the impact of the excavation-damaged zone on the long-term mechanical and flow-barrier performance. In addition to their swelling capacity, expansive soils are characterized by accumulating irreversible strain on suction cycles and by effects of microstructural swelling on water permeability that for backfill or buffer materials can significantly delay the time it takes to reach full saturation. In order to simulate these characteristics of expansive soils, a dual-structure constitutive model that includes two porosity levels is necessary. The authors present the formulation of a dual-structure model and describe its implementation into a coupled fluid flow and geomechanical numerical simulator. The authors use the Barcelona Basic Model (BBM), which is an elastoplastic constitutive model for unsaturated soils, to model the macrostructure, and it is assumed that the strains of the microstructure, which are volumetric and elastic, induce plastic strain to the macrostructure. The authors tested and demonstrated the capabilities of the implemented dual-structure model by modeling and reproducing observed behavior in two laboratory tests of expansive clay. As observed in the experiments, the simulations yielded nonreversible strain accumulation with suction cycles and a decreasing swelling capacity with increasing confining stress. Finally, the authors modeled, for the first time using a dual-structure model, the long-term (100,000 years) performance of a generic heat-generating nuclear waste repository with waste emplacement in horizontal tunnels backfilled with expansive clay and hosted in a clay rock formation. The thermo-hydro-mechanical results of the dual-structure model were compared with those of the standard single-structure BBM. The main difference between the simulation results from the two models is that the dual-structure model predicted a time to fully saturate the expansive clay barrier on the order of thousands of years, whereas the standard single-structure BBM yielded a time on the order of tens of years. These examples show that a dual-structure model, such as the one presented here, is necessary to properly model the thermo-hydro-mechanical behavior of expansive soils.

  13. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  14. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  15. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  16. 10 CFR 63.131 - General requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA... conditions encountered and changes in those conditions during construction and waste emplacement operations...

  17. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  18. 10 CFR 63.131 - General requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA... conditions encountered and changes in those conditions during construction and waste emplacement operations...

  19. 10 CFR 63.131 - General requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA... conditions encountered and changes in those conditions during construction and waste emplacement operations...

  20. 10 CFR 63.131 - General requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA... conditions encountered and changes in those conditions during construction and waste emplacement operations...

  1. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  2. 10 CFR 63.131 - General requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA... conditions encountered and changes in those conditions during construction and waste emplacement operations...

  3. Writing Technical Reports for Simulation in Education for Health Professionals: Suggested Guidelines.

    PubMed

    Dubrowski, Adam; Alani, Sabrina; Bankovic, Tina; Crowe, Andrea; Pollard, Megan

    2015-11-02

    Simulation is an important training tool used in a variety of influential fields. However, development of simulation scenarios - the key component of simulation - occurs in isolation; sharing of scenarios is almost non-existent. This can make simulation use a costly task in terms of resources and time and can lead to redundancy of effort. To alleviate these issues, the goal is to strive for an open community of practice (CoP) surrounding simulation. To facilitate this goal, this report describes a set of guidelines for writing technical reports about simulation use for educating health professionals. Using an accepted set of guidelines will allow for homogeneity when building simulation scenarios and facilitate open sharing among simulation users. In addition to optimizing simulation efforts in institutions that are currently using simulation as an educational tool, the development of such a repository may have direct implications for developing countries, where simulation is only starting to be used systematically. Our project facilitates equivalent and global access to information, knowledge, and highest-caliber education - in this context, simulation - collectively, the building blocks of optimal healthcare.

  4. Writing Technical Reports for Simulation in Education for Health Professionals: Suggested Guidelines

    PubMed Central

    Alani, Sabrina; Bankovic, Tina; Crowe, Andrea; Pollard, Megan

    2015-01-01

    Simulation is an important training tool used in a variety of influential fields. However, development of simulation scenarios - the key component of simulation - occurs in isolation; sharing of scenarios is almost non-existent. This can make simulation use a costly task in terms of resources and time and can lead to redundancy of effort. To alleviate these issues, the goal is to strive for an open community of practice (CoP) surrounding simulation. To facilitate this goal, this report describes a set of guidelines for writing technical reports about simulation use for educating health professionals. Using an accepted set of guidelines will allow for homogeneity when building simulation scenarios and facilitate open sharing among simulation users. In addition to optimizing simulation efforts in institutions that are currently using simulation as an educational tool, the development of such a repository may have direct implications for developing countries, where simulation is only starting to be used systematically. Our project facilitates equivalent and global access to information, knowledge, and highest-caliber education - in this context, simulation - collectively, the building blocks of optimal healthcare. PMID:26677421

  5. Effects of experimental parameters on the sorption of cesium, strontium, and uranium from saline groundwaters onto shales: Progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, R.E.; Arnold, W.D.; Case, F.I.

    1988-11-01

    This report concerns an extension of the first series of experiments on the sorption properties of shales and their clay mineral components reported earlier. Studies on the sorption of cesium and strontium were carried out on samples of Chattanooga (Upper Dowelltown), Pierre, Green River Formation, Nolichucky, and Pumpkin Valley Shales that had been heated to 120°C in a 0.1-mol/L NaCl solution for periods up to several months and on samples of the same shales which had been heated to 250°C in air for six months, to simulate limiting scenarios in a HLW repository. To investigate the kinetics of the sorption process in shale/groundwater systems, strontium sorption experiments were done on unheated Pierre, Green River Formation, Nolichucky, and Pumpkin Valley Shales in a diluted, saline groundwater and in 0.03-mol/L NaHCO3, for periods of 0.25 to 28 days. Cesium sorption kinetics tests were performed on the same shales in a concentrated brine for the same time periods. The effect of the water/rock (W/R) ratio on sorption for the same combinations of unheated shales, nuclides, and groundwaters used in the kinetics experiments was investigated for a range of W/R ratios of 3 to 20 mL/g. Because of the complexity of the shale/groundwater interaction, a series of tests was conducted on the effects of contact time and W/R ratio on the pH of a 0.03-mol/L NaHCO3 simulated groundwater in contact with shales. 8 refs., 12 figs., 15 tabs.
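
    Batch sorption results of this kind are usually reduced to a distribution coefficient, Kd, computed from the drop in solution concentration and the water/rock ratio. The sketch below shows that reduction with hypothetical tracer concentrations; the numbers are not taken from the report.

        def distribution_coefficient(c_initial, c_equilibrium, solution_volume_mL, solid_mass_g):
            """Batch-sorption distribution coefficient Kd = (C0 - Ceq)/Ceq * V/m  [mL/g]."""
            return (c_initial - c_equilibrium) / c_equilibrium * solution_volume_mL / solid_mass_g

        # Hypothetical tracer concentrations for a W/R ratio of 10 mL/g.
        print(distribution_coefficient(c_initial=1.0e-6, c_equilibrium=2.0e-7,
                                       solution_volume_mL=10.0, solid_mass_g=1.0))  # 40 mL/g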

  6. Detecting unresolved binary stars in Euclid VIS images

    NASA Astrophysics Data System (ADS)

    Kuntzer, T.; Courbin, F.

    2017-10-01

    Measuring a weak gravitational lensing signal to the level required by the next generation of space-based surveys demands exquisite reconstruction of the point-spread function (PSF). However, unresolved binary stars can significantly distort the PSF shape. In an effort to mitigate this bias, we aim at detecting unresolved binaries in realistic Euclid stellar populations. We tested methods in numerical experiments where (I) the PSF shape is known to Euclid requirements across the field of view; and (II) the PSF shape is unknown. We drew simulated catalogues of PSF shapes for this proof-of-concept paper. Following the Euclid survey plan, the objects were observed four times. We propose three methods to detect unresolved binary stars. The detection is based on the systematic and correlated biases between exposures of the same object. One method is a simple correlation analysis, while the two others use supervised machine-learning algorithms (random forest and artificial neural network). In both experiments, we demonstrate the ability of our methods to detect unresolved binary stars in simulated catalogues. The performance depends on the level of prior knowledge of the PSF shape and the shape measurement errors. Good detection performances are observed in both experiments. Full complexity, in terms of the images and the survey design, is not included, but key aspects of a more mature pipeline are discussed. Finding unresolved binaries in objects used for PSF reconstruction increases the quality of the PSF determination at arbitrary positions. We show, using different approaches, that we are able to detect at least binary stars that are most damaging for the PSF reconstruction process. The code corresponding to the algorithms used in this work and all scripts to reproduce the results are publicly available from a GitHub repository accessible via http://lastro.epfl.ch/software
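
    The supervised branch of the detection scheme can be illustrated with a toy version of the same idea: train a random forest on per-exposure shape residuals, where binaries carry a correlated offset across the four dithers. The features, noise level and offset below are synthetic stand-ins, not the simulated Euclid catalogues used in the paper.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 2000
        is_binary = rng.integers(0, 2, n)

        # Toy features: ellipticity residuals (measured minus model PSF) in 4 exposures.
        # Single stars scatter around zero; binaries carry a correlated positive offset.
        noise = rng.normal(0.0, 0.02, size=(n, 4))
        offset = (0.03 * rng.random(n) * is_binary)[:, None]
        features = noise + offset

        X_train, X_test, y_train, y_test = train_test_split(
            features, is_binary, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
        print(f"toy hold-out accuracy: {clf.score(X_test, y_test):.2f}")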

  7. Transportation needs assessment: Emergency response section

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The transportation impacts of moving high level nuclear waste (HLNW) to a repository at Yucca Mountain in Nevada are of concern to the residents of the State as well as to the residents of other states through which the nuclear wastes might be transported. The projected volume of the waste suggests that shipments will occur on a daily basis for some period of time. This will increase the risk of accidents, including a catastrophic incident. Furthermore, as the likelihood of repository construction and operation and waste shipments increase, so will the attention given by the national media. This document is not to be construed as a willingness to accept the HLNW repository on the part of the State. Rather it is an initial step in ensuring that the safety and well-being of Nevada residents and visitors and the State's economy will be adequately addressed in federal decision-making pertaining to the transportation of HLNW into and across Nevada for disposal in the proposed repository. The Preferred Transportation System Needs Assessment identifies critical system design elements and technical and social issues that must be considered in conducting a comprehensive transportation impact analysis. Development of the needs assessment and the impact analysis is especially complex because of the absence of information and experience with shipping HLNW and because of the "low probability, high consequence" aspect of the transportation risk.

  8. Thermal-Hydraulic-Mechanical (THM) Coupled Simulation of a Generic Site for Disposal of High Level Nuclear Waste in Claystone in Germany: Exemplary Proof of the Integrity of the Geological Barrier

    NASA Astrophysics Data System (ADS)

    Massmann, J.; Ziefle, G.; Jobmann, M.

    2016-12-01

    Claystone is investigated as a potential host rock for the disposal of high level nuclear waste (HLW). In Germany, DBE TECHNOLOGY GmbH, the BGR and the "Gesellschaft für Anlagen- und Reaktorsicherheit (GRS)" are developing an integrated methodology for safety assessment within the R&D project "ANSICHT". One part herein is the demonstration of integrity of the geological barrier to ensure safe containment of radionuclides over 1 million years. The mechanical excavation of an underground repository, the exposure of claystone to atmospheric air, the insertion of backfill, buffer, sealing and supporting material as well as the deposition of heat producing waste constitute a significant disturbance of the underground system. A complex interacting scheme of thermal, hydraulic and mechanical (THM) processes can be expected. In this work, the finite element software OpenGeoSys, mainly developed at the "Helmholtz Centre for Environmental Research GmbH (UFZ)", is used to simulate and evaluate several THM coupled effects in the repository surroundings up to the surface over a time span of 1 million years. The numerical setup is based on two generic geological models inspired by the representative geology of potentially suitable regions in North and South Germany. The results give an insight into the evolution of temperature, pore pressure, stresses as well as deformation and enable statements concerning the extent of the significantly influenced area. One important effect among others is the temperature driven change in the densities of the solid and liquid phase and its influence on the stress field. In a further step, integrity criteria have been quantified, based on specifications of the German federal ministry of the environment. The exemplary numerical evaluation of these criteria demonstrates how numerical simulations can be used to prove the integrity of the geological barrier and detect potential vulnerabilities. [Figure caption: Calculated zone of increased temperature around a generic repository of HLW in a representative geological setting, 1000 years after emplacement of HLW.]
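
    A back-of-the-envelope estimate shows why heating matters for the stress field: if the rock around the emplacement tunnels cannot expand, thermal expansion converts directly into stress. The sketch below uses the standard fully confined linear-elastic bound with illustrative claystone-like parameters, which are assumptions rather than the ANSICHT model values.

        def confined_thermal_stress(delta_T, youngs_modulus_Pa, linear_expansion_per_K, poisson_ratio):
            """Upper-bound thermal stress for a fully confined, linear-elastic rock:
            sigma = E * alpha * dT / (1 - 2*nu)."""
            return youngs_modulus_Pa * linear_expansion_per_K * delta_T / (1.0 - 2.0 * poisson_ratio)

        # Illustrative claystone-like parameters (not the ANSICHT model values).
        sigma = confined_thermal_stress(delta_T=50.0, youngs_modulus_Pa=5e9,
                                        linear_expansion_per_K=1.0e-5, poisson_ratio=0.3)
        print(f"{sigma / 1e6:.1f} MPa")   # ~6 MPa of additional compressive stress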

  9. Personalized reminiscence therapy M-health application for patients living with dementia: Innovating using open source code repository.

    PubMed

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

    Dementia is an illness that causes marked disability among elderly individuals. At times, patients living with dementia also experience non-cognitive symptoms, including hallucinations, delusional beliefs, emotional lability, sexualized behaviours and aggression. According to the National Institute of Clinical Excellence (NICE) guidelines, non-pharmacological techniques are typically the first-line option prior to the consideration of adjuvant pharmacological options. Reminiscence and music therapy are thus viable options. Lazar et al. [3] previously performed a systematic review of the use of technology to deliver reminiscence-based therapy to individuals living with dementia and highlighted that technology does have benefits in the delivery of reminiscence therapy. However, to date, there has been a paucity of M-health innovations in this area. In addition, most current innovations are not personalized for each person living with dementia. Prior research has highlighted the utility of open source repositories in bioinformatics studies. The authors explain how they made use of an open source code repository in the development of a personalized M-health reminiscence therapy innovation for patients living with dementia. The availability of open source code repositories has changed the way healthcare professionals and developers develop smartphone applications today. Conventionally, a long iterative process is needed in the development of a native application, mainly because of the need for native programming and coding, especially if the application needs to have interactive features or features that can be personalized. Such repositories enable the rapid and cost-effective development of applications. Moreover, developers are also able to further innovate, as less time is spent in the iterative process.

  10. Grid Application Meta-Repository System: Repository Interconnectivity and Cross-domain Application Usage in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen

    Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture, which allows any Grid Application Repository (GAR) to be connected to the system independent of their underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services and other OAI (Open Archive Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VM) and therefore they can be run in their native environments and can easily be deployed on virtualized infrastructures allowing interoperability with new generation technologies such as cloud computing, application-on-demand, automatic service/application deployments and automatic VM generation.
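
    Because connected repositories are exposed in an OAI-compliant way, any OAI-PMH harvester can enumerate their records. The sketch below shows a generic one-page harvest of Dublin Core titles; the base URL is hypothetical, and whether GAMRS publishes exactly this endpoint is an assumption.

        import urllib.parse
        import urllib.request
        import xml.etree.ElementTree as ET

        OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
        DC_NS = "{http://purl.org/dc/elements/1.1/}"

        def harvest_titles(base_url, metadata_prefix="oai_dc"):
            """List record titles from an OAI-PMH endpoint (one page, no resumption token)."""
            query = urllib.parse.urlencode({"verb": "ListRecords",
                                            "metadataPrefix": metadata_prefix})
            with urllib.request.urlopen(f"{base_url}?{query}") as response:
                root = ET.fromstring(response.read())
            return [title.text
                    for record in root.iter(f"{OAI_NS}record")
                    for title in record.iter(f"{DC_NS}title")]

        # Hypothetical endpoint; any OAI-compliant repository answers the same verbs.
        # print(harvest_titles("https://example.org/gamrs/oai"))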

  11. The Geodetic Seamless Archive Centers Service Layer: A System Architecture for Federating Geodesy Data Repositories

    NASA Astrophysics Data System (ADS)

    McWhirter, J.; Boler, F. M.; Bock, Y.; Jamason, P.; Squibb, M. B.; Noll, C. E.; Blewitt, G.; Kreemer, C. W.

    2010-12-01

    Three geodesy Archive Centers, Scripps Orbit and Permanent Array Center (SOPAC), NASA's Crustal Dynamics Data Information System (CDDIS) and UNAVCO are engaged in a joint effort to define and develop a common Web Service Application Programming Interface (API) for accessing geodetic data holdings. This effort is funded by the NASA ROSES ACCESS Program to modernize the original GPS Seamless Archive Centers (GSAC) technology which was developed in the 1990s. A new web service interface, the GSAC-WS, is being developed to provide uniform and expanded mechanisms through which users can access our data repositories. In total, our respective archives hold tens of millions of files and contain a rich collection of site/station metadata. Though we serve similar user communities, we currently provide a range of different access methods, query services and metadata formats. This leads to a lack of consistency in the user's experience and a duplication of engineering efforts. The GSAC-WS API and its reference implementation in an underlying Java-based GSAC Service Layer (GSL) support metadata and data queries into site/station oriented data archives. The general nature of this API makes it applicable to a broad range of data systems. The overall goals of this project include providing consistent and rich query interfaces for end users and client programs, developing enabling technology that helps third-party repositories implement these web service capabilities, and enabling data queries across a collection of federated GSAC-WS enabled repositories. A fundamental challenge faced in this project is to provide a common suite of query services across a heterogeneous collection of data while enabling each repository to expose its specific metadata holdings. To address this challenge we are developing a "capabilities" based service where a repository can describe its specific query and metadata capabilities. Furthermore, the architecture of the GSL is based on a model-view paradigm that decouples the underlying data model semantics from particular representations of the data model. This will allow GSAC-WS enabled repositories to evolve their service offerings to incorporate new metadata definition formats (e.g., ISO-19115, FGDC, JSON, etc.) and new techniques for accessing their holdings. Building on the core GSAC-WS implementations, the project is also developing a federated/distributed query service. This service will seamlessly integrate with the GSAC Service Layer and will support data and metadata queries across a collection of federated GSAC repositories.
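
    A federated query of the kind described can be pictured as a simple fan-out-and-merge over several GSAC-style endpoints. The sketch below is a generic client illustration; the endpoint path, parameter names and JSON output format are hypothetical placeholders, not the actual GSAC-WS API.

        import json
        import urllib.parse
        import urllib.request

        def federated_site_search(repositories, bbox, path="/gsacws/gsacapi/site/search"):
            """Fan a site/station query out to several GSAC-style repositories and merge
            the results. The path and parameter names here are hypothetical."""
            params = urllib.parse.urlencode({
                "output": "site.json",
                "site.spatial.bbox": ",".join(str(v) for v in bbox),  # west,south,east,north
            })
            merged = []
            for base_url in repositories:
                with urllib.request.urlopen(f"{base_url}{path}?{params}") as response:
                    merged.extend(json.load(response))
            return merged

        # Hypothetical federation members; a real client would also deduplicate sites.
        # sites = federated_site_search(["https://example.org/repo1", "https://example.org/repo2"],
        #                               bbox=(-120.0, 32.0, -115.0, 36.0))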

  12. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  13. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  14. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  15. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  16. Radionuclide transport behavior in a generic geological radioactive waste repository.

    PubMed

    Bianchi, Marco; Liu, Hui-Hai; Birkholzer, Jens T

    2015-01-01

    We performed numerical simulations of groundwater flow and radionuclide transport to study the influence of several factors, including the ambient hydraulic gradient, groundwater pressure anomalies, and the properties of the excavation damaged zone (EDZ), on the prevailing transport mechanism (i.e., advection or molecular diffusion) in a generic nuclear waste repository within a clay-rich geological formation. By comparing simulation results, we show that the EDZ plays a major role as a preferential flowpath for radionuclide transport. When the EDZ is not taken into account, transport is dominated by molecular diffusion in almost the entire simulated domain, and transport velocity is about 40% slower. Modeling results also show that a reduction in hydraulic gradient leads to a greater predominance of diffusive transport, slowing radionuclide transport by about 30% with respect to a scenario assuming a unit gradient. In addition, inward flow caused by negative pressure anomalies in the clay-rich formation further reduces transport velocity, enhancing the ability of the geological barrier to contain the radioactive waste. On the other hand, local high gradients associated with positive pressure anomalies can speed up radionuclide transport with respect to steady-state flow systems having the same regional hydraulic gradients. Transport behavior was also found to be sensitive to both geometrical and hydrogeological parameters of the EDZ. Results from this work provide useful knowledge for correctly assessing the post-closure safety of a geological disposal system. © 2014, National Ground Water Association.
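
    A dimensionless Péclet number, Pe = vL/D, is a standard way to judge whether advection or molecular diffusion dominates over a length scale of interest. The sketch below applies it with placeholder parameter values to contrast intact clay with an EDZ pathway; it is an illustration of the concept, not the model used in the cited study.

    def peclet(velocity_m_per_s, length_m, diffusion_m2_per_s):
        # Pe > 1 suggests advection-dominated transport; Pe < 1 suggests diffusion.
        return velocity_m_per_s * length_m / diffusion_m2_per_s

    De = 1.0e-11        # effective diffusion coefficient in clay (m^2/s), assumed
    L = 10.0            # transport distance of interest (m), assumed
    v_intact = 1.0e-13  # seepage velocity in intact clay rock (m/s), assumed
    v_edz = 1.0e-10     # velocity along an EDZ preferential path (m/s), assumed

    for label, v in [("intact clay", v_intact), ("EDZ pathway", v_edz)]:
        pe = peclet(v, L, De)
        regime = "advection-dominated" if pe > 1 else "diffusion-dominated"
        print(f"{label}: Pe = {pe:.2g} ({regime})")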

  17. Accelerated Weathering of Fluidized Bed Steam Reformation Material Under Hydraulically Unsaturated Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierce, Eric M.

    2007-09-16

    To predict the long-term fate of low- and high-level waste forms in the subsurface over geologic time scales, it is important to understand the behavior of the corroding waste forms under conditions that mimic the open flow and transport properties of a subsurface repository. Fluidized bed steam reformation (FBSR), a supplemental treatment technology option, is being considered as a waste form for the immobilization of low-activity tank waste. To obtain the fundamental information needed to evaluate the behavior of the FBSR waste form under repository-relevant conditions and to monitor the long-term behavior of this material, an accelerated weathering experiment is being conducted with the pressurized unsaturated flow (PUF) apparatus. Unlike other accelerated weathering test methods (product consistency test, vapor hydration test, and drip test), PUF experiments are conducted under hydraulically unsaturated conditions. These experiments are unique because they mimic the vadose zone environment and allow the corroding waste form to achieve its final reaction state. Results from this ongoing experiment suggest the volumetric water content varied as a function of time and reached steady state after 160 days of testing. Unlike the volumetric water content, periodic excursions in the solution pH and electrical conductivity have been occurring consistently during the test. Release of elements from the column illustrates a general trend of decreasing concentration with increasing reaction time. Normalized concentrations of K, Na, P, Re (a chemical analogue for ⁹⁹Tc), and S are as much as 1 × 10⁴ times greater than those of Al, Cr, Si, and Ti. After more than 600 days of testing, the solution chemistry data collected to date illustrate the importance of understanding the long-term behavior of the FBSR product under conditions that mimic the open flow and transport properties of a subsurface repository.
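
    For context, normalized concentrations of this kind are commonly obtained by dividing each element's effluent concentration by its mass fraction in the waste form, so that releases of major and trace elements can be compared on a common basis. The sketch below shows that generic normalization with invented numbers; it is not data or code from the PUF experiment.

    # Divide each measured effluent concentration by the element's mass fraction in
    # the solid so releases can be compared on a common basis. All values below are
    # invented placeholders.
    mass_fraction = {"Na": 0.15, "Re": 1.0e-4, "Al": 0.10, "Si": 0.25}   # g element / g solid, assumed
    effluent_mg_L = {"Na": 120.0, "Re": 0.08, "Al": 0.9, "Si": 2.5}      # mg/L, assumed

    normalized = {el: effluent_mg_L[el] / mass_fraction[el] for el in mass_fraction}
    for el, value in sorted(normalized.items(), key=lambda kv: -kv[1]):
        print(f"{el}: normalized concentration = {value:.3g} mg/L")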

  18. Health professional learner attitudes and use of digital learning resources.

    PubMed

    Maloney, Stephen; Chamberlain, Michael; Morrison, Shane; Kotsanas, George; Keating, Jennifer L; Ilic, Dragan

    2013-01-16

    Web-based digital repositories allow educational resources to be accessed efficiently and conveniently from diverse geographic locations, hold a variety of resource formats, enable interactive learning, and facilitate targeted access for the user. Unlike some other learning management systems (LMS), resources can be retrieved through search engines and meta-tagged labels, and content can be streamed, which is particularly useful for multimedia resources. The aim of this study was to examine usage and user experiences of an online learning repository (Physeek) in a population of physiotherapy students. The secondary aim of this project was to examine how students prefer to access resources and which resources they find most helpful. The following data were examined using an audit of the repository server: (1) number of online resources accessed per day in 2010, (2) number of each type of resource accessed, (3) number of resources accessed during business hours (9 am to 5 pm) and outside business hours (years 1-4), (4) session length of each log-on (years 1-4), and (5) video quality (bit rate) of each video accessed. An online questionnaire and 3 focus groups assessed student feedback and self-reported experiences of Physeek. Students preferred the support provided by Physeek to other sources of educational material primarily because of its efficiency. Peak usage commonly occurred at times of increased academic need (ie, examination times). Students perceived online repositories as a potential tool to support lifelong learning and health care delivery. The results of this study indicate that today's health professional students welcome the benefits of online learning resources because of their convenience and usability. This represents a transition away from traditional learning styles and toward technological learning support and may indicate a growing link between social immersions in Internet-based connections and learning styles. The true potential for Web-based resources to support student learning is as yet unknown.

  19. Health Professional Learner Attitudes and Use of Digital Learning Resources

    PubMed Central

    Chamberlain, Michael; Morrison, Shane; Kotsanas, George; Keating, Jennifer L; Ilic, Dragan

    2013-01-01

    Background Web-based digital repositories allow educational resources to be accessed efficiently and conveniently from diverse geographic locations, hold a variety of resource formats, enable interactive learning, and facilitate targeted access for the user. Unlike some other learning management systems (LMS), resources can be retrieved through search engines and meta-tagged labels, and content can be streamed, which is particularly useful for multimedia resources. Objective The aim of this study was to examine usage and user experiences of an online learning repository (Physeek) in a population of physiotherapy students. The secondary aim of this project was to examine how students prefer to access resources and which resources they find most helpful. Methods The following data were examined using an audit of the repository server: (1) number of online resources accessed per day in 2010, (2) number of each type of resource accessed, (3) number of resources accessed during business hours (9 am to 5 pm) and outside business hours (years 1-4), (4) session length of each log-on (years 1-4), and (5) video quality (bit rate) of each video accessed. An online questionnaire and 3 focus groups assessed student feedback and self-reported experiences of Physeek. Results Students preferred the support provided by Physeek to other sources of educational material primarily because of its efficiency. Peak usage commonly occurred at times of increased academic need (ie, examination times). Students perceived online repositories as a potential tool to support lifelong learning and health care delivery. Conclusions The results of this study indicate that today’s health professional students welcome the benefits of online learning resources because of their convenience and usability. This represents a transition away from traditional learning styles and toward technological learning support and may indicate a growing link between social immersions in Internet-based connections and learning styles. The true potential for Web-based resources to support student learning is as yet unknown. PMID:23324800

  20. Laboratory E-Notebooks: A Learning Object-Based Repository

    ERIC Educational Resources Information Center

    Abari, Ilior; Pierre, Samuel; Saliah-Hassane, Hamadou

    2006-01-01

    During distributed virtual laboratory experiment sessions, a major problem is to be able to collect, store, manage and share heterogeneous data (intermediate results, analysis, annotations, etc) manipulated simultaneously by geographically distributed teammates composing a virtual team. The electronic notebook is a possible response to this…

  1. Combining partially ranked data in plant breeding and biology: II. Analysis with Rasch model.

    USDA-ARS?s Scientific Manuscript database

    Many years of breeding experiments, germplasm screening, and molecular biologic experimentation have generated volumes of sequence, genotype, and phenotype information that have been stored in public data repositories. These resources afford genetic and genomic researchers the opportunity to handle ...

  2. Fast and Accurate Metadata Authoring Using Ontology-Based Recommendations.

    PubMed

    Martínez-Romero, Marcos; O'Connor, Martin J; Shankar, Ravi D; Panahiazar, Maryam; Willrett, Debra; Egyedi, Attila L; Gevaert, Olivier; Graybeal, John; Musen, Mark A

    2017-01-01

    In biomedicine, high-quality metadata are crucial for finding experimental datasets, for understanding how experiments were performed, and for reproducing those experiments. Despite the recent focus on metadata, the quality of metadata available in public repositories continues to be extremely poor. A key difficulty is that the typical metadata acquisition process is time-consuming and error prone, with weak or nonexistent support for linking metadata to ontologies. There is a pressing need for methods and tools to speed up the metadata acquisition process and to increase the quality of metadata that are entered. In this paper, we describe a methodology and set of associated tools that we developed to address this challenge. A core component of this approach is a value recommendation framework that uses analysis of previously entered metadata and ontology-based metadata specifications to help users rapidly and accurately enter their metadata. We performed an initial evaluation of this approach using metadata from a public metadata repository.
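
    As a simplified illustration of value recommendation, the sketch below tallies previously entered metadata and suggests the most frequent values for a field, optionally filtered by a typed prefix. It is a minimal stand-in for the described framework, which additionally draws on ontology-based metadata specifications that are omitted here.

    from collections import Counter, defaultdict

    class ValueRecommender:
        def __init__(self):
            self._counts = defaultdict(Counter)  # field name -> Counter of observed values

        def index(self, records):
            # Learn from previously entered metadata records (list of dicts).
            for record in records:
                for field, value in record.items():
                    self._counts[field][value] += 1

        def recommend(self, field, prefix="", k=3):
            # Return up to k most frequent values for a field matching the typed prefix.
            candidates = self._counts[field].most_common()
            matches = [v for v, _ in candidates if v.lower().startswith(prefix.lower())]
            return matches[:k]

    recommender = ValueRecommender()
    recommender.index([
        {"organism": "Homo sapiens", "tissue": "liver"},
        {"organism": "Homo sapiens", "tissue": "lung"},
        {"organism": "Mus musculus", "tissue": "liver"},
    ])
    print(recommender.recommend("organism", prefix="ho"))  # ['Homo sapiens']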

  3. Fast and Accurate Metadata Authoring Using Ontology-Based Recommendations

    PubMed Central

    Martínez-Romero, Marcos; O’Connor, Martin J.; Shankar, Ravi D.; Panahiazar, Maryam; Willrett, Debra; Egyedi, Attila L.; Gevaert, Olivier; Graybeal, John; Musen, Mark A.

    2017-01-01

    In biomedicine, high-quality metadata are crucial for finding experimental datasets, for understanding how experiments were performed, and for reproducing those experiments. Despite the recent focus on metadata, the quality of metadata available in public repositories continues to be extremely poor. A key difficulty is that the typical metadata acquisition process is time-consuming and error prone, with weak or nonexistent support for linking metadata to ontologies. There is a pressing need for methods and tools to speed up the metadata acquisition process and to increase the quality of metadata that are entered. In this paper, we describe a methodology and set of associated tools that we developed to address this challenge. A core component of this approach is a value recommendation framework that uses analysis of previously entered metadata and ontology-based metadata specifications to help users rapidly and accurately enter their metadata. We performed an initial evaluation of this approach using metadata from a public metadata repository. PMID:29854196

  4. Experiments and Modeling to Support Field Test Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Peter Jacob; Bourret, Suzanne Michelle; Zyvoloski, George Anthony

    Disposition of heat-generating nuclear waste (HGNW) remains a continuing technical and sociopolitical challenge. We define HGNW as the combination of both heat generating defense high level waste (DHLW) and civilian spent nuclear fuel (SNF). Numerous concepts for HGNW management have been proposed and examined internationally, including an extensive focus on geologic disposal (c.f. Brunnengräber et al., 2013). One type of proposed geologic material is salt, so chosen because of its viscoplastic deformation that causes self-repair of damage or deformation induced in the salt by waste emplacement activities (Hansen and Leigh, 2011). Salt as a repository material has been tested atmore » several sites around the world, notably the Morsleben facility in Germany (c.f. Fahland and Heusermann, 2013; Wollrath et al., 2014; Fahland et al., 2015) and at the Waste Isolation Pilot Plant (WIPP) near Carlsbad, NM. Evaluating the technical feasibility of a HGNW repository in salt is an ongoing process involving experiments and numerical modeling of many processes at many facilities.« less

  5. NCBI GEO: archive for high-throughput functional genomic data.

    PubMed

    Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Marshall, Kimberly A; Phillippy, Katherine H; Sherman, Patti M; Muertter, Rolf N; Edgar, Ron

    2009-01-01

    The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest public repository for high-throughput gene expression data. Additionally, GEO hosts other categories of high-throughput functional genomic data, including those that examine genome copy number variations, chromatin structure, methylation status and transcription factor binding. These data are generated by the research community using high-throughput technologies like microarrays and, more recently, next-generation sequencing. The database has a flexible infrastructure that can capture fully annotated raw and processed data, enabling compliance with major community-derived scientific reporting standards such as 'Minimum Information About a Microarray Experiment' (MIAME). In addition to serving as a centralized data storage hub, GEO offers many tools and features that allow users to effectively explore, analyze and download expression data from both gene-centric and experiment-centric perspectives. This article summarizes the GEO repository structure, content and operating procedures, as well as recently introduced data mining features. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/.
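
    As an aside, GEO records can also be queried programmatically through NCBI Entrez. The sketch below uses Biopython's Entrez module against the GEO DataSets database ("gds"); it is a generic illustration rather than code from the article, and it assumes Biopython is installed and that a contact e-mail is supplied, as NCBI requests.

    from Bio import Entrez

    Entrez.email = "you@example.org"  # placeholder; set your own address

    # Search the GEO DataSets database for records matching a free-text term.
    handle = Entrez.esearch(db="gds", term="breast cancer", retmax=5)
    result = Entrez.read(handle)
    handle.close()

    print("hits:", result["Count"])
    for uid in result["IdList"]:
        summary = Entrez.read(Entrez.esummary(db="gds", id=uid))[0]
        print(summary.get("Accession"), "-", summary.get("title"))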

  6. 40 CFR 124.33 - Information repository.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Information repository. 124.33 Section... FOR DECISIONMAKING Specific Procedures Applicable to RCRA Permits § 124.33 Information repository. (a... basis, for an information repository. When assessing the need for an information repository, the...

  7. 10 CFR 60.130 - General considerations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository Operations Area § 60.130 General... for a high-level radioactive waste repository at a geologic repository operations area, and an... geologic repository operations area, must include the principal design criteria for a proposed facility...

  8. Practical management of heterogeneous neuroimaging metadata by global neuroimaging data repositories

    PubMed Central

    Neu, Scott C.; Crawford, Karen L.; Toga, Arthur W.

    2012-01-01

    Rapidly evolving neuroimaging techniques are producing unprecedented quantities of digital data at the same time that many research studies are evolving into global, multi-disciplinary collaborations between geographically distributed scientists. While networked computers have made it almost trivial to transmit data across long distances, collecting and analyzing this data requires extensive metadata if the data is to be maximally shared. Though it is typically straightforward to encode text and numerical values into files and send content between different locations, it is often difficult to attach context and implicit assumptions to the content. As the number of and geographic separation between data contributors grows to national and global scales, the heterogeneity of the collected metadata increases and conformance to a single standardization becomes implausible. Neuroimaging data repositories must then not only accumulate data but must also consolidate disparate metadata into an integrated view. In this article, using specific examples from our experiences, we demonstrate how standardization alone cannot achieve full integration of neuroimaging data from multiple heterogeneous sources and why a fundamental change in the architecture of neuroimaging data repositories is needed instead. PMID:22470336

  9. Practical management of heterogeneous neuroimaging metadata by global neuroimaging data repositories.

    PubMed

    Neu, Scott C; Crawford, Karen L; Toga, Arthur W

    2012-01-01

    Rapidly evolving neuroimaging techniques are producing unprecedented quantities of digital data at the same time that many research studies are evolving into global, multi-disciplinary collaborations between geographically distributed scientists. While networked computers have made it almost trivial to transmit data across long distances, collecting and analyzing this data requires extensive metadata if the data is to be maximally shared. Though it is typically straightforward to encode text and numerical values into files and send content between different locations, it is often difficult to attach context and implicit assumptions to the content. As the number of and geographic separation between data contributors grows to national and global scales, the heterogeneity of the collected metadata increases and conformance to a single standardization becomes implausible. Neuroimaging data repositories must then not only accumulate data but must also consolidate disparate metadata into an integrated view. In this article, using specific examples from our experiences, we demonstrate how standardization alone cannot achieve full integration of neuroimaging data from multiple heterogeneous sources and why a fundamental change in the architecture of neuroimaging data repositories is needed instead.

  10. MODELING OF THE GROUNDWATER TRANSPORT AROUND A DEEP BOREHOLE NUCLEAR WASTE REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N. Lubchenko; M. Rodríguez-Buño; E.A. Bates

    2015-04-01

    The concept of disposal of high-level nuclear waste in deep boreholes drilled into crystalline bedrock is gaining renewed interest and consideration as a viable mined repository alternative. A large amount of work on conceptual borehole design and preliminary performance assessment has been performed by researchers at MIT, Sandia National Laboratories, SKB (Sweden), and others. Much of this work relied on analytical derivations or, in a few cases, on weakly coupled models of heat, water, and radionuclide transport in the rock. Detailed numerical models are necessary to account for the large heterogeneity of properties (e.g., permeability and salinity vs. depth, diffusion coefficients, etc.) that would be observed at potential borehole disposal sites. A derivation of the FALCON code (Fracturing And Liquid CONvection) was used for the thermal-hydrologic modeling. This code solves the transport equations in porous media in a fully coupled way. The application leverages the flexibility and strengths of the MOOSE framework, developed by Idaho National Laboratory. The current version simulates heat, fluid, and chemical species transport in a fully coupled way, allowing the rigorous evaluation of candidate repository site performance. This paper mostly focuses on the modeling of a deep borehole repository under realistic conditions, including modeling of a finite array of boreholes surrounded by undisturbed rock. The decay heat generated by the canisters diffuses into the host rock. Water heating can potentially lead to convection on the scale of thousands of years after the emplacement of the fuel. This convection is tightly coupled to the transport of the dissolved salt, which can suppress convection and reduce the release of the radioactive materials to the aquifer. The purpose of this work has been to evaluate the importance of the borehole array spacing and find the conditions under which convective transport can be ruled out as a radionuclide transport mechanism. Preliminary results show that modeling of the borehole array, including the surrounding rock, predicts convective flow in the system with physical velocities of the order of 10⁻⁵ km/yr over 10⁵ years. This results in an escape length on the order of kilometers, which is comparable to the repository depth. However, a correct account of the salinity effects reduces the convection velocity and the escape length of the radionuclides from the repository.
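
    The quoted scales are mutually consistent, as the short check below shows: a convective velocity of about 10⁻⁵ km/yr sustained over 10⁵ years gives an escape length of roughly 1 km, comparable to the repository depth.

    # Back-of-the-envelope check of the scales quoted in the abstract.
    velocity_km_per_yr = 1.0e-5
    duration_yr = 1.0e5
    escape_length_km = velocity_km_per_yr * duration_yr
    print(f"escape length ≈ {escape_length_km:g} km")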

  11. Multiscale Model Simulations of Temperature and Relative Humidity for the License Application of the Proposed Yucca Mountain Repository

    NASA Astrophysics Data System (ADS)

    Buscheck, T.; Glascoe, L.; Sun, Y.; Gansemer, J.; Lee, K.

    2003-12-01

    For the proposed Yucca Mountain geologic repository for high-level nuclear waste, the planned method of disposal involves the emplacement of cylindrical packages containing the waste inside horizontal tunnels, called emplacement drifts, bored several hundred meters below the ground surface. The emplacement drifts reside in highly fractured, partially saturated volcanic tuff. An important phenomenological consideration for the licensing of the proposed repository at Yucca Mountain is the generation of decay heat by the emplaced waste and the consequences of this decay heat. Changes in temperature will affect the hydrologic and chemical environment at Yucca Mountain. A thermohydrologic-modeling tool is necessary to support the performance assessment of the Engineered Barrier System (EBS) of the proposed repository. This modeling tool must simultaneously account for processes occurring at a scale of a few tens of centimeters around individual waste packages, for processes occurring around the emplacement drifts themselves, and for processes occurring at the multi-kilometer scale of the mountain. Additionally, many other features must be considered including non-isothermal, multiphase-flow in fractured porous rock of variable liquid-phase saturation and thermal radiation and convection in open cavities. The Multiscale Thermohydrologic Model (MSTHM) calculates the following thermohydrologic (TH) variables: temperature, relative humidity, liquid-phase saturation, evaporation rate, air-mass fraction, gas-phase pressure, capillary pressure, and liquid- and gas-phase fluxes. The TH variables are determined as a function of position along each of the emplacement drifts in the repository and as a function of waste-package (WP) type. These variables are determined at various generic locations within the emplacement drifts, including the waste package and drip-shield surfaces and in the invert; they are also determined at various generic locations in the adjoining host rock; these variables are determined every 20 m for each emplacement drift in the repository. The MSTHM accounts for 3-D drift-scale and mountain-scale heat flow and captures the influence of the key engineering-design variables and natural-system factors affecting TH conditions in the emplacement drifts and adjoining host rock. Presented is a synopsis of recent MSTHM calculations conducted to support the Total System Performance Assessment for the License Application (TSPA-LA). This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  12. 48 CFR 227.7108 - Contractor data repositories.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Technical Data 227.7108 Contractor data repositories. (a) Contractor data repositories may be established... procedures for protecting technical data delivered to or stored at the repository from unauthorized release... disclosure of technical data from the repository to third parties consistent with the Government's rights in...

  13. Determination of Uncertainties for +III and +IV Actinide Solubilities in the WIPP Geochemistry Model for the 2009 Compliance Recertification Application

    NASA Astrophysics Data System (ADS)

    Ismail, A. E.; Xiong, Y.; Nowak, E. J.; Brush, L. H.

    2009-12-01

    The Waste Isolation Pilot Plant (WIPP) is a U.S. Department of Energy (DOE) repository in southeast New Mexico for defense-related transuranic (TRU) waste. Every five years, the DOE is required to submit an application to the Environmental Protection Agency (EPA) demonstrating the WIPP’s continuing compliance with the applicable EPA regulations governing the repository. Part of this recertification effort involves a performance assessment—a probabilistic evaluation of the repository performance with respect to regulatory limits on the amount of releases from the repository to the accessible environment. One of the models used as part of the performance assessment process is a geochemistry model, which predicts solubilities of the radionuclides in the brines that may enter the repository in the different scenarios considered by the performance assessment. The dissolved actinide source term comprises actinide solubilities, which are input parameters for modeling the transport of radionuclides as a result of brine flow through and from the repository. During a performance assessment, the solubilities are modeled as the product of a “base” solubility determined from calculations based on the chemical conditions expected in the repository, and an uncertainty factor that describes the potential deviations of the model from expected behavior. We will focus here on a discussion of the uncertainties. To compute a cumulative distribution function (CDF) for the uncertainties, we compare published, experimentally measured solubility data to predictions made using the established WIPP geochemistry model. The differences between the solubilities observed for a given experiment and the calculated solubilities from the model are used to form the overall CDF, which is then sampled as part of the performance assessment. We will discuss the methodology used to update the CDFs for the +III actinides, obtained from data for Nd, Am, and Cm, and the +IV actinides, obtained from data for Th, and present results for the calculations of the updated CDFs. We compare the CDFs to the distributions computed for the previous recertification, and discuss the potential impact of the changes on the geochemistry model. This research is funded by WIPP programs administered by the U.S. Department of Energy. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.
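
    The sketch below illustrates, with synthetic numbers rather than WIPP data, how an empirical CDF of uncertainty factors might be built from log10 ratios of observed to predicted solubilities and then sampled during a performance assessment.

    import numpy as np

    # Synthetic log10(observed/predicted) solubility differences, placeholders only.
    log10_ratio = np.array([-0.8, -0.3, -0.1, 0.0, 0.2, 0.4, 0.9])
    samples = np.sort(log10_ratio)
    probs = np.arange(1, len(samples) + 1) / len(samples)   # empirical CDF ordinates

    def sample_uncertainty(n, rng=np.random.default_rng(0)):
        # Draw multiplicative uncertainty factors by inverting the empirical CDF.
        u = rng.uniform(0.0, 1.0, size=n)
        return 10.0 ** np.interp(u, probs, samples)

    print(sample_uncertainty(5))  # factors applied to the "base" solubility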

  14. Modeling and Simulation Resource Repository (MSRR) (System Engineering/Integrated M&S Management Approach)

    NASA Technical Reports Server (NTRS)

    Milroy, Audrey; Hale, Joe

    2006-01-01

    NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers information on the model's fidelity, credibility, and quality, including the verification, validation and accreditation information. The NASA MSRR will be implemented leveraging M&S industry best practices. This presentation will discuss the requirements that will enable NASA to capture and make available the "meta data" or "simulation biography" data associated with a model. The presentation will also describe the requirements that drive how NASA will collect and document relevant information for models or suites of models in order to facilitate use and reuse of relevant models and provide visibility across NASA organizations and the larger M&S community.

  15. Design and application of a data-independent precursor and product ion repository.

    PubMed

    Thalassinos, Konstantinos; Vissers, Johannes P C; Tenzer, Stefan; Levin, Yishai; Thompson, J Will; Daniel, David; Mann, Darrin; DeLong, Mark R; Moseley, M Arthur; America, Antoine H; Ottens, Andrew K; Cavey, Greg S; Efstathiou, Georgios; Scrivens, James H; Langridge, James I; Geromanos, Scott J

    2012-10-01

    The functional design and application of a data-independent LC-MS precursor and product ion repository for protein identification, quantification, and validation is conceptually described. The ion repository was constructed from the sequence search results of a broad range of discovery experiments investigating various tissue types of two closely related mammalian species. The relative high degree of similarity in protein complement, ion detection, and peptide and protein identification allows for the analysis of normalized precursor and product ion intensity values, as well as standardized retention times, creating a multidimensional/orthogonal queryable, qualitative, and quantitative space. Peptide ion map selection for identification and quantification is primarily based on replication and limited variation. The information is stored in a relational database and is used to create peptide- and protein-specific fragment ion maps that can be queried in a targeted fashion against the raw or time aligned ion detections. These queries can be conducted either individually or as groups, where the latter affords pathway and molecular machinery analysis of the protein complement. The presented results also suggest that peptide ionization and fragmentation efficiencies are highly conserved between experiments and practically independent of the analyzed biological sample when using similar instrumentation. Moreover, the data illustrate only minor variation in ionization efficiency with amino acid sequence substitutions occurring between species. Finally, the data and the presented results illustrate how LC-MS performance metrics can be extracted and utilized to ensure optimal performance of the employed analytical workflows.

  16. OncoSimulR: genetic simulation with arbitrary epistasis and mutator genes in asexual populations.

    PubMed

    Diaz-Uriarte, Ramon

    2017-06-15

    OncoSimulR implements forward-time genetic simulations of biallelic loci in asexual populations with a special focus on cancer progression. Fitness can be defined as an arbitrary function of genetic interactions between multiple genes or modules of genes, including epistasis, restrictions in the order of accumulation of mutations, and order effects. Mutation rates can differ among genes, and can be affected by (anti)mutator genes. Also available are sampling from simulations (including single-cell sampling), plotting the genealogical relationships of clones, and generating and plotting fitness landscapes. Implemented in R and C++, freely available from Bioconductor for Linux, Mac and Windows under the GNU GPL license. Version 2.5.9 or higher available from: http://www.bioconductor.org/packages/devel/bioc/html/OncoSimulR.html . GitHub repository at: https://github.com/rdiaz02/OncoSimul. ramon.diaz@iib.uam.es. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.

  17. 17 CFR 49.12 - Swap data repository recordkeeping requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 17 Commodity and Securities Exchanges 1 2012-04-01 2012-04-01 false Swap data repository... COMMISSION SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of part...

  18. 17 CFR 49.12 - Swap data repository recordkeeping requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 17 Commodity and Securities Exchanges 1 2013-04-01 2013-04-01 false Swap data repository... COMMISSION SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of part...

  19. 17 CFR 49.12 - Swap data repository recordkeeping requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 17 Commodity and Securities Exchanges 2 2014-04-01 2014-04-01 false Swap data repository... COMMISSION (CONTINUED) SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of...

  20. 10 CFR 63.112 - Requirements for preclosure safety analysis of the geologic repository operations area.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... geologic repository operations area. 63.112 Section 63.112 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Technical... repository operations area. The preclosure safety analysis of the geologic repository operations area must...

  1. Managing and Evaluating Digital Repositories

    ERIC Educational Resources Information Center

    Zuccala, Alesia; Oppenheim, Charles; Dhiensa, Rajveen

    2008-01-01

    Introduction: We examine the role of the digital repository manager, discuss the future of repository management and evaluation and suggest that library and information science schools develop new repository management curricula. Method: Face-to-face interviews were carried out with managers of five different types of repositories and a Web-based…

  2. COMODI: an ontology to characterise differences in versions of computational models in biology.

    PubMed

    Scharm, Martin; Waltemath, Dagmar; Mendes, Pedro; Wolkenhauer, Olaf

    2016-07-11

    Open model repositories provide ready-to-reuse computational models of biological systems. Models within those repositories evolve over time, leading to different model versions. Taken together, the underlying changes reflect a model's provenance and thus can give valuable insights into the studied biology. Currently, however, changes cannot be semantically interpreted. To improve this situation, we developed an ontology of terms describing changes in models. The ontology can be used by scientists and within software to characterise model updates at the level of single changes. When studying or reusing a model, these annotations help with determining the relevance of a change in a given context. We manually studied changes in selected models from BioModels and the Physiome Model Repository. Using the BiVeS tool for difference detection, we then performed an automatic analysis of changes in all models published in these repositories. The resulting set of concepts led us to define candidate terms for the ontology. In a final step, we aggregated and classified these terms and built the first version of the ontology. We present COMODI, an ontology needed because COmputational MOdels DIffer. It empowers users and software to describe changes in a model on the semantic level. COMODI also enables software to implement user-specific filter options for the display of model changes. Finally, COMODI is a step towards predicting how a change in a model influences the simulation results. COMODI, coupled with our algorithm for difference detection, ensures the transparency of a model's evolution, and it enhances the traceability of updates and error corrections. COMODI is encoded in OWL. It is openly available at http://comodi.sems.uni-rostock.de/ .

  3. Extensible Probabilistic Repository Technology (XPRT)

    DTIC Science & Technology

    2004-10-01

    projects, such as Centaurus and the Evidence Data Base (EDB), etc.; others were fabricated, such as INS and FED, while others contain data from the open... [record excerpt includes a listing of simulated data sources: Google Web Report (unlimited, SOAP API); BBC News (unlimited, Web, RSS 1.0); Centaurus Person Demographics (204,402 people from 240 countries)] ...objects of the domain ontology map to the various simulated data-sources. For example, the PersonDemographics are stored in the Centaurus database, while

  4. Long-term geochemical evolution of the near field repository: Insights from reactive transport modelling and experimental evidences

    NASA Astrophysics Data System (ADS)

    Arcos, David; Grandia, Fidel; Domènech, Cristina; Fernández, Ana M.; Villar, María V.; Muurinen, Arto; Carlsson, Torbjörn; Sellin, Patrik; Hernán, Pedro

    2008-12-01

    The KBS-3 underground nuclear waste repository concept designed by the Swedish Nuclear Fuel and Waste Management Co. (SKB) includes a bentonite buffer barrier surrounding the copper canisters and the iron insert where spent nuclear fuel will be placed. Bentonite is also part of the backfill material used to seal the access and deposition tunnels of the repository. The bentonite barrier has three main safety functions: to ensure the physical stability of the canister, to retard the intrusion of groundwater to the canisters, and in case of canister failure, to retard the migration of radionuclides to the geosphere. Laboratory experiments (< 10 years long) have provided evidence of the control exerted by accessory minerals and clay surfaces on the pore water chemistry. The evolution of the pore water chemistry will be a primordial factor on the long-term stability of the bentonite barrier, which is a key issue in the safety assessments of the KBS-3 concept. In this work we aim to study the long-term geochemical evolution of bentonite and its pore water in the evolving geochemical environment due to climate change. In order to do this, reactive transport simulations are used to predict the interaction between groundwater and bentonite which is simulated following two different pathways: (1) groundwater flow through the backfill in the deposition tunnels, eventually reaching the top of the deposition hole, and (2) direct connection between groundwater and bentonite rings through fractures in the granite crosscutting the deposition hole. The influence of changes in climate has been tested using three different waters interacting with the bentonite: present-day groundwater, water derived from ice melting, and deep-seated brine. Two commercial bentonites have been considered as buffer material, MX-80 and Deponit CA-N, and one natural clay (Friedland type) for the backfill. They show differences in the composition of the exchangeable cations and in the accessory mineral content. Results from the simulations indicate that pore water chemistry is controlled by the equilibrium with the accessory minerals, especially carbonates. pH is buffered by precipitation/dissolution of calcite and dolomite, when present. The equilibrium of these minerals is deeply influenced by gypsum dissolution and cation exchange reactions in the smectite interlayer. If carbonate minerals are initially absent in bentonite, pH is then controlled by surface acidity reactions in the hydroxyl groups at the edge sites of the clay fraction, although its buffering capacity is not as strong as the equilibrium with carbonate minerals. The redox capacity of the bentonite pore water system is mainly controlled by Fe(II)-bearing minerals (pyrite and siderite). Changes in the groundwater composition lead to variations in the cation exchange occupancy, and dissolution-precipitation of carbonate minerals and gypsum. The most significant changes in the evolution of the system are predicted when ice-melting water, which is highly diluted and alkaline, enters into the system. In this case, the dissolution of carbonate minerals is enhanced, increasing pH in the bentonite pore water. Moreover, a rapid change in the population of exchange sites in the smectite is expected due to the replacement of Na for Ca.

  5. Long-term geochemical evolution of the near field repository: insights from reactive transport modelling and experimental evidences.

    PubMed

    Arcos, David; Grandia, Fidel; Domènech, Cristina; Fernández, Ana M; Villar, María V; Muurinen, Arto; Carlsson, Torbjörn; Sellin, Patrik; Hernán, Pedro

    2008-12-12

    The KBS-3 underground nuclear waste repository concept designed by the Swedish Nuclear Fuel and Waste Management Co. (SKB) includes a bentonite buffer barrier surrounding the copper canisters and the iron insert where spent nuclear fuel will be placed. Bentonite is also part of the backfill material used to seal the access and deposition tunnels of the repository. The bentonite barrier has three main safety functions: to ensure the physical stability of the canister, to retard the intrusion of groundwater to the canisters, and in case of canister failure, to retard the migration of radionuclides to the geosphere. Laboratory experiments (< 10 years long) have provided evidence of the control exerted by accessory minerals and clay surfaces on the pore water chemistry. The evolution of the pore water chemistry will be a primordial factor on the long-term stability of the bentonite barrier, which is a key issue in the safety assessments of the KBS-3 concept. In this work we aim to study the long-term geochemical evolution of bentonite and its pore water in the evolving geochemical environment due to climate change. In order to do this, reactive transport simulations are used to predict the interaction between groundwater and bentonite which is simulated following two different pathways: (1) groundwater flow through the backfill in the deposition tunnels, eventually reaching the top of the deposition hole, and (2) direct connection between groundwater and bentonite rings through fractures in the granite crosscutting the deposition hole. The influence of changes in climate has been tested using three different waters interacting with the bentonite: present-day groundwater, water derived from ice melting, and deep-seated brine. Two commercial bentonites have been considered as buffer material, MX-80 and Deponit CA-N, and one natural clay (Friedland type) for the backfill. They show differences in the composition of the exchangeable cations and in the accessory mineral content. Results from the simulations indicate that pore water chemistry is controlled by the equilibrium with the accessory minerals, especially carbonates. pH is buffered by precipitation/dissolution of calcite and dolomite, when present. The equilibrium of these minerals is deeply influenced by gypsum dissolution and cation exchange reactions in the smectite interlayer. If carbonate minerals are initially absent in bentonite, pH is then controlled by surface acidity reactions in the hydroxyl groups at the edge sites of the clay fraction, although its buffering capacity is not as strong as the equilibrium with carbonate minerals. The redox capacity of the bentonite pore water system is mainly controlled by Fe(II)-bearing minerals (pyrite and siderite). Changes in the groundwater composition lead to variations in the cation exchange occupancy, and dissolution-precipitation of carbonate minerals and gypsum. The most significant changes in the evolution of the system are predicted when ice-melting water, which is highly diluted and alkaline, enters into the system. In this case, the dissolution of carbonate minerals is enhanced, increasing pH in the bentonite pore water. Moreover, a rapid change in the population of exchange sites in the smectite is expected due to the replacement of Na for Ca.

  6. Virtual patient repositories--a comparative analysis.

    PubMed

    Küfner, Julia; Kononowicz, Andrzej A; Hege, Inga

    2014-01-01

    Virtual Patients (VPs) are an important component of medical education. One way to reduce the costs for creating VPs is sharing through repositories. We conducted a literature review to identify existing repositories and analyzed the 17 included repositories in regards to the search functions and metadata they provide. Most repositories provided some metadata such as title or description, whereas other data, such as educational objectives, were less frequent. Future research could, in cooperation with the repository provider, investigate user expectations and usage patterns.

  7. The Environmental Data Initiative data repository: Trustworthy practices that foster preservation, fitness, and reuse for environmental and ecological data

    NASA Astrophysics Data System (ADS)

    Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.

    2017-12-01

    The Environmental Data Initiative (EDI) is an outgrowth of more than 30 years of information management experience and technology from LTER Network data practitioners. EDI builds upon the PASTA data repository software used by the LTER Network Information System and manages more than 42,000 data packages, containing tabular data, imagery, and other formats. Development of the repository was a community process beginning in 2009 that included numerous working groups for generating use cases, system requirements, and testing of completed software, thereby creating a vested interest in its success and transparency in design. All software is available for review on GitHub, and refinements and new features are ongoing. Documentation is also available on Read-the-docs, including a comprehensive description of all web-service API methods. PASTA is metadata driven and uses the Ecological Metadata Language (EML) standard for describing environmental and ecological data; a simplified Dublin Core document is also available for each data package. Data are aggregated into packages consisting of metadata and other related content described by an OAI-ORE document. Once archived, each data package becomes immutable and permanent; updates are possible through the addition of new revisions. Components of each data package are accessible through a unique identifier, while the entire data package receives a DOI that is registered in DataCite. Preservation occurs through a combination of DataONE synchronization/replication and a series of local and remote backup strategies, including daily uploads to AWS Glacier storage. Checksums are computed for all data at initial upload, with random verification occurring on a continuous basis, thus ensuring the integrity of data. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML before data are archived; data packages that fail any test are not accepted into the repository. These tests are a measure of data fitness, which ultimately increases confidence in data reuse and synthesis. The EDI data repository is recognized by multiple organizations, including EarthCube's Council of Data Facilities, the United States Geological Survey, FAIRsharing.org, and re3data.org, and it is a PLOS- and Nature-recommended data repository.
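
    The fixity checking described above can be illustrated with a short sketch that computes a file digest at upload time and re-verifies it later. SHA-256 is used here only as an example; the record does not state which checksum algorithm PASTA employs.

    import hashlib

    def file_checksum(path, chunk_size=1 << 20):
        # Stream the file in 1 MiB chunks so large data packages fit in memory.
        digest = hashlib.sha256()
        with open(path, "rb") as fh:
            for chunk in iter(lambda: fh.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_fixity(path, recorded_checksum):
        # True if the file still matches the checksum recorded at initial upload.
        return file_checksum(path) == recorded_checksum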

  8. The SeaView EarthCube project: Lessons Learned from Integrating Across Repositories

    NASA Astrophysics Data System (ADS)

    Diggs, S. C.; Stocks, K. I.; Arko, R. A.; Kinkade, D.; Shepherd, A.; Olson, C. J.; Pham, A.

    2017-12-01

    SeaView is an NSF-funded EarthCube Integrative Activity Project working with 5 existing data repositories* to provide oceanographers with highly integrated thematic data collections in user-requested formats. The project has three complementary goals: Supporting Scientists: SeaView targets scientists' need for easy access to data of interest that are ready to import into their preferred tool. Strengthening Repositories: By integrating data from multiple repositories for science use, SeaView is helping the ocean data repositories align their data and processes and make ocean data more accessible and easily integrated. Informing EarthCube (earthcube.org): SeaView's experience as an integration demonstration can inform the larger NSF EarthCube architecture and design effort. The challenges faced in this small-scale effort are informative to geosciences cyberinfrastructure more generally. Here we focus on the lessons learned that may inform other data facilities and integrative architecture projects. (The SeaView data collections will be presented at the Ocean Sciences 2018 meeting.) One example is the importance of shared semantics, with persistent identifiers, for key integration elements across the data sets (e.g. cruise, parameter, and project/program.) These must allow for revision through time and should have an agreed authority or process for resolving conflicts: aligning identifiers and correcting errors were time consuming and often required both deep domain knowledge and "back end" knowledge of the data facilities. Another example is the need for robust provenance, and tools that support automated or semi-automated data transform pipelines that capture provenance. Multiple copies and versions of data are now flowing into repositories, and onward to long-term archives such as NOAA NCEI and umbrella portals such as DataONE. Exact copies can be identified with hashes (for those that have the skills), but it can be painfully difficult to understand the processing or format changes that differentiates versions. As more sensors are deployed, and data re-use increases, this will only become more challenging. We will discuss these, and additional lessons learned, as well as invite discussion and solutions from others doing similar work. * BCO-DMO, CCHDO, OBIS, OOI, R2R

  9. Radiation induced dissolution of UO₂-based nuclear fuel - A critical review of predictive modelling approaches

    NASA Astrophysics Data System (ADS)

    Eriksen, Trygve E.; Shoesmith, David W.; Jonsson, Mats

    2012-01-01

    Radiation induced dissolution of uranium dioxide (UO₂) nuclear fuel and the consequent release of radionuclides to intruding groundwater are key processes in the safety analysis of future deep geological repositories for spent nuclear fuel. For several decades, these processes have been studied experimentally using both spent fuel and various types of simulated spent fuels. The latter have been employed since it is difficult to draw mechanistic conclusions from experiments on real spent nuclear fuel. Several predictive modelling approaches have been developed over the last two decades. These models are largely based on experimental observations. In this work we have performed a critical review of the modelling approaches developed based on the large body of chemical and electrochemical experimental data. The main conclusions are: (1) the use of measured interfacial rate constants gives results in generally good agreement with experimental results compared to simulations where homogeneous rate constants are used; (2) the use of spatial dose rate distributions is particularly important when simulating the behaviour over short time periods; and (3) the steady-state approach (the rate of oxidant consumption is equal to the rate of oxidant production) provides a simple but fairly accurate alternative, but errors in the reaction mechanism and in the kinetic parameters used may not be revealed by simple benchmarking. It is essential to use experimentally determined rate constants and verified reaction mechanisms, irrespective of whether the approach is chemical or electrochemical.
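
    As a schematic illustration of the steady-state approach mentioned in conclusion (3), the sketch below balances a radiolytic oxidant production rate against a pseudo-first-order consumption rate and solves for the steady-state concentration. The rate values are placeholders, not parameters from the reviewed models.

    # Steady state: oxidant production rate equals oxidant consumption rate.
    production_rate = 1.0e-9   # radiolytic oxidant production (mol L^-1 s^-1), assumed
    k_consumption = 5.0e-8     # pseudo-first-order consumption rate constant (s^-1), assumed
                               # (lumps the interfacial rate constant and the surface-to-volume ratio)

    c_steady_state = production_rate / k_consumption
    print(f"steady-state oxidant concentration ≈ {c_steady_state:.2e} mol/L")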

  10. Learning Object Repositories

    ERIC Educational Resources Information Center

    Lehman, Rosemary

    2007-01-01

    This chapter looks at the development and nature of learning objects, meta-tagging standards and taxonomies, learning object repositories, learning object repository characteristics, and types of learning object repositories, with type examples. (Contains 1 table.)

  11. The Amistad Research Center: Documenting the African American Experience.

    ERIC Educational Resources Information Center

    Chepesiuk, Ron

    1993-01-01

    Describes the Amistad Research Center housed at Tulane University which is a repository of primary documents on African-American history. Topics addressed include the development and growth of the collection; inclusion of the American Missionary Association archives; sources of support; civil rights; and collecting for the future. (LRW)

  12. Gene Expression Browser: Large-Scale and Cross-Experiment Microarray Data Management, Search & Visualization

    USDA-ARS?s Scientific Manuscript database

    The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...

  13. DSSTox chemical-index files for exposure-related experiments in ArrayExpress and Gene Expression Omnibus: enabling toxico-chemogenomics data linkages

    EPA Science Inventory

    The Distributed Structure-Searchable Toxicity (DSSTox) ARYEXP and GEOGSE files are newly published, structure-annotated files of the chemical-associated and chemical exposure-related summary experimental content contained in the ArrayExpress Repository and Gene Expression Omnibus...

  14. Comparison of neptunium sorption results using batch and column techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Triay, I.R.; Furlano, A.C.; Weaver, S.C.

    1996-08-01

    We used crushed-rock columns to study the sorption retardation of neptunium by zeolitic, devitrified, and vitric tuffs typical of those at the site of the potential high-level nuclear waste repository at Yucca Mountain, Nevada. We used two sodium bicarbonate waters (groundwater from Well J-13 at the site and water prepared to simulate groundwater from Well UE-25p No. 1) under oxidizing conditions. It was found that values of the sorption distribution coefficient, Kd, obtained from these column experiments under flowing conditions, regardless of the water or the water velocity used, agreed well with those obtained earlier from batch sorption experiments under static conditions. The batch sorption distribution coefficient can be used to predict the arrival time for neptunium eluted through the columns. On the other hand, the elution curves showed dispersivity, which implies that neptunium sorption in these tuffs may be nonlinear, irreversible, or noninstantaneous. As a result, use of a batch sorption distribution coefficient to calculate neptunium transport through Yucca Mountain tuffs would yield conservative values for neptunium release from the site. We also noted that neptunium (present as the anionic neptunyl carbonate complex) never eluted prior to tritiated water, which implies that charge exclusion does not appear to exclude neptunium from the tuff pores. The column experiments corroborated the trends observed in batch sorption experiments: neptunium sorption onto devitrified and vitric tuffs is minimal and sorption onto zeolitic tuffs decreases as the amount of sodium and bicarbonate/carbonate in the water increases.
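
    Predicting arrival times from a batch Kd typically rests on the standard retardation relation R = 1 + (ρb/θ)Kd. The sketch below applies it with placeholder parameter values, not measurements from this study.

    # Standard relation linking a batch sorption distribution coefficient (Kd) to
    # the retardation of a sorbing solute relative to a non-sorbing tracer.
    rho_b = 1.6    # bulk density of crushed tuff (g/cm^3), assumed
    theta = 0.4    # volumetric water content (-), assumed
    kd = 2.5       # batch sorption distribution coefficient (mL/g), assumed

    R = 1.0 + (rho_b / theta) * kd        # retardation factor
    tracer_arrival_hr = 10.0              # arrival time of a non-sorbing tracer, assumed
    neptunium_arrival_hr = R * tracer_arrival_hr
    print(f"R = {R:.1f}; predicted Np arrival ≈ {neptunium_arrival_hr:.0f} h")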

  15. Ion-Exchange Interdiffusion Model with Potential Application to Long-Term Nuclear Waste Glass Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neeway, James Joseph; Kerisit, Sebastien N.; Liu, Jia

    2016-05-05

    Ion exchange is an integral mechanism influencing the corrosion of glasses. Due to the formation of alteration layers in aqueous conditions, it is difficult to conclusively deconvolute the process of ion exchange from other processes, principally dissolution of the glass matrix. Therefore, we have developed a method to isolate alkali diffusion that involves contacting glass coupons with a solution of 6LiCl dissolved in functionally inert dimethyl sulfoxide. We employ the method at temperatures ranging from 25 to 150 °C with various glass formulations. Glass compositions include simulant nuclear waste glasses, such as SON68 and the international simple glass (ISG), glasses in which the nature of the alkali element was varied, and glasses that contained more than one alkali element. An interdiffusion model based on Fick's second law was developed and applied to all experiments to extract diffusion coefficients. The model expands established models of interdiffusion to the case where multiple types of alkali sites are present in the glass. Activation energies for alkali ion exchange were calculated and the results are in agreement with those obtained in glass strengthening experiments but are nearly five times higher than values reported for diffusion-controlled processes in nuclear waste glass corrosion experiments. A discussion of the root causes for this apparent discrepancy is provided. The interdiffusion model derived from laboratory experiments is expected to be useful for modeling glass corrosion in a geological repository when the silicon concentration is high.
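
    Activation energies of the kind discussed above are conventionally extracted from temperature-dependent diffusion coefficients with an Arrhenius fit, D = D0*exp(-Ea/RT). The sketch below shows only that fitting step; the diffusion coefficients and temperatures listed are hypothetical placeholders, not the SON68/ISG results reported in the paper.

        # Minimal Arrhenius fit, ln D = ln D0 - Ea/(R*T), to extract an activation
        # energy from temperature-dependent diffusion coefficients. The D values
        # below are hypothetical placeholders, not measured glass data.
        import numpy as np

        R = 8.314  # J/(mol K)

        temps_c = np.array([25.0, 50.0, 90.0, 150.0])
        d_coeffs = np.array([1e-22, 5e-21, 4e-19, 2e-16])  # m^2/s, illustrative

        inv_t = 1.0 / (temps_c + 273.15)
        slope, intercept = np.polyfit(inv_t, np.log(d_coeffs), 1)

        ea_kj_per_mol = -slope * R / 1000.0
        d0 = np.exp(intercept)
        print(f"Ea ~ {ea_kj_per_mol:.0f} kJ/mol, D0 ~ {d0:.2e} m^2/s")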

  16. Semantic Repositories for eGovernment Initiatives: Integrating Knowledge and Services

    NASA Astrophysics Data System (ADS)

    Palmonari, Matteo; Viscusi, Gianluigi

    In recent years, public sector investments in eGovernment initiatives have depended on making more reliable existing governmental ICT systems and infrastructures. Furthermore, we assist at a change in the focus of public sector management, from the disaggregation, competition and performance measurements typical of the New Public Management (NPM), to new models of governance, aiming for the reintegration of services under a new perspective in bureaucracy, namely a holistic approach to policy making which exploits the extensive digitalization of administrative operations. In this scenario, major challenges are related to support effective access to information both at the front-end level, by means of highly modular and customizable content provision, and at the back-end level, by means of information integration initiatives. Repositories of information about data and services that exploit semantic models and technologies can support these goals by bridging the gap between the data-level representations and the human-level knowledge involved in accessing information and in searching for services. Moreover, semantic repository technologies can reach a new level of automation for different tasks involved in interoperability programs, both related to data integration techniques and service-oriented computing approaches. In this chapter, we discuss the above topics by referring to techniques and experiences where repositories based on conceptual models and ontologies are used at different levels in eGovernment initiatives: at the back-end level to produce a comprehensive view of the information managed in the public administrations' (PA) information systems, and at the front-end level to support effective service delivery.

  17. Transportation plan repository and archive.

    DOT National Transportation Integrated Search

    2011-04-01

    This project created a repository and archive for transportation planning documents in Texas within the established Texas A&M Repository (http://digital.library.tamu.edu). This transportation planning archive and repository provides ready access ...

  18. Supporting multiple domains in a single reuse repository

    NASA Technical Reports Server (NTRS)

    Eichmann, David A.

    1992-01-01

    Domain analysis typically results in the construction of a domain-specific repository. Such a repository imposes artificial boundaries on the sharing of similar assets between related domains. A lattice-based approach to repository modeling can preserve a reuser's domain specific view of the repository, while avoiding replication of commonly used assets and supporting a more general perspective on domain interrelationships.

  19. NCI Mouse Repository | FNLCR Staging

    Cancer.gov

    The NCI Mouse Repository is an NCI-funded resource for mouse cancer models and associated strains. The repository makes strains available to all members of the scientific community (academic, non-profit, and commercial). NCI Mouse Repository strains

  20. Young Scientist in Classroom

    NASA Astrophysics Data System (ADS)

    Doran, Rosa

    Bringing recent results from space exploration, and its future challenges and opportunities, to the attention of students has been a preoccupation of educators and space agencies for quite some time. The desire to foster students' interest in science topics, and in research in particular, occupies the minds of educators in all corners of the globe. But the challenge is growing, literally, at the speed of light. We are in the age of "Big Data": information is abundant, and opportunities to build smart algorithms are flourishing. The problem at hand is how we are going to make use of all these possibilities. How can we prepare students for the challenges already upon them? How can we create a scientifically literate and conscious new generation? They are the future of humankind, so this is a priority and should quickly be recognized as such. Empowering teachers is the key to facing these challenges and seizing the opportunities. Teachers and students need to learn how to establish fruitful collaborations in the pursuit of meaningful teaching and learning experiences, and teachers need to embrace the opportunities the ICT world is offering and accompany students' paths as tutors rather than as explorers themselves. In this training session we intend to explore tools and repositories that bring real cutting-edge science into the hands of educators and their students. A full space exploration journey will be presented. Planetarium Software - Tools tailored to preparing an observing session or exploring space mission results will be presented. Participants will also have the opportunity to learn how to plan an observing session, an excellent way to teach about celestial movements and to give students a sense of what it means to explore, for instance, the Solar System. Robotic Telescopes and Radio Antennas - Having planned an observing session, participants will be introduced to the use of robotic telescopes, a very powerful tool that allows educators to address a diversity of topics ranging from ICT skills to the exploration of our Universe. Instead of using traditional methods to teach subjects such as stellar spectra, extra-solar planets or the classification of galaxies, they can use these powerful tools. Among other advantages, a clear benefit is that teachers can use telescopes during regular classroom hours, provided they choose one located on the opposite side of the planet, where it is night time. Participants will also have the opportunity to use one of the radio antennas devoted to education from the EUHOU (European Hands-on Universe) Consortium; a map of the arms of our galaxy will be built during the training session. Image Processing - After acquiring images, participants will be introduced to SalsaJ, image processing software that allows educators to explore the potential of astronomical images. The first example will be a simple measurement task: measuring craters on the Moon. Further exploration will guide them from luminosity studies to the construction of colour images, and from making movies showing the rotation of the Sun to the dance of Jupiter's moons around the planet. e-learning repositories - In the ICT age it is very important that educators have support and know where to find meaningful, curriculum-adapted resources for the construction of modern lessons. Some repositories will be presented in this session. Examples of such repositories are Discover the Cosmos and EUHOU, as well as an aggregator of such repositories with quite advanced capabilities to support the work of teachers, the Open Discovery Space portal. Sessions of this type are being successfully implemented by the Galileo Teacher Training Program team in Portugal under the scope of the EC-funded GO-LAB project, a project devoted to demonstrating innovative ways to involve teachers and students in e-Science through virtual labs that simulate experiments, in order to spark young people's interest in science and in following scientific careers.

  1. A Climate Statistics Tool and Data Repository

    NASA Astrophysics Data System (ADS)

    Wang, J.; Kotamarthi, V. R.; Kuiper, J. A.; Orr, A.

    2017-12-01

    Researchers at Argonne National Laboratory and collaborating organizations have generated regional scale, dynamically downscaled climate model output using Weather Research and Forecasting (WRF) version 3.3.1 at a 12km horizontal spatial resolution over much of North America. The WRF model is driven by boundary conditions obtained from three independent global scale climate models and two different future greenhouse gas emission scenarios, named representative concentration pathways (RCPs). The repository of results has a temporal resolution of three hours for all the simulations, includes more than 50 variables, is stored in Network Common Data Form (NetCDF) files, and the data volume is nearly 600 TB. A condensed 800 GB set of NetCDF files was made for selected variables most useful for climate-related planning, including daily precipitation, relative humidity, solar radiation, maximum temperature, minimum temperature, and wind. The WRF model simulations are conducted for three 10-year time periods (1995-2004, 2045-2054, and 2085-2094) and two future scenarios (RCP4.5 and RCP8.5). An open-source tool was coded using Python 2.7.8 and ESRI ArcGIS 10.3.1 programming libraries to parse the NetCDF files, compute summary statistics, and output results as GIS layers. Eight sets of summary statistics were generated as examples for the contiguous U.S. states and much of Alaska, including number of days over 90°F, number of days with a heat index over 90°F, heat waves, monthly and annual precipitation, drought, extreme precipitation, multi-model averages, and model bias. This paper will provide an overview of the project to generate the main and condensed data repositories, describe the Python tool and how to use it, present the GIS results of the computed examples, and discuss some of the ways they can be used for planning. The condensed climate data, Python tool, computed GIS results, and documentation of the work are shared on the Internet.
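
    A minimal sketch of one of the summary statistics described above (counting days with a daily maximum temperature over 90°F from a NetCDF file) might look like the following. The file path, variable name ("tmax") and the assumption of kelvin units are placeholders; the actual Argonne/WRF archive may use different conventions, and the published tool was built on Python 2.7.8 with ArcGIS rather than this bare-bones approach.

        # Hedged sketch: per-grid-cell count of days with Tmax > 90 F from a
        # daily-maximum NetCDF file. Variable name and units are assumptions.
        import numpy as np
        from netCDF4 import Dataset

        def days_over_90f(nc_path, var_name="tmax"):
            with Dataset(nc_path) as ds:
                tmax_k = ds.variables[var_name][:]   # assumed shape (time, y, x), kelvin
            tmax_f = (tmax_k - 273.15) * 9.0 / 5.0 + 32.0
            return np.sum(tmax_f > 90.0, axis=0)     # count of hot days per cell

        # counts = days_over_90f("wrf_rcp85_2045_2054_tmax.nc")  # hypothetical file
        # 'counts' is a (y, x) array that could then be exported as a GIS raster layer.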

  2. The site-scale saturated zone flow model for Yucca Mountain: Calibration of different conceptual models and their impact on flow paths

    USGS Publications Warehouse

    Zyvoloski, G.; Kwicklis, E.; Eddebbarh, A.-A.; Arnold, B.; Faunt, C.; Robinson, B.A.

    2003-01-01

    This paper presents several different conceptual models of the Large Hydraulic Gradient (LHG) region north of Yucca Mountain and describes the impact of those models on groundwater flow near the potential high-level repository site. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain. This model is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. The numerical model is calibrated by matching available water level measurements using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (hydrologic simulation code FEHM and parameter estimation software PEST) and model setup allow for efficient calibration of multiple conceptual models. Until now, the Large Hydraulic Gradient has been simulated using a low-permeability, east-west oriented feature, even though direct evidence for this feature is lacking. In addition to this model, we investigate and calibrate three additional conceptual models of the Large Hydraulic Gradient, all of which are based on a presumed zone of hydrothermal chemical alteration north of Yucca Mountain. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the potential repository that record differences in the predicted groundwater flow regime. The results show that the Large Hydraulic Gradient can be represented with the alternate conceptual models that include the hydrothermally altered zone. The predicted pathways are mildly sensitive to the choice of the conceptual model and more sensitive to the quality of calibration in the vicinity of the repository. These differences are most likely due to different degrees of fit of the model to the data, and do not represent important differences in hydrologic conditions for the different conceptual models. © 2002 Elsevier Science B.V. All rights reserved.
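
    The actual calibration used FEHM coupled with PEST; purely to illustrate the generic idea of adjusting zone permeabilities until simulated heads match observed water levels, a least-squares sketch with a stand-in toy forward model is shown below. The forward model, observed heads and starting values are all invented for illustration and bear no relation to the site-scale model.

        # Generic parameter-estimation sketch (the real work used FEHM with PEST):
        # adjust log10 permeabilities of a few zones so simulated heads match
        # observed water levels in a least-squares sense. The forward model is a
        # toy stand-in, not the site-scale saturated zone model.
        import numpy as np
        from scipy.optimize import least_squares

        observed_heads = np.array([730.0, 776.0, 1030.0])  # m, illustrative wells

        def forward_model(log10_k):
            """Toy stand-in mapping three zone permeabilities to three well heads."""
            k = 10.0 ** np.asarray(log10_k)
            base = np.array([728.0, 772.0, 1026.0])
            return base + 5.0 / (1.0 + 1e14 * k)

        def residuals(log10_k):
            return forward_model(log10_k) - observed_heads

        fit = least_squares(residuals, x0=[-13.0, -14.0, -16.0])
        print("calibrated log10(k):", fit.x)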

  3. HSE12 implementation in libxc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moussa, Jonathan E.

    2013-05-13

    This piece of software is a new feature implemented inside an existing open-source library. Specifically, it is a new implementation of a density functional (HSE, short for Heyd-Scuseria-Ernzerhof) for a repository of density functionals, the libxc library. It fixes some numerical problems with existing implementations, as outlined in a scientific paper recently submitted for publication. Density functionals are components of electronic structure simulations, which model properties of electrons inside molecules and crystals.

  4. Continuum-based DFN-consistent simulations of oxygen ingress in fractured crystalline rocks

    NASA Astrophysics Data System (ADS)

    Trinchero, P.; Puigdomenech, I.; Molinero, J.; Ebrahimi, H.; Gylling, B.; Svensson, U.; Bosbach, D.; Deissmann, G.

    2016-12-01

    The potential transient infiltration of oxygenated glacial meltwater into initially anoxic and reducing fractured crystalline rocks during glaciation events is an issue of concern for some of the prospected deep geological repositories for spent nuclear fuel. Here, this problem is assessed using reactive transport calculations. First, a novel parameterisation procedure is presented, where flow, transport and geochemical parameters (i.e. hydraulic conductivity, effective/kinetic porosity, and mineral specific surface area and abundance) are defined on a finite volume numerical grid based on the (spatially varying) properties of an underlying Discrete Fracture Network (DFN). Second, using this approach, a realistic reactive transport model of Forsmark, i.e. the selected site for the proposed Swedish spent nuclear fuel repository, is implemented. The model consists of more than 70 million geochemical transport degrees of freedom and simulates the ingress of oxygen-rich water from the recharge area of the domain and its depletion due to reactions with the Fe(II) mineral chlorite. Third, the calculations are carried out on the supercomputer JUQUEEN at the Jülich Supercomputing Centre. The results of the simulations show that oxygen infiltrates relatively quickly along fractures and deformation zones until a steady state profile is reached, where geochemical reactions counterbalance advective transport processes. Interestingly, most of the iron-bearing minerals are consumed in the highly conductive zones, where larger mineral surfaces are available for reactions. An analysis based on mineral mass balance shows that the considered rock medium has enough capacity to buffer oxygen infiltration for a long period of time (i.e., some thousands of years).
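
    The balance between advective oxygen supply and mineral consumption can be illustrated with a deliberately simple 1-D advection-reaction sketch, assuming pseudo-first-order oxygen consumption by Fe(II)-bearing minerals. All parameters (velocity, rate constant, inlet concentration, grid) are placeholders; the study itself solved a fully 3-D reactive transport problem with more than 70 million degrees of freedom.

        # Minimal 1-D sketch of oxygen ingress along a flow path with first-order
        # consumption by Fe(II)-bearing minerals. Placeholder parameters only.
        import numpy as np

        n, dx = 200, 5.0        # cells, m (1 km flow path)
        v, dt = 10.0, 0.1       # m/yr advection velocity, yr time step (CFL = 0.2)
        k_consume = 0.2         # 1/yr, pseudo-first-order O2 consumption rate
        c_in = 3.0e-4           # mol/L, O2 in glacial meltwater recharge

        c = np.zeros(n)
        for step in range(20000):                       # run to ~steady state
            c[1:] -= v * dt / dx * (c[1:] - c[:-1])     # explicit upwind advection
            c *= np.exp(-k_consume * dt)                # first-order O2 consumption
            c[0] = c_in                                 # fixed inlet concentration

        threshold = 1e-8                                # "depleted" cutoff, mol/L
        depth = dx * np.argmax(c < threshold) if np.any(c < threshold) else dx * n
        print(f"approx. steady O2 penetration depth: {depth:.0f} m")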

  5. Coupling fuel cycles with repositories: how repository institutional choices may impact fuel cycle design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forsberg, C.; Miller, W.F.

    2013-07-01

    The historical repository siting strategy in the United States has been a top-down approach driven by federal government decision making but it has been a failure. This policy has led to dispatching fuel cycle facilities in different states. The U.S. government is now considering an alternative repository siting strategy based on voluntary agreements with state governments. If that occurs, state governments become key decision makers. They have different priorities. Those priorities may change the characteristics of the repository and the fuel cycle. State government priorities, when considering hosting a repository, are safety, financial incentives and jobs. It follows that states will demand that a repository be the center of the back end of the fuel cycle as a condition of hosting it. For example, states will push for collocation of transportation services, safeguards training, and navy/private SNF (Spent Nuclear Fuel) inspection at the repository site. Such activities would more than double local employment relative to what was planned for the Yucca Mountain-type repository. States may demand (1) the right to take future title of the SNF so if recycle became economic the reprocessing plant would be built at the repository site and (2) the right of a certain fraction of the repository capacity for foreign SNF. That would open the future option of leasing of fuel to foreign utilities with disposal of the SNF in the repository but with the state-government condition that the front-end fuel-cycle enrichment and fuel fabrication facilities be located in that state.

  6. NCI Mouse Repository | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The NCI Mouse Repository is an NCI-funded resource for mouse cancer models and associated strains. The repository makes strains available to all members of the scientific community (academic, non-profit, and commercial). NCI Mouse Repository strains

  7. CellBase, a comprehensive collection of RESTful web services for retrieving relevant biological information from heterogeneous sources.

    PubMed

    Bleda, Marta; Tarraga, Joaquin; de Maria, Alejandro; Salavert, Francisco; Garcia-Alonso, Luz; Celma, Matilde; Martin, Ainoha; Dopazo, Joaquin; Medina, Ignacio

    2012-07-01

    During the past years, the advances in high-throughput technologies have produced an unprecedented growth in the number and size of repositories and databases storing relevant biological data. Today, there is more biological information than ever but, unfortunately, the current status of many of these repositories is far from being optimal. Some of the most common problems are that the information is spread out in many small databases; frequently there are different standards among repositories and some databases are no longer supported or they contain too specific and unconnected information. In addition, data size is increasingly becoming an obstacle when accessing or storing biological data. All these issues make it very difficult to extract and integrate information from different sources, to analyze experiments or to access and query this information in a programmatic way. CellBase provides a solution to the growing necessity of integration by easing the access to biological data. CellBase implements a set of RESTful web services that query a centralized database containing the most relevant biological data sources. The database is hosted in our servers and is regularly updated. CellBase documentation can be found at http://docs.bioinfo.cipf.es/projects/cellbase.
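
    Programmatic access to a RESTful service of this kind typically amounts to building a URL and parsing the JSON response. The sketch below is generic: the base URL, endpoint pattern, species code and parameters are illustrative assumptions, and the real CellBase URL scheme should be taken from the documentation cited above.

        # Hedged example of querying a RESTful service such as CellBase.
        # The host and endpoint path are placeholders, not the documented API.
        import requests

        BASE_URL = "https://example-cellbase-host/webservices/rest"  # placeholder

        def get_gene_info(gene_symbol, species="hsapiens", assembly="grch38"):
            # Assumed pattern: /{species}/feature/gene/{symbol}/info
            url = f"{BASE_URL}/{species}/feature/gene/{gene_symbol}/info"
            resp = requests.get(url, params={"assembly": assembly}, timeout=30)
            resp.raise_for_status()
            return resp.json()

        # info = get_gene_info("BRCA2")
        # print(info)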

  8. The Cambridge Centre for Ageing and Neuroscience (Cam-CAN) data repository: Structural and functional MRI, MEG, and cognitive data from a cross-sectional adult lifespan sample.

    PubMed

    Taylor, Jason R; Williams, Nitin; Cusack, Rhodri; Auer, Tibor; Shafto, Meredith A; Dixon, Marie; Tyler, Lorraine K; Cam-Can; Henson, Richard N

    2017-01-01

    This paper describes the data repository for the Cambridge Centre for Ageing and Neuroscience (Cam-CAN) initial study cohort. The Cam-CAN Stage 2 repository contains multi-modal (MRI, MEG, and cognitive-behavioural) data from a large (approximately N=700), cross-sectional adult lifespan (18-87 years old) population-based sample. The study is designed to characterise age-related changes in cognition and brain structure and function, and to uncover the neurocognitive mechanisms that support healthy cognitive ageing. The database contains raw and preprocessed structural MRI, functional MRI (active tasks and resting state), and MEG data (active tasks and resting state), as well as derived scores from cognitive behavioural experiments spanning five broad domains (attention, emotion, action, language, and memory), and demographic and neuropsychological data. The dataset thus provides a depth of neurocognitive phenotyping that is currently unparalleled, enabling integrative analyses of age-related changes in brain structure, brain function, and cognition, and providing a testbed for novel analyses of multi-modal neuroimaging data. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  9. An Intelligent Cloud Storage Gateway for Medical Imaging.

    PubMed

    Viana-Ferreira, Carlos; Guerra, António; Silva, João F; Matos, Sérgio; Costa, Carlos

    2017-09-01

    Historically, medical imaging repositories have been supported by in-house infrastructures. However, the amount of diagnostic imaging procedures has continuously increased over the last decades, imposing several challenges associated with the storage volume, data redundancy and availability. Cloud platforms are focused on delivering hardware and software services over the Internet, becoming an appealing solution for repository outsourcing. Although this option may bring financial and technological benefits, it also presents new challenges. In medical imaging scenarios, communication latency is a critical issue that still hinders the adoption of this paradigm. This paper proposes an intelligent Cloud storage gateway that optimizes data access times. This is achieved through a new cache architecture that combines static rules and pattern recognition for eviction and prefetching. The evaluation results, obtained from experiments over a real-world dataset, show that cache hit ratios can reach around 80%, leading to reductions of image retrieval times by over 60%. The combined use of the proposed eviction and prefetching policies can significantly reduce communication latency, even when using a small cache in comparison to the total size of the repository. Apart from the performance gains, the proposed system is capable of adjusting to specific workflows of different institutions.
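
    For orientation, a storage-gateway cache of this general kind combines an eviction policy with a prefetching hook. The skeleton below is a plain LRU sketch with a pluggable prefetch callback, not the rule-plus-pattern-recognition policy described in the paper; class and parameter names are invented for illustration.

        # Skeleton of a gateway cache with eviction and a pluggable prefetch hook.
        # Generic LRU sketch only; not the authors' hybrid policy.
        from collections import OrderedDict

        class GatewayCache:
            def __init__(self, capacity, fetch_fn, prefetch_fn=None):
                self.capacity = capacity        # max number of cached objects
                self.fetch_fn = fetch_fn        # pulls an object from cloud storage
                self.prefetch_fn = prefetch_fn  # returns keys likely needed next
                self._store = OrderedDict()

            def get(self, key):
                if key in self._store:          # cache hit: refresh recency
                    self._store.move_to_end(key)
                    value = self._store[key]
                else:                           # cache miss: fetch and insert
                    value = self.fetch_fn(key)
                    self._put(key, value)
                if self.prefetch_fn:            # warm the cache with likely next keys
                    for nxt in self.prefetch_fn(key):
                        if nxt not in self._store:
                            self._put(nxt, self.fetch_fn(nxt))
                return value

            def _put(self, key, value):
                self._store[key] = value
                self._store.move_to_end(key)
                while len(self._store) > self.capacity:
                    self._store.popitem(last=False)  # evict least recently used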

  10. End of FY10 report - used fuel disposition technical bases and lessons learned : legal and regulatory framework for high-level waste disposition in the United States.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weiner, Ruth F.; Blink, James A.; Rechard, Robert Paul

    This report examines the current policy, legal, and regulatory framework pertaining to used nuclear fuel and high level waste management in the United States. The goal is to identify potential changes that, if made, could add flexibility and possibly improve the chances of successfully implementing technical aspects of a nuclear waste policy. Experience suggests that the regulatory framework should be established prior to initiating future repository development. Concerning specifics of the regulatory framework, reasonable expectation as the standard of proof was successfully implemented and could be retained in the future; yet, the current classification system for radioactive waste, including hazardous constituents, warrants reexamination. Whether or not multiple sites are considered simultaneously in the future, inclusion of mechanisms such as deliberate use of performance assessment to manage site characterization would be wise. Because of experience gained here and abroad, diversity of geologic media is not particularly necessary as a criterion in site selection guidelines for multiple sites. Stepwise development of the repository program that includes flexibility also warrants serious consideration. Furthermore, integration of the waste management system from storage, transportation, and disposition should be examined and would be facilitated by integration of the legal and regulatory framework. Finally, in order to enhance acceptability of future repository development, the national policy should be cognizant of those policy and technical attributes that enhance initial acceptance, and those policy and technical attributes that maintain and broaden credibility.

  11. Warehousing re-annotated cancer genes for biomarker meta-analysis.

    PubMed

    Orsini, M; Travaglione, A; Capobianco, E

    2013-07-01

    Translational research in cancer genomics assigns a fundamental role to bioinformatics in support of candidate gene prioritization with regard to both biomarker discovery and target identification for drug development. Efforts in both such directions rely on the existence and constant update of large repositories of gene expression data and omics records obtained from a variety of experiments. Users who interactively interrogate such repositories may have problems in retrieving sample fields that present limited associated information, due, for instance, to incomplete entries or sometimes unusable files. Cancer-specific data sources present similar problems. Given that source integration usually improves data quality, one of the objectives is keeping the computational complexity sufficiently low to allow an optimal assimilation and mining of all the information. In particular, the scope of integrating intraomics data can be to improve the exploration of gene co-expression landscapes, while the scope of integrating interomics sources can be that of establishing genotype-phenotype associations. Both integrations are relevant to cancer biomarker meta-analysis, as the proposed study demonstrates. Our approach is based on re-annotating cancer-specific data available at the EBI's ArrayExpress repository and building a data warehouse aimed at biomarker discovery and validation studies. Cancer genes are organized by tissue, with biomedical and clinical evidence combined to increase reproducibility and consistency of results. For better comparative evaluation, multiple queries have been designed to efficiently address all types of experiments and platforms, and allow for retrieval of sample-related information, such as cell line, disease state and clinical aspects. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. Data repositories for medical education research: issues and recommendations.

    PubMed

    Schwartz, Alan; Pappas, Cleo; Sandlow, Leslie J

    2010-05-01

    The authors explore issues surrounding digital repositories with the twofold intention of clarifying their creation, structure, content, and use, and considering the implementation of a global digital repository for medical education research data sets-an online site where medical education researchers would be encouraged to deposit their data in order to facilitate the reuse and reanalysis of the data by other researchers. By motivating data sharing and reuse, investigators, medical schools, and other stakeholders might see substantial benefits to their own endeavors and to the progress of the field of medical education.The authors review digital repositories in medicine, social sciences, and education, describe the contents and scope of repositories, and present extant examples. The authors describe the potential benefits of a medical education data repository and report results of a survey of the Society for Directors of Research in Medicine Education, in which participants responded to questions about data sharing and a potential data repository. Respondents strongly endorsed data sharing, with the caveat that principal investigators should choose whether or not to share data they collect. A large majority believed that a repository would benefit their unit and the field of medical education. Few reported using existing repositories. Finally, the authors consider challenges to the establishment of such a repository, including taxonomic organization, intellectual property concerns, human subjects protection, technological infrastructure, and evaluation standards. The authors conclude with recommendations for how a medical education data repository could be successfully developed.

  13. Building Corpus-Informed Word Lists for L2 Vocabulary Learning in Nine Languages

    ERIC Educational Resources Information Center

    Charalabopoulou, Frieda; Gavrilidou, Maria; Kokkinakis, Sofie Johansson; Volodina, Elena

    2012-01-01

    Lexical competence constitutes a crucial aspect in L2 learning, since building a rich repository of words is considered indispensable for successful communication. CALL practitioners have experimented with various kinds of computer-mediated glosses to facilitate L2 vocabulary building in the context of incidental vocabulary learning. Intentional…

  14. Community based research for an urban recreation application of benefits-based management

    Treesearch

    William T. Borrie; Joseph W. Roggenbuck

    1995-01-01

    Benefits-based management is an approach to park and recreation management that focuses on the positive outcomes of engaging in recreational experiences. Because one class of possible benefits accrue to the community, a philosophical framework is discussed suggesting that communities are themselves the primary sources, generators, and repositories of knowledge....

  15. Electrochemical behavior of Alloy 22 and friction type rock bolt

    NASA Astrophysics Data System (ADS)

    Rahman, Md Sazzadur

    Alloy 22 (Ni-22Cr-13Mo-3Fe-3W) is a candidate alloy for the outer shell of spent nuclear material storage containers in the Yucca Mountain High Level Nuclear Waste Repository because of its excellent corrosion resistance. The nuclear waste container is cylindrical in shape and the end caps are welded. Typically, Alloy 22 retains the high-temperature single-phase cubic structure near room temperature, but topologically close packed (TCP) phases such as mu, P, and sigma, as well as Cr-rich carbides, can form during thermal aging and welding. Rock bolts that are used for reinforcing subsurface tunnels are generally made of carbon or low alloy steels; these are being used in the nuclear repository tunnel. The corrosion behavior of these rock bolts has not been systematically evaluated under the environmental conditions of the repository. The ground waters at the Yucca Mountain (YM) repository permeate through the pores of the rock mass and have a propensity to corrode the rock bolts and the waste package container. The environmental (aerated and deaerated) conditions influence the rate of corrosion in these materials; these effects have not yet been systematically evaluated under the repository environment. In this study, the corrosion behavior of Alloy 22 and a friction type rock bolt was investigated as a function of temperature and concentration in complex multi-ionic electrolytes. Simulated electrolytes representing the YM ground water found in the repository environment were made at different concentrations (1X, 10X, and 100X). The interaction of the simulated electrolytes, in aerated and deaerated conditions, with Alloy 22 and the low alloy steel of a friction type rock bolt (a split tapered cylinder commercial design) was investigated. The polarization resistance method was used to measure the corrosion rates. We found that the corrosion rate of Alloy 22 was higher in the deaerated electrolyte than in the aerated one. The presence of oxygen in the electrolyte during aeration is conducive to the formation of passive films that inhibit the corrosion process. The temperature dependency of the corrosion rate was affected by aeration and deaeration of the electrolytes. Another study, related to the corrosion behavior of welded Alloy 22, was undertaken to understand the electrochemical behavior of welded structures. Corrosion studies were carried out in a more aggressive electrolyte (0.1 M HCl at 66°C) after solution annealing at 1121°C for 1 hr. In the as-welded structure a dendritic microstructure was observed in the weld region. However, after solution annealing these dendrites were not observed, suggesting homogenization of the grains. Three different specimens were made from welded Alloy 22 plates with a large welded surface: the weld interface, the half weld, and base metal away from the weld and heat-affected zone; the corrosion rates of all these samples were measured. The results showed that the corrosion resistance of the solution-annealed condition was higher for all three specimens than that of the as-welded specimens. Corrosion rates of friction type rock bolts (split set) were measured at 25°C, 45°C, 65°C and 90°C using 1X, 10X and 100X concentrations of the electrolyte in both aerated and deaerated conditions. The corrosion rates of the rock bolts in the 1X and 10X electrolytes ranged from ~30 to 200 µm/yr under deaerated conditions and from 150 to 1600 µm/yr under aerated conditions. In summary, we have investigated the electrochemical behavior of Alloy 22 and of steels that have significance to the YM nuclear repository. The effects of temperature, electrolyte type, and alloy condition on the corrosion rates are reported.
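
    Polarization resistance measurements are conventionally converted to a corrosion rate using the Stern-Geary relation (i_corr = B / Rp) together with Faraday's law in its ASTM G102 form. The sketch below illustrates that conversion only; the Rp value, Tafel slopes, equivalent weight and density are generic carbon-steel placeholders, not the values measured for Alloy 22 or the split-set rock bolt steel in this study.

        # Corrosion rate from a measured polarization resistance via the
        # Stern-Geary relation and Faraday's law (ASTM G102 form).
        # All default values below are generic illustrative numbers.

        def corrosion_rate_um_per_yr(rp_ohm_cm2, ba=0.12, bc=0.12,
                                     equiv_weight=27.9, density=7.87):
            """ba, bc: Tafel slopes (V/decade); equiv_weight (g/eq); density (g/cm^3)."""
            b = (ba * bc) / (2.303 * (ba + bc))       # Stern-Geary constant, V
            i_corr_ua_cm2 = b / rp_ohm_cm2 * 1.0e6    # A/cm^2 -> uA/cm^2
            # 3.27e-3 mm*g/(uA*cm*yr) is the ASTM G102 proportionality constant
            return 3.27e-3 * i_corr_ua_cm2 * equiv_weight / density * 1000.0

        # Example: Rp = 5 kOhm*cm^2 gives roughly 60 um/yr for these constants.
        print(f"{corrosion_rate_um_per_yr(rp_ohm_cm2=5.0e3):.0f} um/yr")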

  16. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... repositories. 227.7207 Section 227.7207 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to...

  17. 75 FR 70310 - Sunshine Act Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-17

    ... Consumer Protection Act governing the security-based swap data repository registration process, the duties of such repositories, and the core principles applicable to such repositories. 4. The Commission will... security-based swap data repositories or the Commission and the public dissemination of security-based swap...

  18. Integration and Cooperation in the Next Golden Age of Human Space Flight Data Repositories: Tools for Retrospective Analysis and Future Planning

    NASA Technical Reports Server (NTRS)

    Thomas, D.; Fitts, M.; Wear, M.; VanBaalen, M.

    2011-01-01

    As NASA transitions from the Space Shuttle era into the next phase of space exploration, the need to ensure the capture, analysis, and application of its research and medical data is of greater urgency than at any other previous time. In this era of limited resources and challenging schedules, the Human Research Program (HRP) based at NASA s Johnson Space Center (JSC) recognizes the need to extract the greatest possible amount of information from the data already captured, as well as focus current and future research funding on addressing the HRP goal to provide human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. To this end, the Science Management Office and the Medical Informatics and Health Care Systems Branch within the HRP and the Space Medicine Division have been working to make both research data and clinical data more accessible to the user community. The Life Sciences Data Archive (LSDA), the research repository housing data and information regarding the physiologic effects of microgravity, and the Lifetime Surveillance of Astronaut Health Repository (LSAH-R), the clinical repository housing astronaut data, have joined forces to achieve this goal. The task of both repositories is to acquire, preserve, and distribute data and information both within the NASA community and to the science community at large. This is accomplished via the LSDA s public website (http://lsda.jsc.nasa.gov), which allows access to experiment descriptions including hardware, datasets, key personnel, mission descriptions and a mechanism for researchers to request additional data, research and clinical, that is not accessible from the public website. This will result in making the work of NASA and its partners available to the wider sciences community, both domestic and international. The desired outcome is the use of these data for knowledge discovery, retrospective analysis, and planning of future research studies.

  19. Core Certification of Data Repositories: Trustworthiness and Long-Term Stewardship

    NASA Astrophysics Data System (ADS)

    de Sherbinin, A. M.; Mokrane, M.; Hugo, W.; Sorvari, S.; Harrison, S.

    2017-12-01

    Scientific integrity and norms dictate that data created and used by scientists should be managed, curated, and archived in trustworthy data repositories thus ensuring that science is verifiable and reproducible while preserving the initial investment in collecting data. Research stakeholders including researchers, science funders, librarians, and publishers must also be able to establish the trustworthiness of data repositories they use to confirm that the data they submit and use remain useful and meaningful in the long term. Data repositories are increasingly recognized as a key element of the global research infrastructure and the importance of establishing their trustworthiness is recognised as a prerequisite for efficient scientific research and data sharing. The Core Trustworthy Data Repository Requirements are a set of universal requirements for certification of data repositories at the core level (see: https://goo.gl/PYsygW). They were developed by the ICSU World Data System (WDS: www.icsu-wds.org) and the Data Seal of Approval (DSA: www.datasealofapproval.org)—the two authoritative organizations responsible for the development and implementation of this standard to be further developed under the CoreTrustSeal branding . CoreTrustSeal certification of data repositories involves a minimally intensive process whereby repositories supply evidence that they are sustainable and trustworthy. Repositories conduct a self-assessment which is then reviewed by community peers. Based on this review CoreTrustSeal certification is granted by the CoreTrustSeal Standards and Certification Board. Certification helps data communities—producers, repositories, and consumers—to improve the quality and transparency of their processes, and to increase awareness of and compliance with established standards. This presentation will introduce the CoreTrustSeal certification requirements for repositories and offer an opportunity to discuss ways to improve the contribution of certified data repositories to sustain open data for open scientific research.

  20. Carbon sequestration via reaction with basaltic rocks: geochemical modeling and experimental results

    USGS Publications Warehouse

    Rosenbauer, Robert J.; Thomas, Burt; Bischoff, James L.; Palandri, James

    2012-01-01

    Basaltic rocks are potential repositories for sequestering carbon dioxide (CO2) because of their capacity for trapping CO2 in carbonate minerals. We carried out a series of thermodynamic equilibrium models and high pressure experiments, reacting basalt with CO2-charged fluids over a range of conditions from 50 to 200 °C at 300 bar. Results indicate basalt has a high reactivity to CO2 acidified brine. Carbon dioxide is taken up from solution at all temperatures from 50 to 200 °C, 300 bar, but the maximum extent and rate of reaction occurs at 100 °C, 300 bar. Reaction path simulations utilizing the geochemical modeling program CHILLER predicted an equilibrium carbonate alteration assemblage of calcite, magnesite, and siderite, but the only secondary carbonate identified in the experiments was a ferroan magnesite. The amount of uptake at 100 °C, 300 bar ranged from 8% by weight for a typical tholeiite to 26% for a picrite. The actual amount of CO2 uptake and extent of rock alteration coincides directly with the magnesium content of the rock, suggesting that overall reaction extent is controlled by bulk basalt Mg content. In terms of sequestering CO2, an average basaltic MgO content of 8% is equivalent to 2.6 × 10^8 metric tons of CO2 per km^3 of basalt.
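
    The quoted capacity (8 wt% MgO giving about 2.6 × 10^8 t of CO2 per km^3 of basalt) can be reproduced with a simple mole balance, assuming one mole of CO2 fixed per mole of MgO (as magnesite) and a basalt density of roughly 2.9 g/cm^3; the density is an assumption on my part, not a value stated in the abstract.

        # Back-of-the-envelope check of the quoted sequestration capacity.
        # Assumes 1 mol CO2 per mol MgO and a basalt density of ~2.9 g/cm^3
        # (the density is an assumption, not given in the abstract).

        MGO_WT_FRACTION = 0.08      # 8 wt% MgO
        BASALT_DENSITY = 2900.0     # kg/m^3, assumed
        M_MGO = 0.0403              # kg/mol
        M_CO2 = 0.0440              # kg/mol

        basalt_mass = BASALT_DENSITY * 1.0e9             # kg per km^3
        mol_mgo = basalt_mass * MGO_WT_FRACTION / M_MGO  # mol MgO per km^3
        co2_tonnes = mol_mgo * M_CO2 / 1000.0            # metric tons CO2 per km^3

        # Prints ~2.5e8 t, consistent with the quoted 2.6 x 10^8 figure.
        print(f"{co2_tonnes:.2e} t CO2 per km^3 of basalt")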

  1. TomoBank: a tomographic data repository for computational x-ray science

    DOE PAGES

    De Carlo, Francesco; Gürsoy, Doğa; Ching, Daniel J.; ...

    2018-02-08

    There is a widening gap between the fast advancement of computational methods for tomographic reconstruction and their successful implementation in production software at various synchrotron facilities. This is due in part to the lack of readily available instrument datasets and phantoms representative of real materials for validation and comparison of new numerical methods. Recent advancements in detector technology made sub-second and multi-energy tomographic data collection possible [1], but also increased the demand to develop new reconstruction methods able to handle in-situ [2] and dynamic systems [3] that can be quickly incorporated in beamline production software [4]. The X-ray Tomography Data Bank, tomoBank, provides a repository of experimental and simulated datasets with the aim to foster collaboration among computational scientists, beamline scientists, and experimentalists and to accelerate the development and implementation of tomographic reconstruction methods for synchrotron facility production software by providing easy access to challenging datasets and their descriptors.
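
    tomoBank datasets generally follow the scientific Data Exchange HDF5 layout, so reading the raw arrays with h5py might look like the sketch below. The file name is a placeholder, and the group paths (/exchange/data and the flat/dark companions) are an assumption to verify against the tomoBank documentation for each dataset.

        # Hedged sketch of reading a tomoBank dataset with h5py.
        # File name and group paths are assumptions based on the Data Exchange layout.
        import h5py

        def load_projections(fname="tomo_00001.h5"):
            with h5py.File(fname, "r") as f:
                proj = f["/exchange/data"][...]        # projections (angles, rows, cols)
                flat = f["/exchange/data_white"][...]  # flat-field images
                dark = f["/exchange/data_dark"][...]   # dark-field images
            return proj, flat, dark

        # proj, flat, dark = load_projections()
        # Normalized data, (proj - dark) / (flat - dark), can then be passed to a
        # reconstruction package.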

  2. Thermohydrological conditions and silica redistribution near high-level nuclear wastes emplaced in saturated geological formations

    NASA Astrophysics Data System (ADS)

    Verma, A.; Pruess, K.

    1988-02-01

    Evaluation of the thermohydrological conditions near high-level nuclear waste packages is needed for the design of the waste canister and for overall repository design and performance assessment. Most available studies in this area have assumed that the hydrologic properties of the host rock are not changed in response to the thermal, mechanical, or chemical effects caused by waste emplacement. However, the ramifications of this simplifying assumption have not been substantiated. We have studied dissolution and precipitation of silica in liquid-saturated hydrothermal flow systems, including changes in formation porosity and permeability. Using numerical simulation, we compare predictions of thermohydrological conditions with and without inclusion of silica redistribution effects. Two cases were studied, namely, a canister-scale problem and a repository-wide thermal convection problem, and different pore models were employed for the permeable medium (fractures with uniform or nonuniform cross sections). We find that silica redistribution in water-saturated conditions does not have a sizeable effect on host rock and canister temperatures, pore pressures, or flow velocities.
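
    Porosity changes from silica dissolution and precipitation are commonly propagated to permeability through a power-law relation with a critical porosity at which permeability vanishes, a form often associated with Verma and Pruess. The sketch below is a generic illustration of that coupling with placeholder parameters, not the specific pore models (uniform or nonuniform fracture cross sections) used in the study.

        # Generic power-law porosity-permeability update used to couple silica
        # precipitation/dissolution to flow:
        #   k/k0 = ((phi - phi_c) / (phi0 - phi_c)) ** n
        # where phi_c is a critical porosity at which permeability vanishes.
        # Parameter values are placeholders for illustration only.

        def permeability_update(k0, phi0, phi, phi_c=0.01, n=2.0):
            if phi <= phi_c:
                return 0.0          # pore space effectively plugged
            return k0 * ((phi - phi_c) / (phi0 - phi_c)) ** n

        k0, phi0 = 1.0e-12, 0.10    # initial permeability (m^2) and porosity
        for phi in (0.10, 0.08, 0.05, 0.02, 0.01):
            print(f"phi={phi:.2f}  k={permeability_update(k0, phi0, phi):.2e} m^2")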

  3. The preliminary design and feasibility study of the spent fuel and high level waste repository in the Czech Republic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valvoda, Z.; Holub, J.; Kucerka, M.

    1996-12-31

    In 1993, the Program of Development of the Spent Fuel and High Level Waste Repository in the Conditions of the Czech Republic began. During the first phase, the basic concept and structure of the Program were developed, and the basic design criteria and requirements were prepared. In the conditions of the Czech Republic, only an underground repository in a deep geological formation is acceptable; the expected depth is between 500 and 1000 meters, and the host rock will be granite. A preliminary variant design study carried out in 1994 analyzed the radioactive waste and spent fuel flow from NPPs to the repository and the various transportation possibilities according to the various concepts of spent fuel conditioning and transportation to the underground structures. Conditioning and encapsulation of spent fuel and/or radioactive waste is proposed on the repository site. Underground disposal structures are proposed on a single underground level. The repository will have reserve capacity for radioactive waste from NPP decommissioning and for waste not acceptable to other repositories. Vertical disposal of unshielded canisters in boreholes and/or horizontal disposal of shielded canisters is being studied. The year 2035 has been established as the baseline date for the start of repository operation, and from this date a preliminary time schedule of the Project has been developed. A method of calculating levelized and discounted costs over the repository lifetime was used for the economic calculations for each of the five selected variants. Preliminary expected parametric costs of the repository are about 0.1 Kc ($0.004) per MWh produced in the Czech NPPs. In 1995, the design and feasibility study went into more detail on the technical concept of repository construction and the proposed technologies, as well as on the operational phase of the repository. The paper describes the results of the 1995 design work and presents the program of repository development for the next period.

  4. 17 CFR 49.26 - Disclosure requirements of swap data repositories.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... data repository's policies and procedures reasonably designed to protect the privacy of any and all... swap data repositories. 49.26 Section 49.26 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION (CONTINUED) SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data...

  5. Implementing the EuroFIR Document and Data Repositories as accessible resources of food composition information.

    PubMed

    Unwin, Ian; Jansen-van der Vliet, Martine; Westenbrink, Susanne; Presser, Karl; Infanger, Esther; Porubska, Janka; Roe, Mark; Finglas, Paul

    2016-02-15

    The EuroFIR Document and Data Repositories are being developed as accessible collections of source documents, including grey literature, and the food composition data reported in them. These Repositories will contain source information available to food composition database compilers when selecting their nutritional data. The Document Repository was implemented as searchable bibliographic records in the Europe PubMed Central database, which links to the documents online. The Data Repository will contain original data from source documents in the Document Repository. Testing confirmed the FoodCASE food database management system as a suitable tool for the input, documentation and quality assessment of Data Repository information. Data management requirements for the input and documentation of reported analytical results were established, including record identification and method documentation specifications. Document access and data preparation using the Repositories will provide information resources for compilers, eliminating duplicated work and supporting unambiguous referencing of data contributing to their compiled data. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Making research data repositories visible: the re3data.org Registry.

    PubMed

    Pampel, Heinz; Vierkant, Paul; Scholze, Frank; Bertelmann, Roland; Kindling, Maxi; Klump, Jens; Goebelbecker, Hans-Jürgen; Gundlach, Jens; Schirmbacher, Peter; Dierolf, Uwe

    2013-01-01

    Researchers require infrastructures that ensure a maximum of accessibility, stability and reliability to facilitate working with and sharing of research data. Such infrastructures are being increasingly summarized under the term Research Data Repositories (RDR). The project re3data.org (Registry of Research Data Repositories) began indexing research data repositories in 2012 and offers researchers, funding organizations, libraries and publishers an overview of the heterogeneous research data repository landscape. As of July 2013, re3data.org lists 400 research data repositories and counting; 288 of these are described in detail using the re3data.org vocabulary. Information icons help researchers to easily identify an adequate repository for the storage and reuse of their data. This article describes the heterogeneous RDR landscape and presents a typology of institutional, disciplinary, multidisciplinary and project-specific RDR. Further, the article outlines the features of re3data.org, and shows how this registry helps to identify appropriate repositories for storage and search of research data.

  7. Numerical modeling of flow and transport in the far-field of a generic nuclear waste repository in fractured crystalline rock using updated fracture continuum model

    NASA Astrophysics Data System (ADS)

    Hadgu, T.; Kalinina, E.; Klise, K. A.; Wang, Y.

    2016-12-01

    Disposal of high-level radioactive waste in a deep geological repository in crystalline host rock is one of the potential options for long term isolation. Characterization of the natural barrier system is an important component of the disposal option. In this study we present numerical modeling of flow and transport in fractured crystalline rock using an updated fracture continuum model (FCM). The FCM is a stochastic method that maps the permeability of discrete fractures onto a regular grid. The original method by McKenna and Reeves (2005) has been updated to provide capabilities that enhance representation of fractured rock. As reported in Hadgu et al. (2015), the method was first modified to include fully three-dimensional representations of anisotropic permeability, multiple independent fracture sets, arbitrary fracture dips and orientations, and spatial correlation. More recently the FCM has been extended to include three different methods. (1) The Sequential Gaussian Simulation (SGSIM) method uses spatial correlation to generate fractures and define their properties for the FCM. (2) The ELLIPSIM method randomly generates a specified number of ellipses with properties defined by probability distributions; each ellipse represents a single fracture. (3) Direct conversion of discrete fracture network (DFN) output. Test simulations were conducted to simulate flow and transport using the ELLIPSIM and direct DFN conversion methods. The simulations used a 1 km x 1 km x 1 km model domain and a structured grid with blocks of 10 m x 10 m x 10 m, resulting in a total of 10^6 grid blocks. Distributions of fracture parameters were used to generate a selected number of realizations. For each realization, the different methods were applied to generate representative permeability fields. The PFLOTRAN (Hammond et al., 2014) code was used to simulate flow and transport in the domain. Simulation results and analysis are presented. The results indicate that the FCM approach is a viable method to model fractured crystalline rocks. The FCM is a computationally efficient way to generate realistic representations of complex fracture systems. This approach is of interest for nuclear waste disposal models applied over large domains. SAND2016-7509 A
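
    A deliberately simplified 2-D illustration of the ELLIPSIM idea is sketched below: random ellipses stand in for fractures, and any grid cell whose centre falls inside an ellipse is assigned an enhanced permeability. The real FCM works in 3-D with anisotropic permeability tensors and site-derived fracture statistics, none of which this sketch attempts; all values are placeholders.

        # Simplified 2-D ELLIPSIM-style mapping of random "fracture" ellipses onto
        # a regular grid permeability field. Placeholder values throughout.
        import numpy as np

        rng = np.random.default_rng(42)
        n_cells, dx = 100, 10.0                  # 1 km x 1 km domain, 10 m cells
        k_matrix, k_fracture = 1.0e-18, 1.0e-14  # m^2, illustrative

        xc = (np.arange(n_cells) + 0.5) * dx
        X, Y = np.meshgrid(xc, xc, indexing="ij")
        perm = np.full((n_cells, n_cells), k_matrix)

        for _ in range(60):                      # 60 random fracture ellipses
            cx, cy = rng.uniform(0, n_cells * dx, 2)
            a, b = rng.uniform(50, 300), rng.uniform(2, 10)  # half-axes, m
            theta = rng.uniform(0, np.pi)
            xr = (X - cx) * np.cos(theta) + (Y - cy) * np.sin(theta)
            yr = -(X - cx) * np.sin(theta) + (Y - cy) * np.cos(theta)
            inside = (xr / a) ** 2 + (yr / b) ** 2 <= 1.0
            perm[inside] = k_fracture            # cells intersected by the ellipse

        print("fraction of cells mapped as fracture:", float(np.mean(perm == k_fracture)))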

  8. SEEK: a systems biology data and model management platform.

    PubMed

    Wolstencroft, Katherine; Owen, Stuart; Krebs, Olga; Nguyen, Quyen; Stanford, Natalie J; Golebiewski, Martin; Weidemann, Andreas; Bittkowski, Meik; An, Lihua; Shockley, David; Snoep, Jacky L; Mueller, Wolfgang; Goble, Carole

    2015-07-11

    Systems biology research typically involves the integration and analysis of heterogeneous data types in order to model and predict biological processes. Researchers therefore require tools and resources to facilitate the sharing and integration of data, and for linking of data to systems biology models. There are a large number of public repositories for storing biological data of a particular type, for example transcriptomics or proteomics, and there are several model repositories. However, this silo-type storage of data and models is not conducive to systems biology investigations. Interdependencies between multiple omics datasets and between datasets and models are essential. Researchers require an environment that will allow the management and sharing of heterogeneous data and models in the context of the experiments which created them. The SEEK is a suite of tools to support the management, sharing and exploration of data and models in systems biology. The SEEK platform provides an access-controlled, web-based environment for scientists to share and exchange data and models for day-to-day collaboration and for public dissemination. A plug-in architecture allows the linking of experiments, their protocols, data, models and results in a configurable system that is available 'off the shelf'. Tools to run model simulations, plot experimental data and assist with data annotation and standardisation combine to produce a collection of resources that support analysis as well as sharing. Underlying semantic web resources additionally extract and serve SEEK metadata in RDF (Resource Description Format). SEEK RDF enables rich semantic queries, both within SEEK and between related resources in the web of Linked Open Data. The SEEK platform has been adopted by many systems biology consortia across Europe. It is a data management environment that has a low barrier of uptake and provides rich resources for collaboration. This paper provides an update on the functions and features of the SEEK software, and describes the use of the SEEK in the SysMO consortium (Systems biology for Micro-organisms), and the VLN (virtual Liver Network), two large systems biology initiatives with different research aims and different scientific communities.

  9. 17 CFR 49.19 - Core principles applicable to registered swap data repositories.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... registered swap data repositories. 49.19 Section 49.19 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION (CONTINUED) SWAP DATA REPOSITORIES § 49.19 Core principles applicable to registered swap data repositories. (a) Compliance with core principles. To be registered, and maintain...

  10. 17 CFR 49.26 - Disclosure requirements of swap data repositories.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... TRADING COMMISSION SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data repositories... swap data repository shall furnish to the reporting entity a disclosure document that contains the... 17 Commodity and Securities Exchanges 1 2013-04-01 2013-04-01 false Disclosure requirements of...

  11. 17 CFR 49.26 - Disclosure requirements of swap data repositories.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... TRADING COMMISSION SWAP DATA REPOSITORIES § 49.26 Disclosure requirements of swap data repositories... swap data repository shall furnish to the reporting entity a disclosure document that contains the... 17 Commodity and Securities Exchanges 1 2012-04-01 2012-04-01 false Disclosure requirements of...

  12. 21 CFR 522.480 - Repository corticotropin injection.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 6 2010-04-01 2010-04-01 false Repository corticotropin injection. 522.480 Section 522.480 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES... § 522.480 Repository corticotropin injection. (a)(1) Specifications. The drug conforms to repository...

  13. 10 CFR 960.3-1-3 - Regionality.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Implementation Guidelines § 960.3-1-3 Regionality. In making site recommendations for repository development after the site for the first repository has been recommended, the Secretary shall give due... repositories. Such consideration shall take into account the proximity of sites to locations at which waste is...

  14. Persistent Identifiers for Field Expeditions: A Next Step for the US Oceanographic Research Fleet

    NASA Astrophysics Data System (ADS)

    Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen

    2016-04-01

    Oceanographic research cruises are complex affairs, typically requiring an extensive effort to secure the funding, plan the experiment, and mobilize the field party. Yet cruises are not typically published online as first-class digital objects with persistent, citable identifiers linked to the scientific literature. The Rolling Deck to Repository (R2R; info@rvdata.us) program maintains a master catalog of oceanographic cruises for the United States research fleet, currently documenting over 6,000 expeditions on 37 active and retired vessels. In 2015, R2R started routinely publishing a Digital Object Identifier (DOI) for each completed cruise. Cruise DOIs, in turn, are linked to related persistent identifiers where available including the Open Researcher and Contributor ID (ORCID) for members of the science party, the International Geo Sample Number (IGSN) for physical specimens collected during the cruise, the Open Funder Registry (FundRef) codes that supported the experiment, and additional DOIs for datasets, journal articles, and other products resulting from the cruise. Publishing a persistent identifier for each field expedition will facilitate interoperability between the many different repositories that hold research products from cruises; will provide credit to the investigators who secured the funding and carried out the experiment; and will facilitate the gathering of fleet-wide altmetrics that demonstrate the broad impact of oceanographic research.
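
    As a concrete illustration of the linking described above, the sketch below assembles a minimal metadata record for a hypothetical cruise DOI that points to ORCID, IGSN, funder and dataset identifiers. The field names are loosely modelled on the DataCite related-identifier concept but are placeholders, not the actual R2R or DataCite schema.

    # Illustrative sketch only: a hypothetical metadata record linking a cruise DOI
    # to related persistent identifiers. All field names and values are placeholders.
    cruise_record = {
        "doi": "10.7284/900000",                       # hypothetical cruise DOI
        "title": "R/V Example cruise EX-16-01",
        "related_identifiers": [
            {"type": "ORCID", "id": "0000-0002-1825-0097",  "relation": "IsCompiledBy"},
            {"type": "IGSN",  "id": "IEEXA000A",            "relation": "HasPart"},
            {"type": "DOI",   "id": "10.1234/dataset.5678", "relation": "IsSourceOf"},
        ],
        "funder_ids": ["10.13039/100000001"],          # Open Funder Registry ID (placeholder)
    }

    # Print the cross-links a downstream repository could harvest from this record.
    for rel in cruise_record["related_identifiers"]:
        print(f'{cruise_record["doi"]} --{rel["relation"]}--> {rel["type"]}:{rel["id"]}')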

  16. Validation Study for Crediting Chlorine in Criticality Analyses for US Spent Nuclear Fuel Disposition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sobes, Vladimir; Scaglione, John M.; Wagner, John C.

    2015-01-01

    Spent nuclear fuel (SNF) management practices in the United States rely on dry storage systems that include both canister- and cask-based systems. The United States Department of Energy Used Fuel Disposition Campaign is examining the feasibility of direct disposal of dual-purpose (storage and transportation) canisters (DPCs) in a geological repository. One of the major technical challenges for direct disposal is the ability to demonstrate the subcriticality of the DPCs loaded with SNF for the repository performance period (e.g., 10,000 years or more) as the DPCs may undergo degradation over time. Specifically, groundwater ingress into the DPC (i.e., flooding) could allow the system to achieve criticality in scenarios where the neutron absorber plates in the DPC basket have degraded. However, as was shown by Banerjee et al., some aqueous species in the groundwater provide noticeable reactivity reduction for these systems. For certain amounts of particular aqueous species (e.g., chlorine, lithium) in the groundwater, subcriticality can be demonstrated even for DPCs with complete degradation of the neutron absorber plates or a degraded fuel basket configuration. It has been demonstrated that chlorine is the leading impurity, as indicated by significant neutron absorption in the water that is available in reasonable quantities for the deep geological repository media under consideration. This paper presents the results of an investigation of the available integral experiments worldwide that could be used to validate DPC disposal criticality evaluations, including credit for chlorine. Due to the small number of applicable critical configurations, validation through traditional trending analysis was not possible. The bias in the eigenvalue of the application systems due only to the chlorine was calculated using TSURFER analysis and found to be on the order of 100 per cent mille (1 pcm = 10^-5 in k_eff). This study investigated the design of a series of critical configurations with varying amounts of chlorine to address validation gaps. Such integral experiments would support the crediting of the chlorine neutron-absorption properties in groundwater and the demonstration of subcriticality for DPCs in deep geologic repositories with sufficient chlorine availability.
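
    For readers unfamiliar with the unit, converting per cent mille to an absolute change in k_eff is a simple scaling; the short sketch below uses the definition quoted above and the order-of-magnitude bias reported in the abstract.

    # Convert a reactivity bias quoted in pcm into an absolute change in k_eff.
    # Definition from the abstract: 1 pcm = 10^-5 in k_eff.
    PCM_TO_KEFF = 1.0e-5

    bias_pcm = 100.0                     # chlorine-only bias, order of magnitude
    bias_keff = bias_pcm * PCM_TO_KEFF   # = 1.0e-3 in k_eff

    print(f"{bias_pcm:.0f} pcm corresponds to {bias_keff:.1e} in k_eff")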

  17. FY16 Summary Report: Participation in the KOSINA Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matteo, Edward N.; Hansen, Francis D.

    Salt formations represent a promising host for disposal of nuclear waste in the United States and Germany. Together, these countries provided fully developed safety cases for bedded salt and domal salt, respectively. Today, Germany and the United States find themselves in similar positions with respect to salt formations serving as repositories for heat-generating nuclear waste. German research centers are evaluating bedded and pillow salt formations to contrast with their previous safety case made for the Gorleben dome. Sandia National Laboratories is collaborating on this effort as an Associate Partner, and this report summarizes that teamwork. Sandia and German research groups have a long-standing cooperative approach to repository science, engineering, operations, safety assessment, testing, modeling and other elements comprising the basis for salt disposal. Germany and the United States hold annual bilateral workshops, which cover a spectrum of issues surrounding the viability of salt formations. Notably, recent efforts include development of a database for features, events, and processes applying broadly and generically to bedded and domal salt. Another international teaming activity evaluates salt constitutive models, including hundreds of new experiments conducted on bedded salt from the Waste Isolation Pilot Plant. These extensive collaborations continue to build the scientific basis for salt disposal. Repository deliberations in the United States are revisiting bedded and domal salt for housing a nuclear waste repository. By agreeing to collaborate with German peers, our nation stands to benefit by assurance of scientific position, exchange of operational concepts, and approach to elements of the safety case, all reflecting cost and time efficiency.

  18. Industrial Program of Waste Management - Cigeo Project - 13033

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butez, Marc; Bartagnon, Olivier; Gagner, Laurent

    2013-07-01

    The French Planning Act of 28 June 2006 prescribed that a reversible repository in a deep geological formation be chosen as the reference solution for the long-term management of high-level and intermediate-level long-lived radioactive waste. It also entrusted the responsibility of further studies and design of the repository (named Cigeo) upon the French Radioactive Waste Management Agency (Andra), in order for the review of the creation-license application to start in 2015 and, subject to its approval, the commissioning of the repository to take place in 2025. Andra is responsible for siting, designing, implementing, operating the future geological repository, including operational and long term safety and waste acceptance. Nuclear operators (Electricite de France (EDF), AREVA NC, and the French Commission in charge of Atomic Energy and Alternative Energies (CEA)) are technically and financially responsible for the waste they generate, with no limit in time. They provide Andra, on one hand, with waste packages related input data, and on the other hand with their long term industrial experiences of high and intermediate-level long-lived radwaste management and nuclear operation. Andra, EDF, AREVA and CEA established a cooperation agreement for strengthening their collaborations in these fields. Within this agreement Andra and the nuclear operators have defined an industrial program for waste management. This program includes the waste inventory to be taken into account for the design of the Cigeo project and the structural hypothesis underlying its phased development. It schedules the delivery of the different categories of waste and defines associated flows. (authors)

  19. Forecasting Maintenance Shortcomings of a Planned Equipment Density Listing in Support of Expeditionary Missions

    DTIC Science & Technology

    2017-06-01

    importantly, it examines the methodology used to build the class IX block embarked on ship prior to deployment. The class IX block is defined as a repository...compared to historical data to evaluate model and simulation outputs. This thesis provides recommendations on improving the methodology implemented in...improving the level of organic support available to deployed units. More importantly, it examines the methodology used to build the class IX block

  20. Clean and Secure Energy from Domestic Oil Shale and Oil Sands Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spinti, Jennifer; Birgenheier, Lauren; Deo, Milind

    This report summarizes the significant findings from the Clean and Secure Energy from Domestic Oil Shale and Oil Sands Resources program sponsored by the Department of Energy through the National Energy Technology Laboratory. There were four principal areas of research: Environmental, legal, and policy issues related to development of oil shale and oil sands resources; Economic and environmental assessment of a domestic unconventional fuels industry; Basin-scale assessment of conventional and unconventional fuel development impacts; and Liquid fuel production by in situ thermal processing of oil shale. Multiple research projects were conducted in each area and the results have been communicated via sponsored conferences, conference presentations, invited talks, interviews with the media, numerous topical reports, journal publications, and a book that summarizes much of the oil shale research relating to Utah’s Uinta Basin. In addition, a repository of materials related to oil shale and oil sands has been created within the University of Utah’s Institutional Repository, including the materials generated during this research program. Below is a listing of all topical and progress reports generated by this project and submitted to the Office of Scientific and Technical Information (OSTI). A listing of all peer-reviewed publications generated as a result of this project is included at the end of this report. The topical and progress reports are: Geomechanical and Fluid Transport Properties 1 (December, 2015); Validation Results for Core-Scale Oil Shale Pyrolysis (February, 2015); Rates and Mechanisms of Oil Shale Pyrolysis: A Chemical Structure Approach (November, 2014); Policy Issues Associated With Using Simulation to Assess Environmental Impacts (November, 2014); Policy Analysis of the Canadian Oil Sands Experience (September, 2013); V-UQ of Generation 1 Simulator with AMSO Experimental Data (August, 2013); Lands with Wilderness Characteristics, Resource Management Plan Constraints, and Land Exchanges (March, 2012); Conjunctive Surface and Groundwater Management in Utah: Implications for Oil Shale and Oil Sands Development (May, 2012); Development of CFD-Based Simulation Tools for In Situ Thermal Processing of Oil Shale/Sands (February, 2012); Core-Based Integrated Sedimentologic, Stratigraphic, and Geochemical Analysis of the Oil Shale Bearing Green River Formation, Uinta Basin, Utah (April, 2011); Atomistic Modeling of Oil Shale Kerogens and Asphaltenes Along with their Interactions with the Inorganic Mineral Matrix (April, 2011); Pore Scale Analysis of Oil Shale/Sands Pyrolysis (March, 2011); Land and Resource Management Issues Relevant to Deploying In-Situ Thermal Technologies (January, 2011); Policy Analysis of Produced Water Issues Associated with In-Situ Thermal Technologies (January, 2011); and Policy Analysis of Water Availability and Use Issues for Domestic Oil Shale and Oil Sands Development (March, 2010).

  1. Diagnosing Parkinson's Diseases Using Fuzzy Neural System

    PubMed Central

    Abiyev, Rahib H.; Abizade, Sanan

    2016-01-01

    This study presents the design of a recognition system that discriminates between healthy people and people with Parkinson's disease. Diagnosis of Parkinson's disease is performed using a fusion of fuzzy systems and neural networks. The structure and learning algorithms of the proposed fuzzy neural system (FNS) are presented. The approach described in this paper enhances the capability of the designed system to efficiently distinguish healthy individuals from patients. This was demonstrated through simulations performed using data obtained from the UCI Machine Learning Repository. A comparative study was carried out, and the simulation results demonstrated that the proposed fuzzy neural system improves the recognition rate of the designed system. PMID:26881009
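
    The abstract does not include the authors' FNS implementation; as a point of reference only, the sketch below trains a conventional baseline classifier on the same UCI Parkinson's dataset. The dataset URL and the column names ('name', 'status') are assumptions based on the publicly documented UCI layout, not taken from the paper.

    # Baseline sketch (not the authors' fuzzy neural system): classify the UCI
    # Parkinson's dataset with a standard scikit-learn pipeline.
    # The URL and column names are assumptions about the public UCI layout.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import accuracy_score

    URL = "https://archive.ics.uci.edu/ml/machine-learning-databases/parkinsons/parkinsons.data"
    df = pd.read_csv(URL)

    X = df.drop(columns=["name", "status"])   # voice-measurement features
    y = df["status"]                          # 1 = Parkinson's, 0 = healthy

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    model = make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0))
    model.fit(X_train, y_train)
    print("Recognition rate:", accuracy_score(y_test, model.predict(X_test)))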

  2. 17 CFR 49.19 - Core principles applicable to registered swap data repositories.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... registered swap data repositories. 49.19 Section 49.19 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP DATA REPOSITORIES § 49.19 Core principles applicable to registered swap data repositories. (a) Compliance with core principles. To be registered, and maintain registration, a swap data...

  3. 17 CFR 49.19 - Core principles applicable to registered swap data repositories.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... registered swap data repositories. 49.19 Section 49.19 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP DATA REPOSITORIES § 49.19 Core principles applicable to registered swap data repositories. (a) Compliance with Core Principles. To be registered, and maintain registration, a swap data...

  4. 17 CFR 49.22 - Chief compliance officer.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... that the registered swap data repository provide fair and open access as set forth in § 49.27 of this...) SWAP DATA REPOSITORIES § 49.22 Chief compliance officer. (a) Definition of Board of Directors. For... data repository, or for those swap data repositories whose organizational structure does not include a...

  5. 75 FR 8701 - Notice of Settlement Agreement Pertaining to Construction of a Waste Repository on the Settlors...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-25

    ... Construction of a Waste Repository on the Settlors' Property Pursuant to the Comprehensive Environmental... a Settlement Agreement pertaining to Construction of a Waste Repository on Settlor's Property... waste repository on the property by resolving, liability the settling party might otherwise incur under...

  6. 10 CFR 51.67 - Environmental information concerning geologic repositories.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Environmental information concerning geologic repositories... information concerning geologic repositories. (a) In lieu of an environmental report, the Department of Energy... connection with any geologic repository developed under Subtitle A of Title I, or under Title IV, of the...

  7. 15 CFR 1180.10 - NTIS permanent repository.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false NTIS permanent repository. 1180.10... ENGINEERING INFORMATION TO THE NATIONAL TECHNICAL INFORMATION SERVICE § 1180.10 NTIS permanent repository. A... repository as a service to agencies unless the Director advises the Liaison Officer that it has not been so...

  8. 76 FR 14028 - Center for Devices and Radiological Health 510(k) Implementation: Online Repository of Medical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ...] Center for Devices and Radiological Health 510(k) Implementation: Online Repository of Medical Device... public meeting entitled ``510(k) Implementation: Discussion of an Online Repository of Medical Device... establish an online public repository of medical device labeling and strategies for displaying device...

  9. Identifying Tensions in the Use of Open Licenses in OER Repositories

    ERIC Educational Resources Information Center

    Amiel, Tel; Soares, Tiago Chagas

    2016-01-01

    We present an analysis of 50 repositories for educational content conducted through an "audit system" that helped us classify these repositories, their software systems, promoters, and how they communicated their licensing practices. We randomly accessed five resources from each repository to investigate the alignment of licensing…

  10. 10 CFR 63.113 - Performance objectives for the geologic repository after permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Performance objectives for the geologic repository after...-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Technical Criteria Postclosure Performance Objectives § 63.113 Performance objectives for the geologic repository after permanent...

  11. 10 CFR 63.111 - Performance objectives for the geologic repository operations area through permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Performance objectives for the geologic repository... (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA... repository operations area through permanent closure. (a) Protection against radiation exposures and releases...

  12. 10 CFR 960.3-1 - Siting provisions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Implementation Guidelines § 960.3-1 Siting provisions. The siting provisions establish the... repositories. As required by the Act, § 960.3-1-3 specifies consideration of a regional distribution of repositories after recommendation of a site for development of the first repository. Section 960.3-1-4...

  13. 10 CFR 60.112 - Overall system performance objective for the geologic repository after permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... repository after permanent closure. 60.112 Section 60.112 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Performance Objectives § 60.112 Overall system performance objective for the geologic repository after permanent closure...

  14. 10 CFR 60.132 - Additional design criteria for surface facilities in the geologic repository operations area.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... geologic repository operations area. 60.132 Section 60.132 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository Operations Area § 60.132 Additional design criteria for surface facilities in...

  15. 10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Performance of the geologic repository operations area... OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Technical Criteria Performance Objectives § 60.111 Performance of the geologic repository operations area through permanent closure. (a...

  16. Institutional Repositories as Infrastructures for Long-Term Preservation

    ERIC Educational Resources Information Center

    Francke, Helena; Gamalielsson, Jonas; Lundell, Björn

    2017-01-01

    Introduction: The study describes the conditions for long-term preservation of the content of the institutional repositories of Swedish higher education institutions based on an investigation of how deposited files are managed with regards to file format and how representatives of the repositories describe the functions of the repositories.…

  17. Personal Name Identification in the Practice of Digital Repositories

    ERIC Educational Resources Information Center

    Xia, Jingfeng

    2006-01-01

    Purpose: To propose improvements to the identification of authors' names in digital repositories. Design/methodology/approach: Analysis of current name authorities in digital resources, particularly in digital repositories, and analysis of some features of existing repository applications. Findings: This paper finds that the variations of authors'…

  18. WholeCellSimDB: a hybrid relational/HDF database for whole-cell model predictions

    PubMed Central

    Karr, Jonathan R.; Phillips, Nolan C.; Covert, Markus W.

    2014-01-01

    Mechanistic ‘whole-cell’ models are needed to develop a complete understanding of cell physiology. However, extracting biological insights from whole-cell models requires running and analyzing large numbers of simulations. We developed WholeCellSimDB, a database for organizing whole-cell simulations. WholeCellSimDB was designed to enable researchers to search simulation metadata to identify simulations for further analysis, and quickly slice and aggregate simulation results data. In addition, WholeCellSimDB enables users to share simulations with the broader research community. The database uses a hybrid relational/hierarchical data format architecture to efficiently store and retrieve both simulation setup metadata and results data. WholeCellSimDB provides a graphical Web-based interface to search, browse, plot and export simulations; a JavaScript Object Notation (JSON) Web service to retrieve data for Web-based visualizations; a command-line interface to deposit simulations; and a Python API to retrieve data for advanced analysis. Overall, we believe WholeCellSimDB will help researchers use whole-cell models to advance basic biological science and bioengineering. Database URL: http://www.wholecellsimdb.org Source code repository URL: http://github.com/CovertLab/WholeCellSimDB PMID:25231498
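
    The abstract mentions a JSON web service and a Python API for retrieving simulation results; the snippet below is a minimal sketch of how such a service might be queried with the standard requests library. The endpoint path, parameters and response fields are hypothetical placeholders, not the documented WholeCellSimDB API.

    # Hypothetical sketch of querying a simulation database's JSON web service.
    # Endpoint and response structure are placeholders, not WholeCellSimDB's API.
    import requests

    BASE = "http://www.wholecellsimdb.org"           # site named in the abstract
    params = {"organism": "Mycoplasma genitalium",   # example search criteria
              "batch": 1}

    resp = requests.get(f"{BASE}/api/simulations", params=params, timeout=30)
    resp.raise_for_status()

    for sim in resp.json():                          # assumed: a list of metadata records
        print(sim.get("id"), sim.get("length_s"), sim.get("growth_rate"))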

  19. System and method for responding to ground and flight system malfunctions

    NASA Technical Reports Server (NTRS)

    Anderson, Julie J. (Inventor); Fussell, Ronald M. (Inventor)

    2010-01-01

    A system for on-board anomaly resolution for a vehicle has a data repository. The data repository stores data related to different systems, subsystems, and components of the vehicle. The data stored is encoded in a tree-based structure. A query engine is coupled to the data repository. The query engine provides a user and automated interface and provides contextual query to the data repository. An inference engine is coupled to the query engine. The inference engine compares current anomaly data to contextual data stored in the data repository using inference rules. The inference engine generates a potential solution to the current anomaly by referencing the data stored in the data repository.
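
    To make the architecture concrete, here is a small sketch, under assumed data and rule formats, of a tree-structured repository queried by an inference step that matches anomaly data against stored context. It illustrates the pattern described in the record, not the patented system itself.

    # Illustrative sketch of the described pattern: a tree-based data repository,
    # a contextual query step, and simple rule-based inference. All names,
    # rules and values are hypothetical.
    SYSTEM_TREE = {
        "propulsion": {
            "fuel_pump": {"nominal_pressure_kpa": (300, 400)},
            "turbine":   {"nominal_rpm": (9000, 11000)},
        },
        "power": {
            "battery":   {"nominal_voltage_v": (26, 30)},
        },
    }

    RULES = [
        # (component, parameter, suggested action)
        ("fuel_pump", "nominal_pressure_kpa", "Switch to backup pump; inspect inlet filter."),
        ("battery",   "nominal_voltage_v",    "Shed non-critical loads; check charger output."),
    ]

    def query(tree, component):
        """Contextual query: walk the tree and return the node for a component."""
        for subsystem, components in tree.items():
            if component in components:
                return subsystem, components[component]
        return None, None

    def infer(component, parameter, value):
        """Compare current anomaly data against stored context and apply rules."""
        subsystem, context = query(SYSTEM_TREE, component)
        if context is None or parameter not in context:
            return "No stored context; escalate to ground support."
        low, high = context[parameter]
        if low <= value <= high:
            return "Value within nominal range; continue monitoring."
        for comp, param, action in RULES:
            if comp == component and param == parameter:
                return f"Anomaly in {subsystem}/{component}: {action}"
        return "Anomaly detected but no matching rule; escalate."

    print(infer("fuel_pump", "nominal_pressure_kpa", 220))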

  20. 17 CFR 49.9 - Duties of registered swap data repositories.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... privacy of any and all swap data and any other related information that the swap data repository receives... 17 Commodity and Securities Exchanges 2 2014-04-01 2014-04-01 false Duties of registered swap data... (CONTINUED) SWAP DATA REPOSITORIES § 49.9 Duties of registered swap data repositories. (a) Duties. To be...

  1. Availability and Accessibility in an Open Access Institutional Repository: A Case Study

    ERIC Educational Resources Information Center

    Lee, Jongwook; Burnett, Gary; Vandegrift, Micah; Baeg, Jung Hoon; Morris, Richard

    2015-01-01

    Introduction: This study explores the extent to which an institutional repository makes papers available and accessible on the open Web by using 170 journal articles housed in DigiNole Commons, the institutional repository at Florida State University. Method: To analyse the repository's impact on availability and accessibility, we conducted…

  2. The Use of Digital Repositories for Enhancing Teacher Pedagogical Performance

    ERIC Educational Resources Information Center

    Cohen, Anat; Kalimi, Sharon; Nachmias, Rafi

    2013-01-01

    This research examines the usage of local learning material repositories at school, as well as related teachers' attitudes and training. The study investigates the use of these repositories for enhancing teacher performance and assesses whether the assimilation of the local repositories increases their usage of and contribution to by teachers. One…

  3. Institutional Repositories in Indian Universities and Research Institutes: A Study

    ERIC Educational Resources Information Center

    Krishnamurthy, M.; Kemparaju, T. D.

    2011-01-01

    Purpose: The purpose of this paper is to report on a study of the institutional repositories (IRs) in use in Indian universities and research institutes. Design/methodology/approach: Repositories in various institutions in India were accessed and described in a standardised way. Findings: The 20 repositories studied covered collections of diverse…

  4. 10 CFR 63.161 - Emergency plan for the geologic repository operations area through permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Emergency plan for the geologic repository operations area... OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Emergency Planning Criteria § 63.161 Emergency plan for the geologic repository operations area through permanent...

  5. 77 FR 26709 - Swap Data Repositories: Interpretative Statement Regarding the Confidentiality and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ... COMMODITY FUTURES TRADING COMMISSION 17 CFR Part 49 RIN 3038-AD83 Swap Data Repositories... data repositories (``SDRs''). SDRs are new registered entities created by section 728 of the Dodd-Frank... Act amends section 1a of the CEA to add a definition of the term ``swap data repository.'' Pursuant to...

  6. Online Paper Repositories and the Role of Scholarly Societies: An AERA Conference Report

    ERIC Educational Resources Information Center

    Educational Researcher, 2010

    2010-01-01

    This article examines issues faced by scholarly societies that are developing and sustaining online paper repositories. It is based on the AERA Conference on Online Paper Repositories, which focused on fundamental issues of policy and procedure important to the operations of online working paper repositories. The report and recommendations address…

  7. 10 CFR 960.3-2-2 - Nomination of sites as suitable for characterization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-2 Nomination of... of each repository site. For the second repository, at least three of the sites shall not have been nominated previously. Any site nominated as suitable for characterization for the first repository, but not...

  8. Repositories for Research: Southampton's Evolving Role in the Knowledge Cycle

    ERIC Educational Resources Information Center

    Simpson, Pauline; Hey, Jessie

    2006-01-01

    Purpose: To provide an overview of how open access (OA) repositories have grown to take a premier place in the e-research knowledge cycle and offer Southampton's route from project to sustainable institutional repository. Design/methodology/approach: The evolution of institutional repositories and OA is outlined raising questions of multiplicity…

  9. Development of the performance confirmation program at Yucca Mountain, Nevada

    USGS Publications Warehouse

    LeCain, G.D.; Barr, D.; Weaver, D.; Snell, R.; Goodin, S.W.; Hansen, F.D.

    2006-01-01

    The Yucca Mountain Performance Confirmation program consists of tests, monitoring activities, experiments, and analyses to evaluate the adequacy of assumptions, data, and analyses that form the basis of the conceptual and numerical models of flow and transport associated with a proposed radioactive waste repository at Yucca Mountain, Nevada. The Performance Confirmation program uses an eight-stage risk-informed, performance-based approach. Selection of the Performance Confirmation activities for inclusion in the Performance Confirmation program was done using a risk-informed performance-based decision analysis. The result of this analysis was a Performance Confirmation base portfolio that consists of 20 activities. The 20 Performance Confirmation activities include geologic, hydrologic, and construction/engineering testing. Some of the activities began during site characterization, and others will begin during construction, or post emplacement, and continue until repository closure.

  10. Facilitating Cohort Discovery by Enhancing Ontology Exploration, Query Management and Query Sharing for Large Clinical Data Repositories.

    PubMed

    Tao, Shiqiang; Cui, Licong; Wu, Xi; Zhang, Guo-Qiang

    2017-01-01

    To help researchers better access clinical data, we developed a prototype query engine called DataSphere for exploring large-scale integrated clinical data repositories. DataSphere expedites data importing using a NoSQL data management system and dynamically renders its user interface for concept-based querying tasks. DataSphere provides an interactive query-building interface together with query translation and optimization strategies, which enable users to build and execute queries effectively and efficiently. We successfully loaded a dataset of one million patients for University of Kentucky (UK) Healthcare into DataSphere with more than 300 million clinical data records. We evaluated DataSphere by comparing it with an instance of i2b2 deployed at UK Healthcare, demonstrating that DataSphere provides enhanced user experience for both query building and execution.

  11. Facilitating Cohort Discovery by Enhancing Ontology Exploration, Query Management and Query Sharing for Large Clinical Data Repositories

    PubMed Central

    Tao, Shiqiang; Cui, Licong; Wu, Xi; Zhang, Guo-Qiang

    2017-01-01

    To help researchers better access clinical data, we developed a prototype query engine called DataSphere for exploring large-scale integrated clinical data repositories. DataSphere expedites data importing using a NoSQL data management system and dynamically renders its user interface for concept-based querying tasks. DataSphere provides an interactive query-building interface together with query translation and optimization strategies, which enable users to build and execute queries effectively and efficiently. We successfully loaded a dataset of one million patients for University of Kentucky (UK) Healthcare into DataSphere with more than 300 million clinical data records. We evaluated DataSphere by comparing it with an instance of i2b2 deployed at UK Healthcare, demonstrating that DataSphere provides enhanced user experience for both query building and execution. PMID:29854239

  12. InvestigationOrganizer: The Development and Testing of a Web-based Tool to Support Mishap Investigations

    NASA Technical Reports Server (NTRS)

    Carvalho, Robert F.; Williams, James; Keller, Richard; Sturken, Ian; Panontin, Tina

    2004-01-01

    InvestigationOrganizer (IO) is a collaborative web-based system designed to support the conduct of mishap investigations. IO provides a common repository for a wide range of mishap related information, and allows investigators to make explicit, shared, and meaningful links between evidence, causal models, findings and recommendations. It integrates the functionality of a database, a common document repository, a semantic knowledge network, a rule-based inference engine, and causal modeling and visualization. Thus far, IO has been used to support four mishap investigations within NASA, ranging from a small property damage case to the loss of the Space Shuttle Columbia. This paper describes how the functionality of IO supports mishap investigations and the lessons learned from the experience of supporting two of the NASA mishap investigations: the Columbia Accident Investigation and the CONTOUR Loss Investigation.

  13. Behavioral and Physiological Neural Network Analyses: A Common Pathway toward Pattern Recognition and Prediction

    ERIC Educational Resources Information Center

    Ninness, Chris; Lauter, Judy L.; Coffee, Michael; Clary, Logan; Kelly, Elizabeth; Rumph, Marilyn; Rumph, Robin; Kyle, Betty; Ninness, Sharon K.

    2012-01-01

    Using 3 diversified datasets, we explored the pattern-recognition ability of the Self-Organizing Map (SOM) artificial neural network as applied to diversified nonlinear data distributions in the areas of behavioral and physiological research. Experiment 1 employed a dataset obtained from the UCI Machine Learning Repository. Data for this study…
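
    Because the record only names the technique, a compact sketch of a Self-Organizing Map trained on arbitrary data may help; this is a from-scratch illustration with assumed grid size, learning rate and decay schedule, not the software used in the study.

    # Minimal from-scratch Self-Organizing Map (SOM) sketch in NumPy.
    # Grid size, learning rate and decay schedule are illustrative choices.
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(size=(500, 4))          # stand-in for a UCI-style dataset

    grid_w, grid_h, dim = 8, 8, data.shape[1]
    weights = rng.normal(size=(grid_w, grid_h, dim))
    coords = np.stack(np.meshgrid(np.arange(grid_w), np.arange(grid_h), indexing="ij"), axis=-1)

    n_iter, lr0, sigma0 = 2000, 0.5, 3.0
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        # Best-matching unit: the node whose weight vector is closest to the sample.
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # Exponentially decaying learning rate and neighbourhood radius.
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        # A Gaussian neighbourhood around the BMU pulls nearby nodes toward the sample.
        grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
        h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))[..., None]
        weights += lr * h * (x - weights)

    # Map each sample to its BMU to inspect the learned clustering.
    bmus = [np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=-1)),
                             (grid_w, grid_h)) for x in data]
    print("First five BMU assignments:", bmus[:5])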

  14. Win–win data sharing in neuroscience

    PubMed Central

    Ascoli, Giorgio A; Maraver, Patricia; Nanda, Sumit; Polavaram, Sridevi; Armañanzas, Rubén

    2017-01-01

    Most neuroscientists have yet to embrace a culture of data sharing. Using our decade-long experience at NeuroMorpho.Org as an example, we discuss how publicly available repositories may benefit data producers and end-users alike. We outline practical recipes for resource developers to maximize the research impact of data sharing platforms for both contributors and users. PMID:28139675

  15. Lift Ev'ry Voice: The Resounding Experiences of Black Male Student-Athletes at a California Community College

    ERIC Educational Resources Information Center

    McClellan, Michael L.

    2013-01-01

    Black male student-athletes are entering the California community college (CCC) system at an unprecedented rate. CCCs have become a repository for Black males that have aspirations of competing in the National Collegiate Athletic Association (NCAA) Division I member institutions. This historically disenfranchised subgroup of students is required…

  16. Interoperability Across the Stewardship Spectrum in the DataONE Repository Federation

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Vieglais, D.; Wilson, B. E.

    2016-12-01

    Thousands of earth and environmental science repositories serve many researchers and communities, each with their own community and legal mandates, sustainability models, and historical infrastructure. These repositories span the stewardship spectrum from highly curated collections that employ large numbers of staff members to review and improve data, to small, minimal budget repositories that accept data caveat emptor and where all responsibility for quality lies with the submitter. Each repository fills a niche, providing services that meet the stewardship tradeoffs of one or more communities. We have reviewed these stewardship tradeoffs for several DataONE member repositories ranging from minimally curated (KNB) to highly curated (Arctic Data Center), as well as from general purpose (Dryad) to highly discipline- or project-specific (NEON). The rationale behind different levels of stewardship reflects the resolution of these tradeoffs. Some repositories aim to encourage extensive uptake by keeping processes simple and minimizing the amount of information collected, but this limits the long-term utility of the data and the search, discovery, and integration systems that are possible. Other repositories require extensive metadata input, review, and assessment, allowing for excellent preservation, discovery, and integration but at the cost of significant time for submitters and expense for curatorial staff. DataONE recognizes these different levels of curation, and attempts to embrace them to create a federation that is useful across the stewardship spectrum. DataONE provides a tiered model for repositories with growing utility of DataONE services at higher tiers of curation. The lowest tier supports read-only access to data and requires little more than title and contact metadata. Repositories can gradually phase in support for higher levels of metadata and services as needed. These tiered capabilities are possible through flexible support for multiple metadata standards and services, where repositories can incrementally increase their requirements as they want to satisfy more use cases. Within DataONE, metadata search services support minimal metadata models, but significantly expanded precision and recall become possible when repositories provide more extensively curated metadata.

  17. Optimizing Resources for Trustworthiness and Scientific Impact of Domain Repositories

    NASA Astrophysics Data System (ADS)

    Lehnert, K.

    2017-12-01

    Domain repositories, i.e. data archives tied to specific scientific communities, are widely recognized and trusted by their user communities for ensuring a high level of data quality, enhancing data value, access, and reuse through a unique combination of disciplinary and digital curation expertise. Their data services are guided by the practices and values of the specific community they serve and designed to support the advancement of their science. Domain repositories need to meet user expectations for scientific utility in order to be successful, but they also need to fulfill the requirements for trustworthy repository services to be acknowledged by scientists, funders, and publishers as a reliable facility that curates and preserves data following international standards. Domain repositories therefore need to carefully plan and balance investments to optimize the scientific impact of their data services and user satisfaction on the one hand, while maintaining a reliable and robust operation of the repository infrastructure on the other hand. Staying abreast of evolving repository standards to certify as a trustworthy repository and conducting a regular self-assessment and certification alone requires resources that compete with the demands for improving data holdings or usability of systems. The Interdisciplinary Earth Data Alliance (IEDA), a data facility funded by the US National Science Foundation, operates repositories for geochemical, marine Geoscience, and Antarctic research data, while also maintaining data products (global syntheses) and data visualization and analysis tools that are of high value for the science community and have demonstrated considerable scientific impact. Balancing the investments in the growth and utility of the syntheses with resources required for certification of IEDA's repository services has been challenging, and a major self-assessment effort has been difficult to accommodate. IEDA is exploring a partnership model to share generic repository functions (e.g. metadata registration, long-term archiving) with other repositories. This could substantially reduce the effort of certification and allow effort to focus on the domain-specific data curation and value-added services.

  18. Building Scientific Data's list of recommended data repositories

    NASA Astrophysics Data System (ADS)

    Hufton, A. L.; Khodiyar, V.; Hrynaszkiewicz, I.

    2016-12-01

    When Scientific Data launched in 2014 we provided our authors with a list of recommended data repositories to help them identify data hosting options that were likely to meet the journal's requirements. This list has grown in size and scope, and is now a central resource for authors across the Nature-titled journals. It has also been used in the development of data deposition policies and recommended repository lists across Springer Nature and at other publishers. Each new addition to the list is assessed according to a series of criteria that emphasize the stability of the resource, its commitment to principles of open science and its implementation of relevant community standards and reporting guidelines. A preference is expressed for repositories that issue digital object identifiers (DOIs) through the DataCite system and that share data under the Creative Commons CC0 waiver. Scientific Data currently lists fourteen repositories that focus on specific areas within the Earth and environmental sciences, as well as the broad scope repositories, Dryad and figshare. Readers can browse and filter datasets published at the journal by the host repository using ISA-explorer, a demo tool built by the ISA-tools team at Oxford University [1]. We believe that well-maintained lists like this one help publishers build a network of trust with community data repositories and provide an important complement to more comprehensive data repository indices and more formal certification efforts. In parallel, Scientific Data has also improved its policies to better support submissions from authors using institutional and project-specific repositories, without requiring each to apply for listing individually.
    Online resources:
    Journal homepage: http://www.nature.com/scientificdata
    Data repository criteria: http://www.nature.com/sdata/policies/data-policies#repo-criteria
    Recommended data repositories: http://www.nature.com/sdata/policies/repositories
    Archived copies of the list: https://dx.doi.org/10.6084/m9.figshare.1434640.v6
    Reference: [1] Gonzalez-Beltran, A. ISA-explorer: A demo tool for discovering and exploring Scientific Data's ISA-tab metadata. Scientific Data Updates http://blogs.nature.com/scientificdata/2015/12/17/isa-explorer/ (2015).

  19. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years with the highest-fidelity models reaching down to 1.5 km global resolutions. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS) that combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of data stored include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop Distributed File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.
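
    For orientation, the snippet below expresses the classic Wordcount benchmark mentioned above in PySpark; the input path is a placeholder and this is an illustration of the MapReduce style of analytics, not the NCCS spatiotemporal indexing or production code.

    # Classic Wordcount in PySpark, the MapReduce-style benchmark named above.
    # The input path is a placeholder; this is not NCCS/DASS production code.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("wordcount-benchmark").getOrCreate()
    sc = spark.sparkContext

    lines = sc.textFile("hdfs:///benchmarks/sample_text/*.txt")   # placeholder path
    counts = (lines.flatMap(lambda line: line.split())
                   .map(lambda word: (word, 1))
                   .reduceByKey(lambda a, b: a + b))

    for word, n in counts.takeOrdered(10, key=lambda kv: -kv[1]):
        print(word, n)

    spark.stop()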

  20. Actinide Sorption in a Brine/Dolomite Rock System: Evaluating the Degree of Conservatism in Kd Ranges used in Performance Assessment Modeling for the WIPP Nuclear Waste Repository

    NASA Astrophysics Data System (ADS)

    Dittrich, T. M.; Reed, D. T.

    2015-12-01

    The Waste Isolation Pilot Plant (WIPP) near Carlsbad, NM is the only operating nuclear waste repository in the US and has been accepting transuranic (TRU) waste since 1999. The WIPP is located in a salt deposit approximately 650 m below the surface and performance assessment (PA) modeling for a 10,000 year period is required to recertify the operating license with the US EPA every five years. The main pathway of concern for environmental release of radioactivity is a human intrusion caused by drilling into a pressurized brine reservoir below the repository. This could result in the flooding of the repository and subsequent transport in the high transmissivity layer (dolomite-rich Culebra formation) above the waste disposal rooms. We evaluate the degree of conservatism in the estimated sorption partition coefficient (Kd) ranges used in the PA based on an approach developed with granite rock and actinides (Dittrich and Reimus, 2015; Dittrich et al., 2015). Sorption onto the waste storage material (Fe drums) may also play a role in mobile actinide concentrations. We will present (1) a conceptual overview of how Kds are used in the PA model, (2) technical background of the evolution of the ranges and (3) results from batch and column experiments and model predictions for Kds with WIPP dolomite and clays, brine with various actinides, and ligands (e.g., acetate, citrate, EDTA) that could promote transport. The current Kd ranges used in performance models are based on oxidation state and are 5-400, 0.5-10,000, 0.03-200, and 0.03-20 mL g⁻¹ for elements with oxidation states of III, IV, V, and VI, respectively. Based on redox conditions predicted in the brines, possible actinide species include Pu(III), Pu(IV), U(IV), U(VI), Np(IV), Np(V), Am(III), and Th(IV). We will also discuss the challenges of upscaling from lab experiments to field scale predictions, the role of colloids, and the effect of engineered barrier materials (e.g., MgO) on transport conditions.
    References:
    Dittrich, T.M., Reimus, P.W. 2015. Uranium transport in a crushed granodiorite: experiments and reactive transport modeling. J Contam Hydrol 175-176: 44-59.
    Dittrich, T.M., Boukhalfa, H., Ware, S.D., Reimus, P.W. 2015. Laboratory investigation of the role of desorption kinetics on americium transport associated with bentonite colloids. J Environ Radioactiv 148: 170-182.
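
    To show how Kd values of this magnitude translate into transport behavior, the short sketch below computes the standard linear-sorption retardation factor R = 1 + (rho_b/theta)·Kd for the quoted range endpoints. The bulk density and porosity are assumed, generic dolomite-like numbers, not WIPP PA parameters.

    # Retardation factor for linear sorption: R = 1 + (rho_b / theta) * Kd.
    # rho_b and theta below are generic, assumed values (not WIPP PA parameters).
    rho_b = 2.5    # bulk density, g/mL (assumed for a dolomite-rich rock)
    theta = 0.15   # porosity, dimensionless (assumed)

    # Kd ranges (mL/g) by actinide oxidation state, as quoted in the abstract.
    kd_ranges = {"III": (5, 400), "IV": (0.5, 10000), "V": (0.03, 200), "VI": (0.03, 20)}

    for state, (kd_lo, kd_hi) in kd_ranges.items():
        r_lo = 1 + (rho_b / theta) * kd_lo
        r_hi = 1 + (rho_b / theta) * kd_hi
        print(f"An({state}): R ranges from about {r_lo:.1f} to {r_hi:.0f}")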

  1. Selecting materialized views using random algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Lijuan; Hao, Zhongxiao; Liu, Chi

    2007-04-01

    A data warehouse is a repository of information collected from multiple, possibly heterogeneous, autonomous distributed databases. The information stored in the data warehouse takes the form of views referred to as materialized views. The selection of materialized views is one of the most important decisions in designing a data warehouse, since materialized views are stored for the purpose of efficiently answering on-line analytical processing queries. The first issue for the user to consider is query response time. In this paper, we therefore develop algorithms that select a set of views to materialize in the data warehouse so as to minimize the total view maintenance cost under the constraint of a given query response time; we call this the query_cost view_selection problem. First, a cost graph and a cost model for the query_cost view_selection problem are presented. Second, methods for selecting materialized views using randomized algorithms are presented. A genetic algorithm is applied to the materialized-view selection problem, but as the genetic process evolves it becomes increasingly difficult to produce legal solutions, so many candidate solutions are eliminated and solution-generation time grows. An improved algorithm is therefore presented in this paper, which combines simulated annealing with the genetic algorithm to solve the query_cost view_selection problem. Finally, simulation experiments are used to test the functionality and efficiency of the algorithms. The experiments show that the given methods can provide near-optimal solutions in limited time and work well in practical cases. Randomized algorithms will become invaluable tools for data warehouse evolution.
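
    To illustrate the randomized-search idea, here is a compact simulated-annealing sketch for a toy view-selection instance; the cost functions, constraint and cooling schedule are simplified stand-ins chosen for illustration, not the cost model from the paper.

    # Toy simulated-annealing sketch for materialized-view selection.
    # Costs, constraint and cooling schedule are illustrative, not the paper's model.
    import math
    import random

    random.seed(0)
    n_views = 12
    maint_cost = [random.uniform(1, 10) for _ in range(n_views)]    # per-view maintenance cost
    query_gain = [random.uniform(5, 20) for _ in range(n_views)]    # response time saved if materialized
    response_budget = 80.0   # total response time allowed for views that are NOT materialized

    def feasible(sel):
        # Query response time: pay query_gain for every view not materialized.
        return sum(g for g, s in zip(query_gain, sel) if not s) <= response_budget

    def cost(sel):
        # Objective: total maintenance cost of the materialized views.
        return sum(c for c, s in zip(maint_cost, sel) if s)

    # Start from the (trivially feasible) solution that materializes everything.
    current = [True] * n_views
    best = current[:]
    temperature = 10.0
    for step in range(5000):
        candidate = current[:]
        candidate[random.randrange(n_views)] ^= True          # flip one view
        if feasible(candidate):
            delta = cost(candidate) - cost(current)
            if delta < 0 or random.random() < math.exp(-delta / temperature):
                current = candidate
                if cost(current) < cost(best):
                    best = current[:]
        temperature *= 0.999                                   # geometric cooling

    print("Materialize views:", [i for i, s in enumerate(best) if s], "cost:", round(cost(best), 2))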

  2. FISSA: A neuropil decontamination toolbox for calcium imaging signals.

    PubMed

    Keemink, Sander W; Lowe, Scott C; Pakan, Janelle M P; Dylda, Evelyn; van Rossum, Mark C W; Rochefort, Nathalie L

    2018-02-22

    In vivo calcium imaging has become a method of choice to image neuronal population activity throughout the nervous system. These experiments generate large sequences of images. Their analysis is computationally intensive and typically involves motion correction, image segmentation into regions of interest (ROIs), and extraction of fluorescence traces from each ROI. Out-of-focus fluorescence from surrounding neuropil and other cells can strongly contaminate the signal assigned to a given ROI. In this study, we introduce the FISSA toolbox (Fast Image Signal Separation Analysis) for neuropil decontamination. Given pre-defined ROIs, the FISSA toolbox automatically extracts the surrounding local neuropil and performs blind-source separation with non-negative matrix factorization. Using both simulated and in vivo data, we show that this toolbox performs similarly to or better than existing published methods. FISSA requires little RAM and allows for fast processing of large datasets even on a standard laptop. The FISSA toolbox is available in Python, with an option for MATLAB format outputs, and can easily be integrated into existing workflows. It is available from Github and the standard Python repositories.
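
    The separation step described above rests on non-negative matrix factorization. As a conceptual illustration only (this is not FISSA's own API), the sketch below uses scikit-learn's NMF to unmix a simulated ROI trace contaminated by a slowly varying neuropil signal.

    # Conceptual illustration of blind-source separation with NMF (not FISSA's API):
    # unmix a cell signal and a contaminating neuropil signal from mixed traces.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(1)
    t = np.arange(2000)

    cell = np.zeros_like(t, dtype=float)
    cell[rng.integers(0, 2000, 15)] = 1.0
    cell = np.convolve(cell, np.exp(-np.arange(100) / 20.0))[:2000]     # sparse transients
    neuropil = 0.5 + 0.3 * np.sin(t / 150.0) + 0.05 * rng.random(2000)  # slow background

    # Each observed trace is a non-negative mixture of the two sources: the ROI
    # trace is mostly cell plus some neuropil; the surrounding-neuropil trace is
    # mostly neuropil.
    roi_trace = 1.0 * cell + 0.6 * neuropil
    npil_trace = 0.1 * cell + 1.0 * neuropil
    X = np.vstack([roi_trace, npil_trace])           # shape (n_traces, n_timepoints)

    model = NMF(n_components=2, init="nndsvd", max_iter=1000, random_state=0)
    W = model.fit_transform(X)                       # mixing weights, shape (2, 2)
    H = model.components_                            # recovered source traces, shape (2, 2000)

    # Heuristic: the cell source should weigh more heavily in the ROI trace than
    # in the surrounding-neuropil trace.
    cell_idx = int(np.argmax(W[0] / (W[1] + 1e-12)))
    print("Recovered cell-signal / true-signal correlation:",
          round(float(np.corrcoef(H[cell_idx], cell)[0, 1]), 3))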

  3. An update on sORFs.org: a repository of small ORFs identified by ribosome profiling.

    PubMed

    Olexiouk, Volodimir; Van Criekinge, Wim; Menschaert, Gerben

    2018-01-04

    sORFs.org (http://www.sorfs.org) is a public repository of small open reading frames (sORFs) identified by ribosome profiling (RIBO-seq). This update elaborates on the major improvements implemented since its initial release. sORFs.org now additionally supports three more species (zebrafish, rat and Caenorhabditis elegans) and currently includes 78 RIBO-seq datasets, a vast increase compared to the three that were processed in the initial release. Therefore, a novel pipeline was constructed that also enables sORF detection in RIBO-seq datasets comprising solely elongating RIBO-seq data while previously, matching initiating RIBO-seq data was necessary to delineate the sORFs. Furthermore, a novel noise filtering algorithm was designed, able to distinguish sORFs with true ribosomal activity from simulated noise, consequently reducing the false positive identification rate. The inclusion of other species also led to the development of an inner BLAST pipeline, assessing sequence similarity between sORFs in the repository. Building on the proof of concept model in the initial release of sORFs.org, a full PRIDE-ReSpin pipeline was now released, reprocessing publicly available MS-based proteomics PRIDE datasets, reporting on true translation events. Next to reporting those identified peptides, sORFs.org allows visual inspection of the annotated spectra within the Lorikeet MS/MS viewer, thus enabling detailed manual inspection and interpretation. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. The Emperor's New Repository

    ERIC Educational Resources Information Center

    Chudnov, Daniel

    2008-01-01

    The author does not know the first thing about building digital repositories. Maybe that is a strange thing to say, given that he works in a repository development group now, worked on the original DSpace project years ago, and worked on a few repository research projects in between. Given how long he has been around people and projects aiming to…

  5. Genetic conservation, characterization and utilization of wild relatives of fruit and nut crops at the USDA Germplasm Repository in Davis, California

    USDA-ARS?s Scientific Manuscript database

    The National Clonal Germplasm Repository (NCGR) in Davis is one among the nine repositories in the National Plant Germplasm System, USDA-ARS that is responsible for conservation of clonally propagated woody perennial subtropical and temperate fruit and nut crop germplasm. Currently the repository ho...

  6. 36 CFR 79.9 - Standards to determine when a repository possesses the capability to provide adequate long-term...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... repository possesses the capability to provide adequate long-term curatorial services. 79.9 Section 79.9... FEDERALLY-OWNED AND ADMINISTERED ARCHAEOLOGICAL COLLECTIONS § 79.9 Standards to determine when a repository... shall determine that a repository has the capability to provide adequate long-term curatorial services...

  7. 10 CFR Appendix II to Part 960 - NRC and EPA Requirements for Preclosure Repository Performance

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false NRC and EPA Requirements for Preclosure Repository... SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Pt. 960, App. II Appendix II to Part 960—NRC and EPA Requirements for Preclosure Repository Performance Under proposed 40 CFR part 191, subpart A...

  8. 10 CFR 51.109 - Public hearings in proceedings for issuance of materials license with respect to a geologic...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... license with respect to a geologic repository. 51.109 Section 51.109 Energy NUCLEAR REGULATORY COMMISSION... Public hearings in proceedings for issuance of materials license with respect to a geologic repository... waste repository at a geologic repository operations area under parts 60 and 63 of this chapter, and in...

  9. Semantic Linking of Learning Object Repositories to DBpedia

    ERIC Educational Resources Information Center

    Lama, Manuel; Vidal, Juan C.; Otero-Garcia, Estefania; Bugarin, Alberto; Barro, Senen

    2012-01-01

    Large-sized repositories of learning objects (LOs) are difficult to create and also to maintain. In this paper we propose a way to reduce this drawback by improving the classification mechanisms of the LO repositories. Specifically, we present a solution to automate the LO classification of the Universia repository, a collection of more than 15…

  10. 10 CFR Appendix I to Part 960 - NRC and EPA Requirements for Postclosure Repository Performance

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false NRC and EPA Requirements for Postclosure Repository... SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Pt. 960, App. I Appendix I to Part 960—NRC and EPA Requirements for Postclosure Repository Performance Under proposed 40 CFR part 191, subpart B...

  11. Desiderata for Healthcare Integrated Data Repositories Based on Architectural Comparison of Three Public Repositories

    PubMed Central

    Huser, Vojtech; Cimino, James J.

    2013-01-01

    Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network’s Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management. PMID:24551366

  12. Desiderata for healthcare integrated data repositories based on architectural comparison of three public repositories.

    PubMed

    Huser, Vojtech; Cimino, James J

    2013-01-01

    Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network's Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management.

  13. Effect of Low-Temperature Sensitization on the Corrosion Behavior of AISI Type 304L SS Weld Metal in Simulated Groundwater

    NASA Astrophysics Data System (ADS)

    Suresh, Girija; Nandakumar, T.; Viswanath, A.

    2018-04-01

    The manuscript presents the investigations carried out on the effect of low-temperature sensitization (LTS) of 304L SS weld metal on its corrosion behavior in simulated groundwater, for its application as a canister material for long-term storage of nuclear vitrified high-level waste in geological repositories. AISI type 304L SS weld pad was fabricated by multipass gas tungsten arc welding process using 308L SS filler wire. The as-welded specimens were subsequently subjected to carbide nucleation and further to LTS at 500 °C for 11 days to simulate a temperature of 300 °C for a 100-year life of the canister in geological repositories. Delta ferrite (δ-ferrite) content of the 304L SS weld metal substantially decreased on carbide nucleation treatment, and only a marginal further decrease occurred on LTS treatment. The microstructure of the as-welded metal consisted of δ-ferrite as a minor phase distributed in the austenite matrix. The δ-ferrite appeared fragmented in the carbide-nucleated and LTS-treated weld metal. The degree of sensitization measured by the double-loop electrochemical potentiokinetic reactivation method indicated an increase after carbide nucleation treatment when compared to the as-welded specimens, and a further increase occurred on LTS treatment. Potentiodynamic anodic polarization investigations in simulated groundwater indicated a substantial decrease in the localized corrosion resistance of the carbide-nucleated and LTS-treated 304L SS weld metals when compared to the as-welded specimens. Post-experimental micrographs indicated pitting as the primary mode of attack in the as-welded specimens, while pitting and intergranular corrosion (IGC) occurred in the carbide-nucleated weld metal. The LTS-treated weld metal predominantly underwent IGC attack. The decrease in the localized corrosion resistance of the weld metal after LTS treatment was found to have a direct correlation with the degree of sensitization and the weld microstructure. The results are detailed in the manuscript.

  14. Use of the Fracture Continuum Model for Numerical Modeling of Flow and Transport of Deep Geologic Disposal of Nuclear Waste in Crystalline Rock

    NASA Astrophysics Data System (ADS)

    Hadgu, T.; Kalinina, E.; Klise, K. A.; Wang, Y.

    2015-12-01

    Numerical modeling of disposal of nuclear waste in a deep geologic repository in fractured crystalline rock requires robust characterization of fractures. Various methods for fracture representation in granitic rocks exist. In this study we used the fracture continuum model (FCM) to characterize fractured rock for use in the simulation of flow and transport in the far field of a generic nuclear waste repository located at 500 m depth. The FCM approach is a stochastic method that maps the permeability of discrete fractures onto a regular grid. The method generates permeability fields using field observations of fracture sets. The original method described in McKenna and Reeves (2005) was designed for vertical fractures. The method has since been extended to incorporate fully three-dimensional representations of anisotropic permeability, multiple independent fracture sets, arbitrary fracture dips and orientations, and spatial correlation (Kalinina et al., 2012, 2014). For this study the numerical code PFLOTRAN (Lichtner et al., 2015) was used to model flow and transport. PFLOTRAN solves a system of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport in porous materials. The code is designed to run on massively parallel computing architectures as well as workstations and laptops (e.g., Hammond et al., 2011). Benchmark tests were conducted to simulate flow and transport in a specified model domain. Distributions of fracture parameters were used to generate a selected number of realizations. For each realization, the FCM method was used to generate a permeability field of the fractured rock. The PFLOTRAN code was then used to simulate flow and transport in the domain. Simulation results and analysis are presented. The results indicate that the FCM approach is a viable method for modeling fractured crystalline rocks. The FCM is a computationally efficient way to generate realistic representations of complex fracture systems. This approach is of interest for nuclear waste disposal models applied over large domains.
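
    The core FCM idea, mapping stochastically generated discrete fractures onto a regular grid as locally enhanced permeability, can be sketched in a few lines. The fracture statistics, grid dimensions, and the simple maximum-permeability rule below are illustrative assumptions, not the parameters or mapping rules used in the study.

      # Illustrative 2-D fracture-continuum-style permeability field.
      import numpy as np

      rng = np.random.default_rng(42)
      NX, NY, DX = 200, 200, 5.0          # grid cells and cell size (m)
      K_BACKGROUND = 1e-18                # intact-rock permeability (m^2)
      k = np.full((NY, NX), K_BACKGROUND)

      for _ in range(150):                # number of fractures (assumed)
          # Random fracture center, orientation, trace length and permeability.
          x0, y0 = rng.uniform(0, NX * DX), rng.uniform(0, NY * DX)
          theta = rng.uniform(0, np.pi)
          length = rng.lognormal(mean=3.0, sigma=0.5)     # m
          k_frac = 10.0 ** rng.uniform(-14, -12)          # m^2

          # Rasterize the fracture trace onto the grid and boost cell permeability.
          s = np.linspace(-length / 2, length / 2, max(int(length / DX) * 2, 2))
          ix = ((x0 + s * np.cos(theta)) / DX).astype(int)
          iy = ((y0 + s * np.sin(theta)) / DX).astype(int)
          ok = (ix >= 0) & (ix < NX) & (iy >= 0) & (iy < NY)
          k[iy[ok], ix[ok]] = np.maximum(k[iy[ok], ix[ok]], k_frac)

      print("log10(k) range:", np.log10(k.min()), "to", np.log10(k.max()))

    A field like this would then be handed to a flow simulator such as PFLOTRAN as a heterogeneous permeability input.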

  15. Effect of Low-Temperature Sensitization on the Corrosion Behavior of AISI Type 304L SS Weld Metal in Simulated Groundwater

    NASA Astrophysics Data System (ADS)

    Suresh, Girija; Nandakumar, T.; Viswanath, A.

    2018-05-01

    The manuscript presents the investigations carried out on the effect of low-temperature sensitization (LTS) of 304L SS weld metal on its corrosion behavior in simulated groundwater, for its application as a canister material for long-term storage of nuclear vitrified high-level waste in geological repositories. AISI type 304L SS weld pad was fabricated by multipass gas tungsten arc welding process using 308L SS filler wire. The as-welded specimens were subsequently subjected to carbide nucleation and further to LTS at 500 °C for 11 days to simulate a temperature of 300 °C for a 100-year life of the canister in geological repositories. Delta ferrite (δ-ferrite) content of the 304L SS weld metal substantially decreased on carbide nucleation treatment, and only a marginal further decrease occurred on LTS treatment. The microstructure of the as-welded metal consisted of δ-ferrite as a minor phase distributed in the austenite matrix. The δ-ferrite appeared fragmented in the carbide-nucleated and LTS-treated weld metal. The degree of sensitization measured by the double-loop electrochemical potentiokinetic reactivation method indicated an increase after carbide nucleation treatment when compared to the as-welded specimens, and a further increase occurred on LTS treatment. Potentiodynamic anodic polarization investigations in simulated groundwater indicated a substantial decrease in the localized corrosion resistance of the carbide-nucleated and LTS-treated 304L SS weld metals when compared to the as-welded specimens. Post-experimental micrographs indicated pitting as the primary mode of attack in the as-welded specimens, while pitting and intergranular corrosion (IGC) occurred in the carbide-nucleated weld metal. The LTS-treated weld metal predominantly underwent IGC attack. The decrease in the localized corrosion resistance of the weld metal after LTS treatment was found to have a direct correlation with the degree of sensitization and the weld microstructure. The results are detailed in the manuscript.

  16. Formalization, Annotation and Analysis of Diverse Drug and Probe Screening Assay Datasets Using the BioAssay Ontology (BAO)

    PubMed Central

    Vempati, Uma D.; Przydzial, Magdalena J.; Chung, Caty; Abeyruwan, Saminda; Mir, Ahsan; Sakurai, Kunie; Visser, Ubbo; Lemmon, Vance P.; Schürer, Stephan C.

    2012-01-01

    Huge amounts of high-throughput screening (HTS) data for probe and drug development projects are being generated in the pharmaceutical industry and more recently in the public sector. The resulting experimental datasets are increasingly being disseminated via publicly accessible repositories. However, existing repositories lack sufficient metadata to describe the experiments and are often difficult to navigate by non-experts. The lack of standardized descriptions and semantics of biological assays and screening results hinders targeted data retrieval, integration, aggregation, and analyses across different HTS datasets, for example to infer mechanisms of action of small molecule perturbagens. To address these limitations, we created the BioAssay Ontology (BAO). BAO has been developed with a focus on data integration and analysis enabling the classification of assays and screening results by concepts that relate to format, assay design, technology, target, and endpoint. Previously, we reported on the higher-level design of BAO and on the semantic querying capabilities offered by the ontology-indexed triple store of HTS data. Here, we report on our detailed design, annotation pipeline, substantially enlarged annotation knowledgebase, and analysis results. We used BAO to annotate assays from the largest public HTS data repository, PubChem, and demonstrate its utility to categorize and analyze diverse HTS results from numerous experiments. BAO is publicly available from the NCBO BioPortal at http://bioportal.bioontology.org/ontologies/1533. BAO provides controlled terminology and uniform scope to report probe and drug discovery screening assays and results. BAO leverages description logic to formalize the domain knowledge and facilitate the semantic integration with diverse other resources. As a consequence, BAO offers the potential to infer new knowledge from a corpus of assay results, for example molecular mechanisms of action of perturbagens. PMID:23155465

  17. Collecting conditions usage metadata to optimize current and future ATLAS software and processing

    NASA Astrophysics Data System (ADS)

    Rinaldi, L.; Barberis, D.; Formica, A.; Gallas, E. J.; Oda, S.; Rybkin, G.; Verducci, M.; ATLAS Collaboration

    2017-10-01

    Conditions data (for example: alignment, calibration, data quality) are used extensively in the processing of real and simulated data in ATLAS. The volume and variety of the conditions data needed by different types of processing are quite diverse, so optimizing access to them requires a careful understanding of conditions usage patterns. These patterns can be quantified by mining representative log files from each type of processing and gathering detailed information about conditions usage for that type of processing into a central repository.
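
    The log-mining step can be pictured with a short sketch. The log line format, folder names and directory layout below are invented for illustration; actual ATLAS job logs and the central repository schema differ.

      # Hypothetical example: count conditions-folder accesses per processing type.
      import re
      from collections import Counter
      from pathlib import Path

      FOLDER_RE = re.compile(r"ConditionsAccess folder=(\S+)")   # assumed log pattern

      def mine_logs(log_dir, processing_type):
          usage = Counter()
          for log_file in Path(log_dir).glob("*.log"):
              for line in log_file.read_text(errors="ignore").splitlines():
                  match = FOLDER_RE.search(line)
                  if match:
                      usage[(processing_type, match.group(1))] += 1
          return usage

      # A central repository could be as simple as aggregated counts per (type, folder).
      central_repository = Counter()
      central_repository.update(mine_logs("./reco_logs", "reconstruction"))
      central_repository.update(mine_logs("./sim_logs", "simulation"))
      for (ptype, folder), count in central_repository.most_common(5):
          print(f"{ptype:15s} {folder:40s} {count}")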

  18. Data files for ab initio calculations of the lattice parameter and elastic stiffness coefficients of bcc Fe with solutes

    DOE PAGES

    Fellinger, Michael R.; Hector, Jr., Louis G.; Trinkle, Dallas R.

    2016-11-29

    Here, we present computed datasets on changes in the lattice parameter and elastic stiffness coefficients of BCC Fe due to substitutional Al, B, Cu, Mn, and Si solutes, and octahedral interstitial C and N solutes. The data is calculated using a methodology based on density functional theory (DFT). All the DFT calculations were performed using the Vienna Ab initio Simulation Package (VASP). The data is stored in the NIST dSpace repository.

  19. Incorporation of Metals into Calcite in a Deep Anoxic Granite Aquifer.

    PubMed

    Drake, Henrik; Mathurin, Frédéric A; Zack, Thomas; Schäfer, Thorsten; Roberts, Nick Mw; Whitehouse, Martin; Karlsson, Andreas; Broman, Curt; Åström, Mats E

    2018-01-16

    Understanding metal scavenging by calcite in deep aquifers in granite is of importance for deciphering and modeling hydrochemical fluctuations and water-rock interaction in the upper crust and for retention mechanisms associated with underground repositories for toxic wastes. Metal scavenging into calcite has generally been established in the laboratory or in natural environments that cannot be unreservedly applied to conditions in deep crystalline rocks, an environment of broad interest for nuclear waste repositories. Here, we report a microanalytical study of calcite precipitated over a period of 17 years from anoxic, low-temperature (14 °C), neutral (pH: 7.4-7.7), and brackish (Cl: 1700-7100 mg/L) groundwater flowing in fractures at >400 m depth in granite rock. This enabled assessment of the trace metal uptake by calcite under these deep-seated conditions. Aquatic speciation modeling was carried out to assess the influence of metal complexation on the partitioning into calcite. The resulting environment-specific partition coefficients were for several divalent ions in line with values obtained in controlled laboratory experiments, whereas for several other ions they differed substantially. High absolute uptake of rare earth elements and U(IV) suggests that coprecipitation into calcite can be an important sink for these metals and analogous actinides in the vicinity of geological repositories.

  20. International Approaches for Nuclear Waste Disposal in Geological Formations: Geological Challenges in Radioactive Waste Isolation—Fifth Worldwide Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faybishenko, Boris; Birkholzer, Jens; Sassani, David

    The overall objective of the Fifth Worldwide Review (WWR-5) is to document the current state-of-the-art of major developments in a number of nations throughout the world pursuing geological disposal programs, and to summarize challenging problems and experience that have been obtained in siting, preparing and reviewing cases for the operational and long-term safety of proposed and operating nuclear waste repositories. The scope of the Review is to address current specific technical issues and challenges in safety case development along with the interplay of technical feasibility, siting, engineering design issues, and operational and post-closure safety. In particular, the chapters included in the report present the following types of information: the current status of the deep geological repository programs for high level nuclear waste and low- and intermediate level nuclear waste in each country, concepts of siting and radioactive waste and spent nuclear fuel management in different countries (with the emphasis of nuclear waste disposal under different climatic conditions and different geological formations), progress in repository site selection and site characterization, technology development, buffer/backfill materials studies and testing, support activities, programs, and projects, international cooperation, and future plans, as well as regulatory issues and transboundary problems.

  1. 75 FR 26788 - Public Land Order No. 7742; Withdrawal of Public Land for the Manning Canyon Tailings Repository; UT

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-12

    ... 79765] Public Land Order No. 7742; Withdrawal of Public Land for the Manning Canyon Tailings Repository... period of 5 years to protect the integrity of the Manning Canyon Tailings Repository and surrounding... Repository. The Bureau of Land Management intends to evaluate the need for a lengthier withdrawal through the...

  2. The Research Library's Role in Digital Repository Services: Final Report of the ARL Digital Repository Issues Task Force

    ERIC Educational Resources Information Center

    Association of Research Libraries, 2009

    2009-01-01

    Libraries are making diverse contributions to the development of many types of digital repositories, particularly those housing locally created digital content, including new digital objects or digitized versions of locally held works. In some instances, libraries are managing a repository and its related services entirely on their own, but often…

  3. Analysis of Academic Attitudes and Existing Processes to Inform the Design of Teaching and Learning Material Repositories: A User-Centred Approach

    ERIC Educational Resources Information Center

    King, Melanie; Loddington, Steve; Manuel, Sue; Oppenheim, Charles

    2008-01-01

    The last couple of years have brought a rise in the number of institutional repositories throughout the world and within UK Higher Education institutions, with the majority of these repositories being devoted to research output. Repositories containing teaching and learning material are less common and the workflows and business processes…

  4. Integrating XQuery-Enabled SCORM XML Metadata Repositories into an RDF-Based E-Learning P2P Network

    ERIC Educational Resources Information Center

    Qu, Changtao; Nejdl, Wolfgang

    2004-01-01

    Edutella is an RDF-based E-Learning P2P network that is aimed to accommodate heterogeneous learning resource metadata repositories in a P2P manner and further facilitate the exchange of metadata between these repositories based on RDF. Whereas Edutella provides RDF metadata repositories with a quite natural integration approach, XML metadata…

  5. Scaling an expert system data mart: more facilities in real-time.

    PubMed

    McNamee, L A; Launsby, B D; Frisse, M E; Lehmann, R; Ebker, K

    1998-01-01

    Clinical Data Repositories are being rapidly adopted by large healthcare organizations as a method of centralizing and unifying clinical data currently stored in diverse and isolated information systems. Once stored in a clinical data repository, healthcare organizations seek to use this centralized data to store, analyze, interpret, and influence clinical care, quality and outcomes. A recent trend in the repository field has been the adoption of data marts--specialized subsets of enterprise-wide data taken from a larger repository designed specifically to answer highly focused questions. A data mart exploits the data stored in the repository, but can use unique structures or summary statistics generated specifically for an area of study. Thus, data marts benefit from the existence of a repository, are less general than a repository, but provide more effective and efficient support for an enterprise-wide data analysis task. In previous work, we described the use of batch processing for populating data marts directly from legacy systems. In this paper, we describe an architecture that uses both primary data sources and an evolving enterprise-wide clinical data repository to create real-time data sources for a clinical data mart to support highly specialized clinical expert systems.

  6. Schematic designs for penetration seals for a reference repository in bedded salt

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelsall, P.C.; Case, J.B.; Meyer, D.

    1982-11-01

    The isolation of radioactive wastes in geologic repositories requires that man-made penetrations such as shafts, tunnels, or boreholes are adequately sealed. This report describes schematic seal designs for a repository in bedded salt referenced to the stratigraphy of southeastern New Mexico. The designs are presented for extensive peer review and will be updated as site-specific conceptual designs when a site for a repository in salt has been selected. The principal material used in the seal system is crushed salt obtained from excavating the repository. It is anticipated that crushed salt will consolidate as the repository rooms creep closed, to the degree that mechanical and hydrologic properties will eventually match those of undisturbed, intact salt. For southeastern New Mexico salt, analyses indicate that this process will require approximately 1000 years for a seal located at the base of one of the repository shafts (where there is little increase in temperature due to waste emplacement) and approximately 400 years for a seal located in an access tunnel within the repository. Bulkheads composed of concrete or salt bricks are also included in the seal system as components which will have low permeability during the period required for salt consolidation.

  7. Classification of Clinical Research Study Eligibility Criteria to Support Multi-Stage Cohort Identification Using Clinical Data Repositories.

    PubMed

    Cimino, James J; Lancaster, William J; Wyatt, Mathew C

    2017-01-01

    One of the challenges to using electronic health record (EHR) repositories for research is the difficulty mapping study subject eligibility criteria to the query capabilities of the repository. We sought to characterize criteria as "easy" (searchable in a typical repository), "hard" (requiring manual review of the record data), and "impossible" (not typically available in EHR repositories). We obtained 292 criteria from 20 studies available from ClinicalTrials.gov and rated them according to our three types, plus a fourth "mixed" type. We had good agreement among three independent reviewers and chose 274 criteria that were characterized by single types for further analysis. The resulting analysis showed typical features of criteria that do and don't map to repositories. We propose that these features be used to guide researchers in specifying eligibility criteria to improve development of enrollment workflow, including the definition of EHR repository queries for self-service or analyst-mediated retrievals.

  8. Making Research Data Repositories Visible: The re3data.org Registry

    PubMed Central

    Pampel, Heinz; Vierkant, Paul; Scholze, Frank; Bertelmann, Roland; Kindling, Maxi; Klump, Jens; Goebelbecker, Hans-Jürgen; Gundlach, Jens; Schirmbacher, Peter; Dierolf, Uwe

    2013-01-01

    Researchers require infrastructures that ensure a maximum of accessibility, stability and reliability to facilitate working with and sharing of research data. Such infrastructures are being increasingly summarized under the term Research Data Repositories (RDR). The project re3data.org (Registry of Research Data Repositories) began indexing research data repositories in 2012 and offers researchers, funding organizations, libraries and publishers an overview of the heterogeneous research data repository landscape. As of July 2013, re3data.org lists 400 research data repositories and counting; 288 of these are described in detail using the re3data.org vocabulary. Information icons help researchers to easily identify an adequate repository for the storage and reuse of their data. This article describes the heterogeneous RDR landscape and presents a typology of institutional, disciplinary, multidisciplinary and project-specific RDR. Further, the article outlines the features of re3data.org and shows how this registry helps to identify appropriate repositories for storage and search of research data. PMID:24223762

  9. AdaNET phase 0 support for the AdaNET Dynamic Software Inventory (DSI) management system prototype. Catalog of available reusable software components

    NASA Technical Reports Server (NTRS)

    Hanley, Lionel

    1989-01-01

    The Ada Software Repository is a public-domain collection of Ada software and information. The Ada Software Repository is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and has been available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories. These directories are organized by topic, and their names and a brief overview of their topics are contained in this catalog. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.

  10. Exploring Characterizations of Learning Object Repositories Using Data Mining Techniques

    NASA Astrophysics Data System (ADS)

    Segura, Alejandra; Vidal, Christian; Menendez, Victor; Zapata, Alfredo; Prieto, Manuel

    Learning object repositories provide a platform for the sharing of Web-based educational resources. As these repositories evolve independently, it is difficult for users to have a clear picture of the kind of contents they give access to. Metadata can be used to automatically extract a characterization of these resources by using machine learning techniques. This paper presents an exploratory study carried out in the contents of four public repositories that uses clustering and association rule mining algorithms to extract characterizations of repository contents. The results of the analysis include potential relationships between different attributes of learning objects that may be useful to gain an understanding of the kind of resources available and eventually develop search mechanisms that consider repository descriptions as a criteria in federated search.
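
    A minimal sketch of the clustering half of such an analysis is shown below, assuming a handful of invented metadata attributes; real repositories expose far richer IEEE LOM records, and the association-rule step is omitted.

      # Cluster learning-object metadata records on one-hot-encoded attributes.
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import OneHotEncoder

      # Invented records: (format, language, interactivity level).
      records = [
          ["video", "en", "low"],
          ["video", "es", "low"],
          ["text",  "en", "high"],
          ["quiz",  "en", "high"],
          ["text",  "es", "high"],
      ]

      X = OneHotEncoder().fit_transform(records).toarray()
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      for record, label in zip(records, labels):
          print(label, record)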

  11. Iterative performance assessments as a regulatory tool for evaluating repository safety: How experiences from SKI Project-90 were used in formulating the new performance assessment project SITE-94

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersson, J.

    1993-12-31

    The regulatory research program of the Swedish Nuclear Power Inspectorate (SKI) has to prepare for the process of licensing a repository for spent nuclear fuel, by building up the necessary knowledge and review capacity. SKI's main strategy for meeting this demand is to develop an independent performance assessment capability. SKI's first performance assessment project of its own, Project-90, was completed in 1991 and is now followed by a new project, SITE-94. SITE-94 is based on conclusions reached within Project-90. An independent review of Project-90, carried out by a NEA team of experts, has also contributed to the formation of the project. Another important reason for the project is that the implementing organization in Sweden, SKB, has proposed to submit an application to start detailed investigation of a repository candidate site around 1997. SITE-94 is a performance assessment of a hypothetical repository at a real site. The main objective of the project is to determine how site specific data should be assimilated into the performance assessment process, and to evaluate how uncertainties inherent in site characterization will influence performance assessment results. This will be addressed by exploring multiple interpretations, conceptual models, and parameters consistent with the site data. The site evaluation will strive for consistency between geological, hydrological, rock mechanical, and geochemical descriptions. Other important elements of SITE-94 are the development of a practical and defensible methodology for defining, constructing and analyzing scenarios, the development of approaches for treatment of uncertainties, evaluation of canister integrity, and the development and application of an appropriate quality assurance plan for performance assessments.

  12. Developing a concept for a national used fuel interim storage facility in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, Donald Wayne

    2013-07-01

    In the United States (U.S.) the nuclear waste issue has plagued the nuclear industry for decades. Originally, spent fuel was to be reprocessed but with the threat of nuclear proliferation, spent fuel reprocessing has been eliminated, at least for now. In 1983, the Nuclear Waste Policy Act of 1982 [1] was established, authorizing development of one or more spent fuel and high-level nuclear waste geological repositories and a consolidated national storage facility, called a 'Monitored Retrievable Storage' facility, that could store the spent nuclear fuel until it could be placed into the geological repository. Plans were under way to build a geological repository, Yucca Mountain, but with the decision by President Obama to terminate the development of Yucca Mountain, a consolidated national storage facility that can store spent fuel for an interim period until a new repository is established has become very important. Since reactor sites have not been able to wait for the government to come up with a storage or disposal location, spent fuel remains in wet or dry storage at each nuclear plant. The purpose of this paper is to present a concept developed to address the DOE's goals stated above. This concept was developed over the past few months by collaboration between the DOE and industry experts that have experience in designing spent nuclear fuel facilities. The paper examines the current spent fuel storage conditions at shutdown reactor sites, operating reactor sites, and the type of storage systems (transportable versus non-transportable, welded or bolted). The concept lays out the basis for a pilot storage facility to house spent fuel from shutdown reactor sites and then how the pilot facility can be enlarged to a larger full scale consolidated interim storage facility. (authors)

  13. Coupled Biological-Geomechanical-Geochemical Effects of the Disturbed Rock Zone on the Performance of the Waste Isolation Pilot Plant

    NASA Astrophysics Data System (ADS)

    Dunagan, S. C.; Herrick, C. G.; Lee, M. Y.

    2008-12-01

    The Waste Isolation Pilot Plant (WIPP) is located at a depth of 655 m in bedded salt in southeastern New Mexico and is operated by the U.S. Department of Energy as a deep underground disposal facility for transuranic (TRU) waste. The WIPP must comply with the EPA's environmental regulations that require a probabilistic risk analysis of releases of radionuclides due to inadvertent human intrusion into the repository at some time during the 10,000-year regulatory period. Sandia National Laboratories conducts performance assessments (PAs) of the WIPP using a system of computer codes representing the evolution of underground repository and emplaced TRU waste in order to demonstrate compliance. One of the important features modeled in a PA is the disturbed rock zone (DRZ) surrounding the emplacement rooms in the repository. The extent and permeability of DRZ play a significant role in the potential radionuclide release scenarios. We evaluated the phenomena occurring in the repository that affect the DRZ and their potential effects on the extent and permeability of the DRZ. Furthermore, we examined the DRZ's role in determining the performance of the repository. Pressure in the completely sealed repository will be increased by creep closure of the salt and degradation of TRU waste contents by microbial activity in the repository. An increased pressure in the repository will reduce the extent and permeability of the DRZ. The reduced DRZ extent and permeability will decrease the amount of brine that is available to interact with the waste. Furthermore, the potential for radionuclide release from the repository is dependent on the amount of brine that enters the repository. As a result of these coupled biological-geomechanical-geochemical phenomena, the extent and permeability of the DRZ has a significant impact on the potential radionuclide releases from the repository and, in turn, the repository performance. Sandia is a multi program laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04- 94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy.

  14. Repository Profiles for Atmospheric and Climate Sciences: Capabilities and Trends in Data Services

    NASA Astrophysics Data System (ADS)

    Hou, C. Y.; Thompson, C. A.; Palmer, C. L.

    2014-12-01

    As digital research data proliferate and expectations for open access escalate, the landscape of data repositories is becoming more complex. For example, DataBib currently identifies 980 data repositories across the disciplines, with 117 categorized under Geosciences. In atmospheric and climate sciences, there are great expectations for the integration and reuse of data for advancing science. To realize this potential, resources are needed that explicate the range of repository options available for locating and depositing open data, their conditions of access and use, and the services and tools they provide. This study profiled 38 open digital repositories in the atmospheric and climate sciences, analyzing each on 55 criteria through content analysis of their websites. The results provide a systematic way to assess and compare capabilities, services, and institutional characteristics and identify trends across repositories. Selected results from the more detailed outcomes to be presented include the following: Most repositories offer guidance on data format(s) for submission and dissemination. 42% offer authorization-free access. More than half use some type of data identification system such as DOIs. Nearly half offer some data processing, with a similar number providing software or tools. 78.9% request that users cite or acknowledge datasets used and the data center. Only 21.1% recommend specific metadata standards, such as ISO 19115 or Dublin Core, with more than half utilizing a customized metadata scheme. Information was rarely provided on repository certification and accreditation and was uneven for transfer of rights and data security. Few provided policy information on preservation, migration, reappraisal, disposal, or long-term sustainability. As repository use increases, it will be important for institutions to make their procedures and policies explicit, to build trust with user communities and improve efficiencies in data sharing. Resources such as repository profiles will be essential for scientists to weigh options and understand trends in data services across the evolving network of repositories.

  15. Experiences with the BSCW Shared Workspace System as the Backbone of a Virtual Learning Environment for Students.

    ERIC Educational Resources Information Center

    Appelt, Wolfgang; Mambrey, Peter

    The GMD (German National Research Center for Information Technology) has developed the BSCW (Basic Support for Cooperative Work) Shared Workspace system within the last four years with the goal of transforming the Web from a primarily passive information repository to an active cooperation medium. The BSCW system is a Web-based groupware tool for…

  16. DOS Design/Application Tools System/Segment Specification. Volume 3

    DTIC Science & Technology

    1990-09-01

    consume the same information to obtain that information without "manual" translation by people. Solving the information management problem effectively...and consumes even more information than centralized development. Distributed systems cannot be developed successfully by experiment without...human intervention because all tools consume input from and produce output to the same repository. New tools are easily absorbed into the environment.

  17. Java Web Simulation (JWS); a web based database of kinetic models.

    PubMed

    Snoep, J L; Olivier, B G

    2002-01-01

    Software to make a database of kinetic models accessible via the internet has been developed and a core database has been set up at http://jjj.biochem.sun.ac.za/. This repository of models, available to everyone with internet access, opens a whole new way in which we can make our models public. Via the database, a user can change enzyme parameters and run time simulations or steady state analyses. The interface is user friendly and no additional software is necessary. The database currently contains 10 models, but since the generation of the program code to include new models has largely been automated the addition of new models is straightforward and people are invited to submit their models to be included in the database.

  18. Raising orphans from a metadata morass: A researcher's guide to re-use of public 'omics data.

    PubMed

    Bhandary, Priyanka; Seetharam, Arun S; Arendsee, Zebulun W; Hur, Manhoi; Wurtele, Eve Syrkin

    2018-02-01

    More than 15 petabases of raw RNAseq data is now accessible through public repositories. Acquisition of other 'omics data types is expanding, though most lack a centralized archival repository. Data-reuse provides tremendous opportunity to extract new knowledge from existing experiments, and offers a unique opportunity for robust, multi-'omics analyses by merging metadata (information about experimental design, biological samples, protocols) and data from multiple experiments. We illustrate how predictive research can be accelerated by meta-analysis with a study of orphan (species-specific) genes. Computational predictions are critical to infer orphan function because their coding sequences provide very few clues. The metadata in public databases is often confusing; a test case with Zea mays mRNA seq data reveals a high proportion of missing, misleading or incomplete metadata. This metadata morass significantly diminishes the insight that can be extracted from these data. We provide tips for data submitters and users, including specific recommendations to improve metadata quality by more use of controlled vocabulary and by metadata reviews. Finally, we advocate for a unified, straightforward metadata submission and retrieval system. Copyright © 2017 Elsevier B.V. All rights reserved.
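
    The controlled-vocabulary recommendation can be made concrete with a small pre-submission check. The field names and vocabularies below are illustrative only and do not correspond to any particular repository's schema.

      # Flag missing or off-vocabulary metadata fields before submission.
      CONTROLLED_VOCAB = {
          "organism": {"Zea mays", "Arabidopsis thaliana"},
          "tissue": {"leaf", "root", "seed", "whole plant"},
          "library_strategy": {"RNA-Seq", "Ribo-Seq"},
      }

      def check_metadata(record):
          problems = []
          for field, allowed in CONTROLLED_VOCAB.items():
              value = record.get(field)
              if not value:
                  problems.append(f"missing field: {field}")
              elif value not in allowed:
                  problems.append(f"off-vocabulary value for {field}: {value!r}")
          return problems

      sample = {"organism": "Zea mays", "tissue": "leaves", "library_strategy": "RNA-Seq"}
      print(check_metadata(sample))   # -> ["off-vocabulary value for tissue: 'leaves'"]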

  19. Experiences with Text Mining Large Collections of Unstructured Systems Development Artifacts at JPL

    NASA Technical Reports Server (NTRS)

    Port, Dan; Nikora, Allen; Hihn, Jairus; Huang, LiGuo

    2011-01-01

    Often repositories of systems engineering artifacts at NASA's Jet Propulsion Laboratory (JPL) are so large and poorly structured that they have outgrown our capability to effectively process their contents manually to extract useful information. Sophisticated text mining methods and tools seem a quick, low-effort approach to automating our limited manual efforts. Our experiences of exploring such methods in three areas (historical risk analysis, defect identification based on requirements analysis, and over-time analysis of system anomalies at JPL) have shown that obtaining useful results requires substantial unanticipated effort, from preprocessing the data to transforming the output for practical applications. We have not observed any quick 'wins' or realized benefit from short-term effort avoidance through automation in this area. Surprisingly, we have realized a number of unexpected long-term benefits from the process of applying text mining to our repositories. This paper elaborates on some of these benefits and the important lessons learned from preparing and applying text mining to large unstructured system artifacts at JPL, aiming to benefit future text mining (TM) applications in similar problem domains and, we hope, to extend them to broader areas of application.
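
    The preprocessing-plus-mining workflow alluded to above can be sketched as follows; the anomaly-report snippets and the cleaning rules are invented, and a production pipeline at JPL would involve far more elaborate preparation.

      # Clean free-text records, vectorize with TF-IDF, and cluster them.
      import re
      from sklearn.cluster import KMeans
      from sklearn.feature_extraction.text import TfidfVectorizer

      reports = [
          "Thruster valve anomaly during cruise phase; telemetry dropout observed.",
          "Telemetry dropout after safe-mode entry, reaction wheel friction elevated.",
          "Requirement R-123 ambiguous: star tracker pointing accuracy unspecified.",
          "Pointing accuracy requirement missing tolerance for star tracker.",
      ]

      def clean(text):
          text = re.sub(r"[^A-Za-z0-9\s-]", " ", text)   # strip punctuation artifacts
          return re.sub(r"\s+", " ", text).lower().strip()

      X = TfidfVectorizer(stop_words="english").fit_transform([clean(r) for r in reports])
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      for label, report in zip(labels, reports):
          print(label, report)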

  20. Corticotropin, Repository Injection

    MedlinePlus

    Corticotropin repository injection is used to treat the following conditions:infantile spasms (seizures that usually begin during the first ... of the arms, hands, feet, and legs). Corticotropin repository injection is in a class of medications called ...

  1. 10 CFR 60.134 - Design of seals for shafts and boreholes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... GEOLOGIC REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository Operations Area § 60... the geologic repository's ability to meet the performance objectives or the period following permanent...

  2. DendroPy: a Python library for phylogenetic computing.

    PubMed

    Sukumaran, Jeet; Holder, Mark T

    2010-06-15

    DendroPy is a cross-platform library for the Python programming language that provides for object-oriented reading, writing, simulation and manipulation of phylogenetic data, with an emphasis on phylogenetic tree operations. DendroPy uses a splits-hash mapping to perform rapid calculations of tree distances, similarities and shape under various metrics. It contains rich simulation routines to generate trees under a number of different phylogenetic and coalescent models. DendroPy's data simulation and manipulation facilities, in conjunction with its support of a broad range of phylogenetic data formats (NEXUS, Newick, PHYLIP, FASTA, NeXML, etc.), allow it to serve a useful role in various phyloinformatics and phylogeographic pipelines. The stable release of the library is available for download and automated installation through the Python Package Index site (http://pypi.python.org/pypi/DendroPy), while the active development source code repository is available to the public from GitHub (http://github.com/jeetsukumaran/DendroPy).
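
    A brief usage sketch is given below, based on the DendroPy 4 API (names of functions and keyword arguments may differ slightly between releases): two small trees are compared with the Robinson-Foulds symmetric-difference metric, and a birth-death tree is simulated.

      import dendropy
      from dendropy.calculate import treecompare
      from dendropy.simulate import treesim

      # A shared taxon namespace makes the two trees comparable.
      taxa = dendropy.TaxonNamespace()
      t1 = dendropy.Tree.get(data="((A,B),(C,D));", schema="newick", taxon_namespace=taxa)
      t2 = dendropy.Tree.get(data="((A,C),(B,D));", schema="newick", taxon_namespace=taxa)
      t1.encode_bipartitions()
      t2.encode_bipartitions()
      print("RF distance:", treecompare.symmetric_difference(t1, t2))

      # Simulate a birth-death tree with 10 extant tips.
      sim_tree = treesim.birth_death_tree(birth_rate=1.0, death_rate=0.5,
                                          num_extant_tips=10)
      print(sim_tree.as_string(schema="newick"))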

  3. A Control Variate Method for Probabilistic Performance Assessment. Improved Estimates for Mean Performance Quantities of Interest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacKinnon, Robert J.; Kuhlman, Kristopher L

    2016-05-01

    We present a method of control variates for calculating improved estimates for mean performance quantities of interest, E(PQI), computed from Monte Carlo probabilistic simulations. An example of a PQI is the concentration of a contaminant at a particular location in a problem domain computed from simulations of transport in porous media. To simplify the presentation, the method is described in the setting of a one-dimensional elliptical model problem involving a single uncertain parameter represented by a probability distribution. The approach can be easily implemented for more complex problems involving multiple uncertain parameters and in particular for application to probabilistic performance assessment of deep geologic nuclear waste repository systems. Numerical results indicate the method can produce estimates of E(PQI) having superior accuracy on coarser meshes and reduce the required number of simulations needed to achieve an acceptable estimate.
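
    The general control-variate construction is easy to demonstrate on a toy problem (this is not the repository transport model of the report): to estimate E(Y), pick a correlated control X with known mean, and average Y - c(X - E(X)) with c estimated from the samples.

      # Toy control-variate estimator: E[exp(U)] with U ~ Uniform(0, 1), control X = U.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000
      u = rng.uniform(0.0, 1.0, n)

      y = np.exp(u)            # quantity of interest per sample
      x = u                    # control variate with known mean 1/2
      mu_x = 0.5

      c = np.cov(y, x)[0, 1] / np.var(x, ddof=1)   # near-optimal coefficient
      y_cv = y - c * (x - mu_x)

      print("plain MC estimate   :", y.mean(), "+/-", y.std(ddof=1) / np.sqrt(n))
      print("control-variate est.:", y_cv.mean(), "+/-", y_cv.std(ddof=1) / np.sqrt(n))
      print("exact value         :", np.e - 1)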

  4. Kinetic Modeling using BioPAX ontology

    PubMed Central

    Ruebenacker, Oliver; Moraru, Ion. I.; Schaff, James C.; Blinov, Michael L.

    2010-01-01

    Thousands of biochemical interactions are available for download from curated databases such as Reactome, Pathway Interaction Database and other sources in the Biological Pathways Exchange (BioPAX) format. However, the BioPAX ontology does not encode the necessary information for kinetic modeling and simulation. The current standard for kinetic modeling is the Systems Biology Markup Language (SBML), but only a small number of models are available in SBML format in public repositories. Additionally, reusing and merging SBML models presents a significant challenge, because often each element has a value only in the context of the given model, and information encoding biological meaning is absent. We describe a software system that enables a variety of operations facilitating the use of BioPAX data to create kinetic models that can be visualized, edited, and simulated using the Virtual Cell (VCell), including improved conversion to SBML (for use with other simulation tools that support this format). PMID:20862270

  5. Hierarchical control and performance evaluation of multi-vehicle autonomous systems

    NASA Astrophysics Data System (ADS)

    Balakirsky, Stephen; Scrapper, Chris; Messina, Elena

    2005-05-01

    This paper will describe how the Mobility Open Architecture Tools and Simulation (MOAST) framework can facilitate performance evaluations of RCS compliant multi-vehicle autonomous systems. This framework provides an environment that allows for simulated and real architectural components to function seamlessly together. By providing repeatable environmental conditions, this framework allows for the development of individual components as well as component performance metrics. MOAST is composed of high-fidelity and low-fidelity simulation systems, a detailed model of real-world terrain, actual hardware components, a central knowledge repository, and architectural glue to tie all of the components together. This paper will describe the framework's components in detail and provide an example that illustrates how the framework can be utilized to develop and evaluate a single architectural component through the use of repeatable trials and experimentation that includes both virtual and real components functioning together.

  6. Dynamic system simulation of small satellite projects

    NASA Astrophysics Data System (ADS)

    Raif, Matthias; Walter, Ulrich; Bouwmeester, Jasper

    2010-11-01

    A prerequisite to accomplish a system simulation is to have a system model holding all necessary project information in a centralized repository that can be accessed and edited by all parties involved. At the Institute of Astronautics of the Technische Universitaet Muenchen, a modular approach for modeling and dynamic simulation of satellite systems, called dynamic system simulation (DySyS), has been developed. DySyS is based on the platform-independent description language SysML to model a small satellite project with respect to the system composition and dynamic behavior. A library of specific building blocks and possible relations between these blocks has been developed. From this library a system model of the satellite of interest can be created. A mapping into a C++ simulation allows the creation of an executable system model with which simulations are performed to observe the dynamic behavior of the satellite. In this paper DySyS is used to model and simulate the dynamic behavior of small satellites, because small satellite projects can act as precursors to demonstrate the feasibility of a system model, since they are less complex than large-scale satellite projects.

  7. 10 CFR 960.5-2 - Technical guidelines.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Preclosure Guidelines § 960.5-2 Technical guidelines. The technical guidelines in this subpart set... repository and to the transportation of waste to a repository site. The third group includes conditions on...

  8. Introducing the Brassica Information Portal: Towards integrating genotypic and phenotypic Brassica crop data

    PubMed Central

    Eckes, Annemarie H.; Gubała, Tomasz; Nowakowski, Piotr; Szymczyszyn, Tomasz; Wells, Rachel; Irwin, Judith A.; Horro, Carlos; Hancock, John M.; King, Graham; Dyer, Sarah C.; Jurkowski, Wiktor

    2017-01-01

    The Brassica Information Portal (BIP) is a centralised repository for brassica phenotypic data. The site hosts trait data associated with brassica research and breeding experiments conducted on brassica crops, which are used as oilseeds, vegetables, livestock forage and fodder, and for biofuels. A key feature is the explicit management of meta-data describing the provenance and relationships between experimental plant materials, as well as trial design and trait descriptors. BIP is an open access and open source project, built on the schema of CropStoreDB, and as such can provide trait data management strategies for any crop data. A new user interface and programmatic submission/retrieval system help to simplify data access for researchers, breeders and other end-users. BIP opens up the opportunity to apply integrative, cross-project analyses to data generated by the Brassica Research Community. Here, we present a short description of the current status of the repository. PMID:28529710

  9. The Materials Commons: A Collaboration Platform and Information Repository for the Global Materials Community

    NASA Astrophysics Data System (ADS)

    Puchala, Brian; Tarcea, Glenn; Marquis, Emmanuelle. A.; Hedstrom, Margaret; Jagadish, H. V.; Allison, John E.

    2016-08-01

    Accelerating the pace of materials discovery and development requires new approaches and means of collaborating and sharing information. To address this need, we are developing the Materials Commons, a collaboration platform and information repository for use by the structural materials community. The Materials Commons has been designed to be a continuous, seamless part of the scientific workflow process. Researchers upload the results of experiments and computations as they are performed, automatically where possible, along with the provenance information describing the experimental and computational processes. The Materials Commons website provides an easy-to-use interface for uploading and downloading data and data provenance, as well as for searching and sharing data. This paper provides an overview of the Materials Commons. Concepts are also outlined for integrating the Materials Commons with the broader Materials Information Infrastructure that is evolving to support the Materials Genome Initiative.

  10. The NASA Ames Life Sciences Data Archive: Biobanking for the Final Frontier

    NASA Technical Reports Server (NTRS)

    Rask, Jon; Chakravarty, Kaushik; French, Alison J.; Choi, Sungshin; Stewart, Helen J.

    2017-01-01

    The NASA Ames Institutional Scientific Collection involves the Ames Life Sciences Data Archive (ALSDA) and a biospecimen repository, which are responsible for archiving information and non-human biospecimens collected from spaceflight and matching ground control experiments. The ALSDA also manages a biospecimen sharing program, performs curation and long-term storage operations, and facilitates distribution of biospecimens for research purposes via a public website (https://lsda.jsc.nasa.gov). As part of our best practices, a tissue viability testing plan has been developed for the repository, which will assess the quality of samples subjected to long-term storage. We expect that the test results will confirm usability of the samples, enable broader science community interest, and verify operational efficiency of the archives. This work will also support NASA open science initiatives and guide development of NASA directives and policy for curation of biological collections.

  11. Reconstructing a hydrogen-driven microbial metabolic network in Opalinus Clay rock.

    PubMed

    Bagnoud, Alexandre; Chourey, Karuna; Hettich, Robert L; de Bruijn, Ino; Andersson, Anders F; Leupin, Olivier X; Schwyn, Bernhard; Bernier-Latmani, Rizlan

    2016-10-14

    The Opalinus Clay formation will host geological nuclear waste repositories in Switzerland. It is expected that gas pressure will build up due to hydrogen production from steel corrosion, jeopardizing the integrity of the engineered barriers. In an in situ experiment located in the Mont Terri Underground Rock Laboratory, we demonstrate that hydrogen is consumed by microorganisms, fuelling a microbial community. Metagenomic binning and metaproteomic analysis of this deep subsurface community reveals a carbon cycle driven by autotrophic hydrogen oxidizers belonging to novel genera. Necromass is then processed by fermenters, followed by complete oxidation to carbon dioxide by heterotrophic sulfate-reducing bacteria, which closes the cycle. This microbial metabolic web can be integrated in the design of geological repositories to reduce pressure build-up. This study shows that Opalinus Clay harbours the potential for a chemolithoautotrophic-based system, and provides a model of the microbial carbon cycle in deep subsurface environments where hydrogen and sulfate are present.

  12. Reconstructing a hydrogen-driven microbial metabolic network in Opalinus Clay rock

    PubMed Central

    Bagnoud, Alexandre; Chourey, Karuna; Hettich, Robert L.; de Bruijn, Ino; Andersson, Anders F.; Leupin, Olivier X.; Schwyn, Bernhard; Bernier-Latmani, Rizlan

    2016-01-01

    The Opalinus Clay formation will host geological nuclear waste repositories in Switzerland. It is expected that gas pressure will build up due to hydrogen production from steel corrosion, jeopardizing the integrity of the engineered barriers. In an in situ experiment located in the Mont Terri Underground Rock Laboratory, we demonstrate that hydrogen is consumed by microorganisms, fuelling a microbial community. Metagenomic binning and metaproteomic analysis of this deep subsurface community reveals a carbon cycle driven by autotrophic hydrogen oxidizers belonging to novel genera. Necromass is then processed by fermenters, followed by complete oxidation to carbon dioxide by heterotrophic sulfate-reducing bacteria, which closes the cycle. This microbial metabolic web can be integrated in the design of geological repositories to reduce pressure build-up. This study shows that Opalinus Clay harbours the potential for a chemolithoautotrophic-based system, and provides a model of the microbial carbon cycle in deep subsurface environments where hydrogen and sulfate are present. PMID:27739431

  13. Assessing repository technology. Where do we go from here?

    NASA Technical Reports Server (NTRS)

    Eichmann, David

    1992-01-01

    Three sample information retrieval systems, archie, autoLib, and Wide Area Information Service (WAIS), are compared with regard to their expressiveness and usefulness, first in the general context of information retrieval, and then as prospective software reuse repositories. While the representational capabilities of these systems are limited, they provide a useful foundation for future repository efforts, particularly from the perspective of repository distribution and coherent user interface design.

  14. Assessing repository technology: Where do we go from here?

    NASA Technical Reports Server (NTRS)

    Eichmann, David A.

    1992-01-01

    Three sample information retrieval systems, archie, autoLib, and Wide Area Information Service (WAIS), are compared with regard to their expressiveness and usefulness, first in the general context of information retrieval, and then as prospective software reuse repositories. While the representational capabilities of these systems are limited, they provide a useful foundation for future repository efforts, particularly from the perspective of repository distribution and coherent user interface design.

  15. USAF Hearing Conservation Program, DOEHRS-HC Data Repository Annual Report: CY15

    DTIC Science & Technology

    2017-05-31

    AFRL-SA-WP-SR-2017-0014: USAF Hearing Conservation Program, DOEHRS-HC Data Repository Annual Report: CY15, by Daniel A. Williams. ... Health Readiness System-Hearing Conservation Data Repository (DOEHRS-HC DR). Major command- and installation-level reports are available quarterly.

  16. Reproducible Research in the Geosciences at Scale: Achievable Goal or Elusive Dream?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Evans, B. J. K.

    2016-12-01

    Reproducibility is a fundamental tenet of the scientific method: it implies that any researcher, or a third party working independently, can duplicate any experiment or investigation and produce the same results. Historically, computationally based research involved an individual using their own data and processing it in their own private area, often using software they wrote or inherited from close collaborators. Today, a researcher is likely to be part of a large team that will use a subset of data from an external repository and then process the data on a public or private cloud or on a large centralised supercomputer, using a mixture of their own code, third party software and libraries, or global community codes. In 'Big Geoscience' research it is common for data inputs to be extracts from externally managed dynamic data collections, where new data is being regularly appended, or existing data is revised when errors are detected and/or as processing methods are improved. New workflows increasingly use services to access data dynamically to create subsets on-the-fly from distributed sources, each of which can have a complex history. At major computational facilities, underlying systems, libraries, software and services are being constantly tuned and optimised, or new or replacement infrastructure is being installed. Likewise code used from a community repository is continually being refined, re-packaged and ported to the target platform. To achieve reproducibility, today's researcher increasingly needs to track their workflow, including querying information on the current or historical state of facilities used. Versioning methods are standard practice for software repositories or packages, but it is not common for either data repositories or data services to provide information about their state, or for systems to provide query-able access to changes in the underlying software. While a researcher can achieve transparency and describe steps in their workflow so that others can repeat them and replicate processes undertaken, they cannot achieve exact reproducibility or even transparency of results generated. In Big Geoscience, full reproducibility will be an elusive dream until data repositories and compute facilities can provide provenance information in a standards compliant, machine query-able way.

  17. Facilitating the selection and creation of accurate interatomic potentials with robust tools and characterization

    NASA Astrophysics Data System (ADS)

    Trautt, Zachary T.; Tavazza, Francesca; Becker, Chandler A.

    2015-10-01

    The Materials Genome Initiative seeks to significantly decrease the cost and time of development and integration of new materials. Within the domain of atomistic simulations, several roadblocks stand in the way of reaching this goal. While the NIST Interatomic Potentials Repository hosts numerous interatomic potentials (force fields), researchers cannot immediately determine the best choice(s) for their use case. Researchers developing new potentials, specifically those in restricted environments, lack a comprehensive portfolio of efficient tools capable of calculating and archiving the properties of their potentials. This paper elucidates one solution to these problems, which uses Python-based scripts that are suitable for rapid property evaluation and human knowledge transfer. Calculation results are visible on the repository website, which reduces the time required to select an interatomic potential for a specific use case. Furthermore, property evaluation scripts are being integrated with modern platforms to improve discoverability and access of materials property data. To demonstrate these scripts and features, we will discuss the automation of stacking fault energy calculations and their application to additional elements. While the calculation methodology was developed previously, we are using it here as a case study in simulation automation and property calculations. We demonstrate how the use of Python scripts allows for rapid calculation in a more easily managed way where the calculations can be modified, and the results presented in user-friendly and concise ways. Additionally, the methods can be incorporated into other efforts, such as openKIM.
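
    A minimal sketch of the property-evaluation-and-archiving pattern described above, assuming a placeholder evaluate_stacking_fault() function and hypothetical potential file names; the repository's actual Python scripts are not reproduced here:

        # Sketch: evaluate one property for several potentials and archive the
        # results as JSON.  evaluate_stacking_fault() is a placeholder for a real
        # calculation (e.g. a LAMMPS run); the potential file names are made up.
        import json
        from pathlib import Path

        POTENTIALS = ["Al_example.eam.alloy", "Ni_example.eam.alloy"]

        def evaluate_stacking_fault(potential: str) -> float:
            """Placeholder: a real script would build the faulted and perfect
            slabs, relax them, and return the energy difference in mJ/m^2."""
            return 0.0

        def main() -> None:
            results = [
                {
                    "potential": pot,
                    "property": "stacking_fault_energy",
                    "value_mJ_per_m2": evaluate_stacking_fault(pot),
                }
                for pot in POTENTIALS
            ]
            Path("property_archive.json").write_text(json.dumps(results, indent=2))

        if __name__ == "__main__":
            main()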

  18. 10 CFR 960.3-4 - Environmental impacts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Implementation Guidelines § 960.3-4 Environmental impacts. Environmental impacts shall be considered by the DOE throughout the site characterization, site selection, and repository development..., during site characterization and repository construction, operation, closure, and decommissioning. ...

  19. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Disturbed conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.

    2000-05-22

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) are presented for two-phase flow in the vicinity of the repository under disturbed conditions resulting from drilling intrusions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformations are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure and brine flow from the repository to the Culebra Dolomite are potentially the most important in PA for the WIPP. Subsequent to a drilling intrusion, repository pressure was dominated by borehole permeability and generally below the level (i.e., 8 MPa) that could potentially produce spallings and direct brine releases. Brine flow from the repository to the Culebra Dolomite tended to be small or nonexistent, with its occurrence and size also dominated by borehole permeability.
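
    The sampling-and-regression workflow described above can be illustrated with a toy sketch (it is not the WIPP PA code): Latin hypercube samples of two uncertain inputs drive a made-up pressure response, and Spearman rank correlations indicate which input dominates, mirroring the finding that borehole permeability controls repository pressure. The parameter ranges and response function are assumptions for illustration only:

        # Toy sensitivity analysis (not the WIPP PA codes): Latin hypercube
        # sampling of two uncertain inputs plus Spearman rank correlations with a
        # made-up "repository pressure" response.
        import numpy as np
        from scipy.stats import qmc, spearmanr

        rng = np.random.default_rng(0)
        n = 300
        unit = qmc.LatinHypercube(d=2, seed=0).random(n)

        # Assumed ranges: log10 borehole permeability and a gas-generation factor.
        scaled = qmc.scale(unit, [-14.0, 0.5], [-11.0, 2.0])
        log10_perm, gas_factor = scaled[:, 0], scaled[:, 1]

        # Toy response: pressure drops as permeability rises, rises weakly with gas generation.
        pressure_mpa = (10.0 - 2.5 * (log10_perm + 14.0) + 0.3 * gas_factor
                        + rng.normal(0.0, 0.2, n))

        for name, x in [("borehole permeability", log10_perm),
                        ("gas generation factor", gas_factor)]:
            rho, _ = spearmanr(x, pressure_mpa)
            print(f"{name}: rank correlation with pressure = {rho:+.2f}")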

  20. A Modular Repository-based Infrastructure for Simulation Model Storage and Execution Support in the Context of In Silico Oncology and In Silico Medicine.

    PubMed

    Christodoulou, Nikolaos A; Tousert, Nikolaos E; Georgiadi, Eleni Ch; Argyri, Katerina D; Misichroni, Fay D; Stamatakos, Georgios S

    2016-01-01

    The plethora of available disease prediction models and the ongoing process of their application into clinical practice - following their clinical validation - have created new needs regarding their efficient handling and exploitation. Consolidation of software implementations, descriptive information, and supportive tools in a single place, offering persistent storage as well as proper management of execution results, is a priority, especially with respect to the needs of large healthcare providers. At the same time, modelers should be able to access these storage facilities under special rights, in order to upgrade and maintain their work. In addition, the end users should be provided with all the necessary interfaces for model execution and effortless result retrieval. We therefore propose a software infrastructure, based on a tool, model and data repository that handles the storage of models and pertinent execution-related data, along with functionalities for execution management, communication with third-party applications, user-friendly interfaces to access and use the infrastructure with minimal effort and basic security features.
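
    A minimal sketch of the kind of data structures such a repository-based infrastructure might maintain for stored models and their executions; the class and field names below are illustrative assumptions, not the authors' design:

        # Hypothetical data structures for a model/execution repository; the
        # class and field names are assumptions, not the authors' design.
        from dataclasses import dataclass, field
        from datetime import datetime
        from typing import Dict, List

        @dataclass
        class ModelRecord:
            model_id: str
            title: str
            version: str
            description: str = ""
            parameters: Dict[str, str] = field(default_factory=dict)

        @dataclass
        class ExecutionRecord:
            model_id: str
            started: datetime
            status: str           # e.g. "queued", "running", "finished"
            result_uri: str = ""  # where the persistent store keeps the outputs

        class ModelRepository:
            """In-memory stand-in for the persistent storage layer."""

            def __init__(self) -> None:
                self.models: Dict[str, ModelRecord] = {}
                self.executions: List[ExecutionRecord] = []

            def register(self, model: ModelRecord) -> None:
                self.models[model.model_id] = model

            def record_execution(self, run: ExecutionRecord) -> None:
                self.executions.append(run)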

  1. A Modular Repository-based Infrastructure for Simulation Model Storage and Execution Support in the Context of In Silico Oncology and In Silico Medicine

    PubMed Central

    Christodoulou, Nikolaos A.; Tousert, Nikolaos E.; Georgiadi, Eleni Ch.; Argyri, Katerina D.; Misichroni, Fay D.; Stamatakos, Georgios S.

    2016-01-01

    The plethora of available disease prediction models and the ongoing process of their application into clinical practice – following their clinical validation – have created new needs regarding their efficient handling and exploitation. Consolidation of software implementations, descriptive information, and supportive tools in a single place, offering persistent storage as well as proper management of execution results, is a priority, especially with respect to the needs of large healthcare providers. At the same time, modelers should be able to access these storage facilities under special rights, in order to upgrade and maintain their work. In addition, the end users should be provided with all the necessary interfaces for model execution and effortless result retrieval. We therefore propose a software infrastructure, based on a tool, model and data repository that handles the storage of models and pertinent execution-related data, along with functionalities for execution management, communication with third-party applications, user-friendly interfaces to access and use the infrastructure with minimal effort and basic security features. PMID:27812280

  2. Actinide Solubility and Speciation in the WIPP [PowerPoint]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, Donald T.

    2015-11-02

    The presentation begins with the role of and need for nuclear repositories (overall concept; international updates (Sweden, Finland, France, China); US approach and current status), then moves on to the WIPP TRU repository concept (design; current status, including the safety incidents of February 5 and 14, 2014; path forward), and finally considers the WIPP safety case: dissolved actinide concentrations (overall approach, oxidation state distribution and redox control, solubility of actinides, colloidal contribution and microbial effects). The following conclusions are set forth: (1) International programs are moving forward, but at a very slow and somewhat sporadic pace. (2) In the United States, the salt repository concept, from the perspective of the long-term safety case, remains a viable option for nuclear waste management despite the current operational issues/concerns. (3) Current model/PA predictions (WIPP example) are built on redundant conservatisms. These conservatisms are being addressed in ongoing and future research to fill existing data gaps: redox control of plutonium by Fe(0, II), thorium (analog) solubility studies in simulated brine, contribution of intrinsic and biocolloids to the mobile concentration, and clarification of microbial ecology and effects.

  3. WholeCellSimDB: a hybrid relational/HDF database for whole-cell model predictions.

    PubMed

    Karr, Jonathan R; Phillips, Nolan C; Covert, Markus W

    2014-01-01

    Mechanistic 'whole-cell' models are needed to develop a complete understanding of cell physiology. However, extracting biological insights from whole-cell models requires running and analyzing large numbers of simulations. We developed WholeCellSimDB, a database for organizing whole-cell simulations. WholeCellSimDB was designed to enable researchers to search simulation metadata to identify simulations for further analysis, and to quickly slice and aggregate simulation results data. In addition, WholeCellSimDB enables users to share simulations with the broader research community. The database uses a hybrid relational/hierarchical data format architecture to efficiently store and retrieve both simulation setup metadata and results data. WholeCellSimDB provides a graphical Web-based interface to search, browse, plot and export simulations; a JavaScript Object Notation (JSON) Web service to retrieve data for Web-based visualizations; a command-line interface to deposit simulations; and a Python API to retrieve data for advanced analysis. Overall, we believe WholeCellSimDB will help researchers use whole-cell models to advance basic biological science and bioengineering. Database URL: http://www.wholecellsimdb.org. Source code repository: http://github.com/CovertLab/WholeCellSimDB.
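
    The hybrid relational/hierarchical storage pattern can be illustrated with a small sketch that pairs SQLite metadata with HDF5 result arrays (via h5py); this shows the general idea only and is not WholeCellSimDB's actual schema or API:

        # Hybrid storage sketch (not WholeCellSimDB's schema or API): relational
        # metadata for queries, HDF5 for bulky results.  Requires h5py and numpy.
        import sqlite3
        import h5py
        import numpy as np

        # Relational side: searchable simulation metadata.
        db = sqlite3.connect("simulations.sqlite")
        db.execute("CREATE TABLE IF NOT EXISTS simulation "
                   "(id INTEGER PRIMARY KEY, label TEXT, length_s REAL, hdf5_path TEXT)")
        db.execute("INSERT INTO simulation (label, length_s, hdf5_path) VALUES (?, ?, ?)",
                   ("toy run", 3600.0, "sim_1.h5"))
        db.commit()

        # Hierarchical side: results arrays keyed by state variable.
        with h5py.File("sim_1.h5", "w") as f:
            f.create_dataset("states/growth", data=np.random.rand(3600))

        # A query hits the relational store first, then loads only the arrays it needs.
        (path,) = db.execute(
            "SELECT hdf5_path FROM simulation WHERE length_s >= 3600").fetchone()
        with h5py.File(path, "r") as f:
            first_100 = f["states/growth"][:100]   # slice without reading the full dataset
        print(first_100.shape)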

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrada, J.J.

    This report compiles preliminary information that supports the premise that a repository is needed in Latin America and analyzes the nuclear situation (mainly in Argentina and Brazil) in terms of nuclear capabilities, inventories, and regional spent-fuel repositories. The report is based on several sources and summarizes (1) the nuclear capabilities in Latin America and establishes the framework for the need of a permanent repository, (2) the International Atomic Energy Agency (IAEA) approach for a regional spent-fuel repository and describes the support that international institutions are lending to this issue, (3) the current situation in Argentina in order to analyze the Argentinean willingness to find a location for a deep geological repository, and (4) the issues involved in selecting a location for the repository and identifies a potential location. This report then draws conclusions based on an analysis of this information. The focus of this report is mainly on spent fuel and does not elaborate on other radiological waste sources.

  5. Influence analysis of Github repositories.

    PubMed

    Hu, Yan; Zhang, Jun; Bai, Xiaomei; Yu, Shuo; Yang, Zhuo

    2016-01-01

    With the support of cloud computing techniques, social coding platforms have changed the style of software development. Github is now the most popular social coding platform and project hosting service. Software developers of all levels keep joining Github and use it to host their public and private software projects. The large number of software developers and software repositories on Github poses new challenges to the world of software engineering. This paper tries to tackle one of the important problems: analyzing the importance and influence of Github repositories. We propose a HITS-based influence analysis on graphs that represent the star relationship between Github users and repositories. A weighted version of HITS is applied to the overall star graph and generates a different set of top influential repositories than the standard version of the HITS algorithm. We also conduct the influence analysis on per-month star graphs and study the monthly influence ranking of top repositories.
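
    A toy sketch of the underlying idea, not the paper's implementation: HITS-style power iteration on a small weighted bipartite star graph, with users as hubs and repositories as authorities. The example users, repositories and edge weights are made up:

        # Toy weighted HITS on a bipartite star graph: users are hubs,
        # repositories are authorities.  Users, repositories and weights are made up.
        import numpy as np

        users = ["u1", "u2", "u3"]
        repos = ["repoA", "repoB"]
        # W[i, j] > 0 means user i starred repository j (weights could encode recency etc.).
        W = np.array([[1.0, 0.0],
                      [1.0, 2.0],
                      [0.0, 1.0]])

        hub = np.ones(len(users))
        authority = np.ones(len(repos))
        for _ in range(50):                 # power iteration until the scores settle
            authority = W.T @ hub
            hub = W @ authority
            authority /= np.linalg.norm(authority)
            hub /= np.linalg.norm(hub)

        for repo, score in sorted(zip(repos, authority), key=lambda t: -t[1]):
            print(f"{repo}: authority (influence) score = {score:.3f}")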

  6. Biological Web Service Repositories Review

    PubMed Central

    Urdidiales‐Nieto, David; Navas‐Delgado, Ismael

    2016-01-01

    Web services play a key role in bioinformatics, enabling the integration of database access and analysis algorithms. However, Web service repositories do not usually publish information on the changes made to their registered Web services. Dynamism is directly related to changes in the repositories (services registered or unregistered) and at the service level (annotation changes). Thus, users, software clients or workflow-based approaches lack enough relevant information to decide when they should review or re-execute a Web service or workflow to get updated or improved results. The dynamism of the repository could be a measure for workflow developers to re-check service availability and annotation changes in the services of interest to them. This paper presents a review of the most well-known Web service repositories in the life sciences, including an analysis of their dynamism. Freshness is introduced in this paper and has been used as the measure for the dynamism of these repositories. PMID:27783459

  7. Creation of Data Repositories to Advance Nursing Science.

    PubMed

    Perazzo, Joseph; Rodriguez, Margaret; Currie, Jackson; Salata, Robert; Webel, Allison R

    2017-12-01

    Data repositories are a strategy in line with precision medicine and big data initiatives, and are an efficient way to maximize data utility and form collaborative research relationships. Nurse researchers are uniquely positioned to make a valuable contribution using this strategy. The purpose of this article is to present a review of the benefits and challenges associated with developing data repositories, and to describe the process we used to develop and maintain a data repository in HIV research. Systematic planning, data collection, synthesis, and data sharing have enabled us to conduct robust cross-sectional and longitudinal analyses with more than 200 people living with HIV. Our repository building has also led to collaboration and training, both in and out of our organization. We present a pragmatic and affordable way that nurse scientists can build and maintain a data repository, helping us continue to add to our understanding of health phenomena.

  8. Classifying Clinical Trial Eligibility Criteria to Facilitate Phased Cohort Identification Using Clinical Data Repositories.

    PubMed

    Wang, Amy Y; Lancaster, William J; Wyatt, Matthew C; Rasmussen, Luke V; Fort, Daniel G; Cimino, James J

    2017-01-01

    A major challenge in using electronic health record repositories for research is the difficulty matching subject eligibility criteria to query capabilities of the repositories. We propose categories for study criteria corresponding to the effort needed for querying those criteria: "easy" (supporting automated queries), mixed (initial automated querying with manual review), "hard" (fully manual record review), and "impossible" or "point of enrollment" (not typically in health repositories). We obtained a sample of 292 criteria from 20 studies from ClinicalTrials.gov. Six independent reviewers, three each from two academic research institutions, rated criteria according to our four types. We observed high interrater reliability both within and between institutions. The analysis demonstrated typical features of criteria that map with varying levels of difficulty to repositories. We propose using these features to improve enrollment workflow through more standardized study criteria, self-service repository queries, and analyst-mediated retrievals.
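
    Interrater agreement of the kind reported above is commonly quantified with Cohen's kappa; the sketch below shows the calculation on made-up ratings for the four proposed categories (it does not use the study's data):

        # Toy agreement check on made-up ratings (not the study's data):
        # Cohen's kappa between two raters using the four proposed categories.
        from sklearn.metrics import cohen_kappa_score

        categories = ["easy", "mixed", "hard", "impossible"]
        rater_a = ["easy", "easy", "mixed", "hard", "impossible", "easy", "mixed", "hard"]
        rater_b = ["easy", "mixed", "mixed", "hard", "impossible", "easy", "easy", "hard"]

        kappa = cohen_kappa_score(rater_a, rater_b, labels=categories)
        print(f"Cohen's kappa between the two raters: {kappa:.2f}")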

  9. Space Telecommunications Radio System (STRS) Application Repository Design and Analysis

    NASA Technical Reports Server (NTRS)

    Handler, Louis M.

    2013-01-01

    The Space Telecommunications Radio System (STRS) Application Repository Design and Analysis document describes the STRS application repository for software-defined radio (SDR) applications intended to be compliant with the STRS Architecture Standard. The document provides information about the submission of artifacts to the STRS application repository, provides information to the potential users of those artifacts, and helps the systems engineer understand the requirements, concepts, and approach of the STRS application repository. The STRS application repository is intended to capture knowledge, documents, and other artifacts for each waveform application or other application outside of its project so that when the project ends, the knowledge is retained. The document describes the transmission of technology from mission to mission, capturing lessons learned that are used for continuous improvement across projects and supporting NASA Procedural Requirements (NPRs) for performing software engineering projects and NASA's release process.

  10. [The subject repositories of strategy of the Open Access initiative].

    PubMed

    Soares Guimarães, M C; da Silva, C H; Horsth Noronha, I

    2012-11-01

    Subject repositories are defined as sets of digital objects resulting from research in a specific disciplinary field, and they still occupy a restricted space in the discussion agenda of the Open Access movement compared with the breadth of the discussion of institutional repositories. Although subject repositories have come to prominence in the field, especially through the success of initiatives such as arXiv, PubMed and E-prints, the literature on the subject is recognized as very limited. Despite their roots in library and information science, and their focus on the management of disciplinary collections (subject-area literature), there is little information available about the development and management of subject repositories. The following text provides a brief summary of the topic and presents the potential of developing subject repositories as a way to strengthen the open access initiative.

  11. Classifying Clinical Trial Eligibility Criteria to Facilitate Phased Cohort Identification Using Clinical Data Repositories

    PubMed Central

    Wang, Amy Y.; Lancaster, William J.; Wyatt, Matthew C.; Rasmussen, Luke V.; Fort, Daniel G.; Cimino, James J.

    2017-01-01

    A major challenge in using electronic health record repositories for research is the difficulty matching subject eligibility criteria to query capabilities of the repositories. We propose categories for study criteria corresponding to the effort needed for querying those criteria: “easy” (supporting automated queries), mixed (initial automated querying with manual review), “hard” (fully manual record review), and “impossible” or “point of enrollment” (not typically in health repositories). We obtained a sample of 292 criteria from 20 studies from ClinicalTrials.gov. Six independent reviewers, three each from two academic research institutions, rated criteria according to our four types. We observed high interrater reliability both within and between institutions. The analysis demonstrated typical features of criteria that map with varying levels of difficulty to repositories. We propose using these features to improve enrollment workflow through more standardized study criteria, self-service repository queries, and analyst-mediated retrievals. PMID:29854246

  12. Consumption and diffusion of dissolved oxygen in sedimentary rocks.

    PubMed

    Manaka, M; Takeda, M

    2016-10-01

    Fe(II)-bearing minerals (e.g., biotite, chlorite, and pyrite) are a promising reducing agent for the consumption of atmospheric oxygen in repositories for the geological disposal of high-level radioactive waste. To estimate effective diffusion coefficients (D_e, in m^2 s^-1) for dissolved oxygen (DO) and the reaction rates for the oxidation of Fe(II)-bearing minerals in a repository environment, we conducted diffusion-chemical reaction experiments using intact rock samples of Mizunami sedimentary rock. In addition, we conducted batch experiments on the oxidation of crushed sedimentary rock by DO in a closed system. From the results of the diffusion-chemical reaction experiments, we estimated the values of D_e for DO to lie within the range 2.69×10^-11
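
    For context, the one-dimensional balance commonly used to interpret such diffusion-chemical reaction experiments can be written as below; this is a generic form offered for orientation, not necessarily the authors' exact formulation, with C the DO concentration, \alpha a rock capacity factor, D_e the effective diffusion coefficient, and R(C) the rate of oxygen consumption by Fe(II)-bearing minerals:

        \alpha \frac{\partial C}{\partial t}
            = D_e \frac{\partial^2 C}{\partial x^2} - R(C)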

  13. A Safety Case Approach for Deep Geologic Disposal of DOE HLW and DOE SNF in Bedded Salt - 13350

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevougian, S. David; MacKinnon, Robert J.; Leigh, Christi D.

    2013-07-01

    The primary objective of this study is to investigate the feasibility and utility of developing a defensible safety case for disposal of United States Department of Energy (U.S. DOE) high-level waste (HLW) and DOE spent nuclear fuel (SNF) in a conceptual deep geologic repository that is assumed to be located in a bedded salt formation of the Delaware Basin [1]. A safety case is a formal compilation of evidence, analyses, and arguments that substantiate and demonstrate the safety of a proposed or conceptual repository. We conclude that a strong initial safety case for potential licensing can be readily compiled by capitalizing on the extensive technical basis that exists from prior work on the Waste Isolation Pilot Plant (WIPP), other U.S. repository development programs, and the work published through international efforts in salt repository programs such as in Germany. The potential benefits of developing a safety case include leveraging previous investments in WIPP to reduce future new repository costs, enhancing the ability to effectively plan for a repository and its licensing, and possibly expediting a schedule for a repository. A safety case will provide the necessary structure for organizing and synthesizing existing salt repository science and identifying any issues and gaps pertaining to safe disposal of DOE HLW and DOE SNF in bedded salt. The safety case synthesis will help DOE to plan its future R and D activities for investigating salt disposal using a risk-informed approach that prioritizes test activities that include laboratory, field, and underground investigations. It should be emphasized that the DOE has not made any decisions regarding the disposition of DOE HLW and DOE SNF. Furthermore, the safety case discussed herein is not intended to either site a repository in the Delaware Basin or preclude siting in other media at other locations. Rather, this study simply presents an approach for accelerated development of a safety case for a potential DOE HLW and DOE SNF repository using the currently available technical basis for bedded salt. This approach includes a summary of the regulatory environment relevant to disposal of DOE HLW and DOE SNF in a deep geologic repository, the key elements of a safety case, the evolution of the safety case through the successive phases of repository development and licensing, and the existing technical basis that could be used to substantiate the safety of a geologic repository if it were to be sited in the Delaware Basin. We also discuss the potential role of an underground research laboratory (URL). (authors)

  14. Implementation Strategies for Large-Scale Transport Simulations Using Time Domain Particle Tracking

    NASA Astrophysics Data System (ADS)

    Painter, S.; Cvetkovic, V.; Mancillas, J.; Selroos, J.

    2008-12-01

    Time domain particle tracking is an emerging alternative to the conventional random walk particle tracking algorithm. With time domain particle tracking, particles are moved from node to node on one-dimensional pathways defined by streamlines of the groundwater flow field or by discrete subsurface features. The time to complete each deterministic segment is sampled from residence time distributions that include the effects of advection, longitudinal dispersion, a variety of kinetically controlled retention (sorption) processes, linear transformation, and temporal changes in groundwater velocities and sorption parameters. The simulation results in a set of arrival times at a monitoring location that can be post-processed with a kernel method to construct mass discharge (breakthrough) versus time. Implementation strategies differ for discrete flow (fractured media) systems and continuous porous media systems. The implementation strategy also depends on the scale at which hydraulic property heterogeneity is represented in the supporting flow model. For flow models that explicitly represent discrete features (e.g., discrete fracture networks), the sampling of residence times along segments is conceptually straightforward. For continuous porous media, such sampling needs to be related to the Lagrangian velocity field. Analytical or semi-analytical methods may be used to approximate the Lagrangian segment velocity distributions in aquifers with low-to-moderate variability, thereby capturing transport effects of subgrid velocity variability. If variability in hydraulic properties is large, however, Lagrangian velocity distributions are difficult to characterize and numerical simulations are required; in particular, numerical simulations are likely to be required for estimating the velocity integral scale as a basis for advective segment distributions. Aquifers with evolving heterogeneity scales present additional challenges. Large-scale simulations of radionuclide transport at two potential repository sites for high-level radioactive waste will be used to demonstrate the potential of the method. The simulations considered approximately 1000 source locations, multiple radionuclides with contrasting sorption properties, and abrupt changes in groundwater velocity associated with future glacial scenarios. Transport pathways linking the source locations to the accessible environment were extracted from discrete feature flow models that include detailed representations of the repository construction (tunnels, shafts, and emplacement boreholes) embedded in stochastically generated fracture networks. Acknowledgment: The authors are grateful to the SwRI Advisory Committee for Research, the Swedish Nuclear Fuel and Waste Management Company, and Posiva Oy for financial support.
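
    The core idea can be illustrated with a toy sketch (not the authors' code): each particle traverses a fixed chain of pathway segments, accumulating one sampled residence time per segment, and the resulting arrival times are smoothed with a kernel estimate to form a breakthrough curve. The segment parameters and the lognormal stand-in distributions are assumptions for illustration only:

        # Toy time domain particle tracking (not the authors' code): particles
        # accumulate one sampled residence time per pathway segment; a kernel
        # estimate of the arrival times gives the breakthrough curve.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(1)
        n_particles = 5000
        segments = [(50.0, 0.4),   # (median residence time in years, lognormal spread)
                    (120.0, 0.6),  # values are assumptions for illustration
                    (30.0, 0.3)]

        arrival = np.zeros(n_particles)
        for median, sigma in segments:
            # A lognormal stands in for the full advection-dispersion-retention
            # residence time distribution used in practice.
            arrival += rng.lognormal(mean=np.log(median), sigma=sigma, size=n_particles)

        kde = gaussian_kde(arrival)
        times = np.linspace(arrival.min(), arrival.max(), 200)
        breakthrough = kde(times)              # relative mass discharge versus time
        print(f"median travel time ~ {np.median(arrival):.0f} years")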

  15. Quantification of Cation Sorption to Engineered Barrier Materials Under Extreme Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Brian; Schlautman, Mark; Rao, Linfeng

    The objective of this research is to examine mechanisms and thermodynamics of actinide sorption to engineered barrier materials (iron (oxyhydr)oxides and bentonite clay) for nuclear waste repositories under high temperature and high ionic strength conditions using a suite of macroscopic and microscopic techniques which will be coupled with interfacial reaction models. Gaining a mechanistic understanding of interfacial processes governing the sorption/sequestration of actinides at mineral-water interfaces is fundamental for the accurate prediction of actinide behavior in waste repositories. Although macroscale sorption data and various spectroscopic techniques have provided valuable information regarding speciation of actinides at solid-water interfaces, significant knowledge gaps still exist with respect to sorption mechanisms and the ability to quantify sorption, particularly at high temperatures and ionic strengths. This objective is addressed through three major tasks: (1) influence of oxidation state on actinide sorption to iron oxides and clay minerals at elevated temperatures and ionic strengths; (2) calorimetric titrations of actinide-mineral suspensions; (3) evaluation of bentonite performance under repository conditions. The results of the work will include a qualitative conceptual model and a quantitative thermodynamic speciation model describing actinide partitioning to minerals and sediments, which is based upon a mechanistic understanding of specific sorption processes as determined from both micro-scale and macroscale experimental techniques. The speciation model will be a thermodynamic aqueous and surface complexation model of actinide interactions with mineral surfaces that is self-consistent with macroscopic batch sorption data, calorimetric and potentiometric titrations, X-ray absorption spectroscopy (XAS, mainly Extended X-ray Absorption Fine Structure (EXAFS)), and electron microscopy analyses. The novelty of the proposed work lies largely in the unique system conditions which will be examined (i.e. elevated temperature and ionic strength) and the manner in which the surface complexation model will be developed in terms of specific surface species identified using XAS. These experiments will thus provide a fundamental understanding of the chemical and physical processes occurring at the solid-solution interface under expected repository conditions. Additionally, the focus on thermodynamic treatment of actinide ion interactions with minerals as proposed will provide information on the driving forces involved and contribute to the overall understanding of the high affinity many actinide ions have for oxide surfaces. The utility of this model will be demonstrated in this work through a series of advective and diffusive flow experiments.

  16. Effects of Heat Generation on Nuclear Waste Disposal in Salt

    NASA Astrophysics Data System (ADS)

    Clayton, D. J.

    2008-12-01

    Disposal of nuclear waste in salt is an established technology, as evidenced by the successful operations of the Waste Isolation Pilot Plant (WIPP) since 1999. The WIPP is located in bedded salt in southeastern New Mexico and is a deep underground facility for transuranic (TRU) nuclear waste disposal. There are many advantages for placing radioactive wastes in a geologic bedded-salt environment. One desirable mechanical characteristic of salt is that it flows plastically with time ("creeps"). The rate of salt creep is a strong function of temperature and stress differences. Higher temperatures and deviatoric stresses increase the creep rate. As the salt creeps, induced fractures may be closed and eventually healed, which then effectively seals the waste in place. With a backfill of crushed salt emplaced around the waste, the salt creep can cause the crushed salt to reconsolidate and heal to a state similar to intact salt, serving as an efficient seal. Experiments in the WIPP were conducted to investigate the effects of heat generation on the important phenomena and processes in and around the repository (Munson et al. 1987; 1990; 1992a; 1992b). Brine migration towards the heaters was induced from the thermal gradient, while salt creep rates showed an exponential dependence on temperature. The project "Backfill and Material Behavior in Underground Salt Repositories, Phase II" (BAMBUS II) studied the crushed salt backfill and material behavior with heat generation at the Asse mine located near Remlingen, Germany (Bechthold et al. 2004). Increased salt creep rates and significant reconsolidation of the crushed salt were observed at the termination of the experiment. Using the data provided from both projects, exploratory modeling of the thermal-mechanical response of salt has been conducted with varying thermal loading and waste spacing. Increased thermal loading and decreased waste spacing drive the system to higher temperatures, while both factors are desired to reduce costs, as well as decrease the overall footprint of the repository. Higher temperatures increase the rate of salt creep which then effectively seals the waste quicker. Data of the thermal-mechanical response of salt at these higher temperatures is needed to further validate the exploratory modeling and provide meaningful constraints on the repository design. Sandia is a multi-program laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.

  17. New Features of the re3data Registry of Research Data Repositories

    NASA Astrophysics Data System (ADS)

    Elger, K.; Pampel, H.; Vierkant, P.; Witt, M.

    2016-12-01

    re3data is a registry of research data repositories that lists over 1,600 repositories from around the world, making it the largest and most comprehensive online catalog of data repositories on the web. The registry offers researchers, funding agencies, libraries and publishers a comprehensive overview of the heterogeneous landscape of data repositories. The repositories are described following the "Metadata Schema for the Description of Research Data Repositories". re3data summarises the properties of a repository into a user-friendly icon system, helping users to easily identify an adequate repository for the storage of their datasets. The re3data entries are curated by an international, multi-disciplinary editorial board. An application programming interface (API) enables other information systems to list and fetch metadata for integration and interoperability. Funders like the European Commission (2015) and publishers like Springer Nature (2016) recommend the use of re3data.org in their policies. The original re3data project partners are the GFZ German Research Centre for Geosciences, the Humboldt-Universität zu Berlin, the Purdue University Libraries and the Karlsruhe Institute of Technology (KIT). Since 2015, re3data has been operated as a service of DataCite, a global non-profit organisation that provides persistent identifiers (DOIs) for research data. At the 2016 AGU Fall Meeting we will describe the current status of re3data. An overview of the major developments and new features will be given. Furthermore, we will present our plans to increase the quality of the re3data entries.

  18. 10 CFR 960.5-2-3 - Meteorology.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Preclosure Guidelines Preclosure Radiological Safety § 960.5-2-3 Meteorology. (a) Qualifying condition. The site shall be located such that expected meteorological conditions during repository.... Prevailing meteorological conditions such that any radioactive releases to the atmosphere during repository...

  19. Federated queries of clinical data repositories: the sum of the parts does not equal the whole

    PubMed Central

    Weber, Griffin M

    2013-01-01

    Background and objective: In 2008 we developed a shared health research information network (SHRINE), which for the first time enabled research queries across the full patient populations of four Boston hospitals. It uses a federated architecture, where each hospital returns only the aggregate count of the number of patients who match a query. This allows hospitals to retain control over their local databases and comply with federal and state privacy laws. However, because patients may receive care from multiple hospitals, the result of a federated query might differ from what the result would be if the query were run against a single central repository. This paper describes the situations when this happens and presents a technique for correcting these errors. Methods: We use a one-time process of identifying which patients have data in multiple repositories by comparing one-way hash values of patient demographics. This enables us to partition the local databases such that all patients within a given partition have data at the same subset of hospitals. Federated queries are then run separately on each partition independently, and the combined results are presented to the user. Results: Using theoretical bounds and simulated hospital networks, we demonstrate that once the partitions are made, SHRINE can produce more precise estimates of the number of patients matching a query. Conclusions: Uncertainty in the overlap of patient populations across hospitals limits the effectiveness of SHRINE and other federated query tools. Our technique reduces this uncertainty while retaining an aggregate federated architecture. PMID:23349080
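
    The overlap-detection step can be sketched as follows, with hypothetical hospital names and demographics; this illustrates the hashing-and-partitioning idea only and is not the SHRINE implementation:

        # Sketch of the overlap-detection idea (hospital names, demographics and
        # hash inputs are illustrative, not the SHRINE implementation).
        import hashlib
        from collections import defaultdict

        def demographic_hash(name: str, birth_date: str, zip_code: str) -> str:
            token = f"{name.lower()}|{birth_date}|{zip_code}"
            return hashlib.sha256(token.encode()).hexdigest()

        # Each hospital shares only one-way hashes, never raw identifiers.
        hospital_hashes = {
            "hospital_A": {demographic_hash("ann doe", "1970-01-01", "02115"),
                           demographic_hash("bob roe", "1980-05-05", "02139")},
            "hospital_B": {demographic_hash("ann doe", "1970-01-01", "02115")},
        }

        # Partition patients by the subset of hospitals that hold their data.
        partitions = defaultdict(set)
        for hospital, hashes in hospital_hashes.items():
            for h in hashes:
                members = frozenset(k for k, s in hospital_hashes.items() if h in s)
                partitions[members].add(h)

        for members, patients in partitions.items():
            print(sorted(members), "->", len(patients), "patient(s)")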

  20. A Prototype Performance Assessment Model for Generic Deep Borehole Repository for High-Level Nuclear Waste - 12132

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Arnold, Bill W.; Swift, Peter N.

    2012-07-01

    A deep borehole repository is one of the four geologic disposal system options currently under study by the U.S. DOE to support the development of a long-term strategy for geologic disposal of commercial used nuclear fuel (UNF) and high-level radioactive waste (HLW). The immediate goal of the generic deep borehole repository study is to develop the necessary modeling tools to evaluate and improve the understanding of the repository system response and processes relevant to long-term disposal of UNF and HLW in a deep borehole. A prototype performance assessment model for a generic deep borehole repository has been developed using the approach for a mined geological repository. The preliminary results from the simplified deep borehole generic repository performance assessment indicate that soluble, non-sorbing (or weakly sorbing) fission product radionuclides, such as I-129, Se-79 and Cl-36, are the likely major dose contributors, and that the annual radiation doses to hypothetical future humans associated with those releases may be extremely small. While much work needs to be done to validate the model assumptions and parameters, these preliminary results highlight the importance of a robust seal design in assuring long-term isolation, and suggest that deep boreholes may be a viable alternative to mined repositories for disposal of both HLW and UNF. (authors)

  1. 10 CFR 960.5-2-5 - Environmental quality.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Preclosure Guidelines Environment, Socioeconomics, and Transportation § 960.5-2-5 Environmental... repository siting, construction, operation, closure, and decommissioning, and projected environmental impacts... of the repository or its support facilities on, a component of the National Park System, the National...

  2. 10 CFR 60.1 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES General..., special nuclear, and byproduct material at a geologic repository operations area sited, constructed, or... at a geologic repository operations area sited, constructed, or operated at Yucca Mountain, Nevada...

  3. 10 CFR 60.31 - Construction authorization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORIES Licenses Construction Authorization § 60.31 Construction authorization. Upon review and... in a geologic repository operations area of the design proposed without unreasonable risk to the...: (1) DOE has described the proposed geologic repository including but not limited to: (i) The geologic...

  4. 10 CFR 960.5-2-6 - Socioeconomic impacts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Preclosure Guidelines Environment, Socioeconomics, and Transportation § 960.5-2-6 Socioeconomic... and/or economic impacts induced in communities and surrounding regions by repository siting... significant repository-related impacts on community services, housing supply and demand, and the finances of...

  5. 10 CFR 60.15 - Site characterization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses... the geologic repository to the extent practical. (2) The number of exploratory boreholes and shafts... characterization. (3) To the extent practical, exploratory boreholes and shafts in the geologic repository...

  6. 10 CFR 960.5-1 - System guidelines.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Preclosure Guidelines § 960.5-1 System guidelines. (a) Qualifying conditions—(1) Preclosure... radioactive materials to restricted and unrestricted areas during repository operation and closure shall meet... repository siting, construction, operation, closure, and decommissioning the public and the environment shall...

  7. Air and groundwater flow at the interface between fractured host rock and a bentonite buffer

    NASA Astrophysics Data System (ADS)

    Dessirier, B.; Jarsjo, J.; Frampton, A.

    2014-12-01

    Designs of deep geological repositories for spent nuclear fuel include several levels of confinement. The Swedish and Finnish KBS-3 concept, for example, targets sparsely fractured crystalline bedrock as the host formation and would have the waste canisters embedded in an engineered buffer of compacted MX-80 bentonite. The host rock is a highly heterogeneous dual-porosity material containing fractures and a rock matrix. Bentonite is a complex expansive porous material whose water content and mechanical properties are interdependent. Beyond the specific physics of unsaturated flow and transport in each medium, the interface between them is critical. Detailed knowledge of the transitory two-phase flow regime, induced by the insertion of the unsaturated buffer into a saturated rock environment, is necessary to assess the performance of planned KBS-3 deposition holes. A set of numerical simulations based on the equations of two-phase flow for water and air in porous media was conducted to investigate the dynamics of air and groundwater flow near the rock/bentonite interface in the period following installation of the unsaturated bentonite buffer. We assume two-phase flow parameter values for bentonite derived from laboratory water uptake tests, and typical fracture and rock properties from the Äspö Hard Rock Laboratory (Sweden) gathered under several field characterization campaigns. The results point to desaturation of the rock domain as far as 10 cm away from the interface into matrix-dominated regions for up to 160 days. Similar observations were made during the Bentonite Rock Interaction Experiment (BRIE) at the Äspö HRL, with desaturation sustained for even longer times. More than the mere time to mechanical and hydraulic equilibrium, the occurrence of sustained unsaturated conditions opens the possibility of biogeochemical processes that could be critical in the safety assessment of the planned repository.
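
    For orientation, the "equations of two-phase flow for water and air in porous media" usually refer to mass balances of the following generic form; the study's specific constitutive relations (e.g., retention and relative permeability curves for MX-80 bentonite and the fractured rock) are not stated in this record, so this is a hedged sketch rather than the authors' exact formulation:

        \frac{\partial}{\partial t}\left(\phi \rho_\alpha S_\alpha\right)
            + \nabla \cdot \left(\rho_\alpha \mathbf{q}_\alpha\right) = Q_\alpha,
        \qquad
        \mathbf{q}_\alpha = -\frac{k\, k_{r\alpha}(S_\alpha)}{\mu_\alpha}
            \left(\nabla p_\alpha - \rho_\alpha \mathbf{g}\right),
        \qquad \alpha \in \{w, a\},
        \qquad S_w + S_a = 1,
        \qquad p_a - p_w = p_c(S_w)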

  8. A mass storage system for supercomputers based on Unix

    NASA Technical Reports Server (NTRS)

    Richards, J.; Kummell, T.; Zarlengo, D. G.

    1988-01-01

    The authors present the design, implementation, and utilization of a large mass storage subsystem (MSS) for the numerical aerodynamics simulation. The MSS supports a large networked, multivendor Unix-based supercomputing facility. The MSS at Ames Research Center provides all processors on the numerical aerodynamics system processing network, from workstations to supercomputers, the ability to store large amounts of data in a highly accessible, long-term repository. The MSS uses Unix System V and is capable of storing hundreds of thousands of files ranging from a few bytes to 2 Gb in size.

  9. Grid Modernization Laboratory Consortium - Testing and Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroposki, Benjamin; Skare, Paul; Pratt, Rob

    This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U.S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U.S. energy goals.

  10. Norm - contaminated iodine production facilities decommissioning in Turkmenistan: experience and results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gelbutovskiy, Alexander; Cheremisin, Peter; Egorov, Alexander

    2013-07-01

    This report summarizes the data, including the cost parameters, of the former iodine production facilities decommissioning project in Turkmenistan. Before the closure, these facilities were producing iodine from underground mineral water by the method of charcoal adsorption. The main results of the site remediation, transportation, and disposal campaigns at the Balkanabat iodine and Khazar chemical plants are presented. The rehabilitated area covers 47.5 thousand square meters. The main characteristics of the remediation equipment, the technical solutions, and the rehabilitation operations performed are also indicated. The report shows the types of waste shipping containers and the quantity and nature of the logistics operations. The project waste turnover is about 2 million ton-kilometers. The problems encountered during the remediation of the Khazar chemical plant site are discussed: undetected waste quantities discovered during the operational activities required additional volume in the disposal facility. An additional repository wall superstructure was designed and erected to accommodate this additional waste. Data on the volume and characteristics of the NORM waste disposed are given: in total, 60.4 thousand cubic meters of NORM with a total activity of 1,439 x 10^9 Bq (38.89 Ci) were disposed of. The report summarizes the project implementation results from 2009 to 15.02.2012 (the date of the repository closure and its placement under controlled supervision), including monitoring results within a year after the repository closure. (authors)

  11. Permanent Disposal of Nuclear Waste in Salt

    NASA Astrophysics Data System (ADS)

    Hansen, F. D.

    2016-12-01

    Salt formations hold promise for eternal removal of nuclear waste from our biosphere. Germany and the United States have ample salt formations for this purpose, ranging from flat-bedded formations to geologically mature dome structures. Both nations are revisiting nuclear waste disposal options, accompanied by extensive collaboration on applied salt repository research, design, and operation. Salt formations provide isolation while geotechnical barriers reestablish impermeability after waste is placed in the geology. Between excavation and closure, physical, mechanical, thermal, chemical, and hydrological processes ensue. Salt response over a range of stress and temperature has been characterized for decades. Research practices employ refined test techniques and controls, which improve parameter assessment for features of the constitutive models. Extraordinary computational capabilities require exacting understanding of laboratory measurements and objective interpretation of modeling results. A repository for heat-generative nuclear waste provides an engineering challenge beyond common experience. Long-term evolution of the underground setting is precluded from direct observation or measurement. Therefore, analogues and modeling predictions are necessary to establish enduring safety functions. A strong case for granular salt reconsolidation and a focused research agenda support salt repository concepts that include safety-by-design. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. Author: F. D. Hansen, Sandia National Laboratories

  12. Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.

    PubMed

    Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor

    2016-01-01

    In Quantum Chemistry, many tasks recur frequently, e.g. geometry optimizations, benchmarking series, etc. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward, which makes it almost impossible to share them. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined, research-domain-specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical abstract: meta-workflows and embedded workflows in the template representation.

  13. ROSA P : The National Transportation Library’s Repository and Open Science Access Portal

    DOT National Transportation Integrated Search

    2018-01-01

    The National Transportation Library (NTL) was founded as an all-digital repository of US DOT research reports, technical publications and data products. NTL's primary public offering is ROSA P, the Repository and Open Science Access Portal. An open...

  14. 10 CFR 960.3-3 - Consultation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY..., operation, closure, decommissioning, licensing, or regulation of a repository. Written responses to written... purpose of determining the suitability of such area for the development of a repository, the DOE shall...

  15. 10 CFR 60.32 - Conditions of construction authorization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... GEOLOGIC REPOSITORIES Licenses Construction Authorization § 60.32 Conditions of construction authorization... changes to the features of the geologic repository and the procedures authorized. The restrictions that... setting as well as measures related to the design and construction of the geologic repository operations...

  16. An Automated Acquisition System for Media Exploitation

    DTIC Science & Technology

    2008-06-01

    on the acquisition station, AcqMan will pull out the SHA256 image hash, and the device’s model, serial number, and manufacturer. 2. Query the ADOMEX...Repository: Using the data collected above, AcqMan will query the ADOMEX repository. The ADOMEX repository will respond to the query with the SHA256s of...whose SHA256s do not match. The last category will be a list of images that the ADOMEX repository already has and that the acquisition station can
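
    A minimal sketch of the hash-and-compare step described in this snippet, with a placeholder hash set and file name; it is not the AcqMan tool itself:

        # Sketch of the hash-and-compare step (not the AcqMan tool): hash an
        # acquired image and check whether the repository already holds it.
        # The known-hash set and image file name are placeholders.
        import hashlib
        from pathlib import Path

        def sha256_of(path: Path) -> str:
            digest = hashlib.sha256()
            with path.open("rb") as fh:
                for chunk in iter(lambda: fh.read(1 << 20), b""):   # read 1 MiB at a time
                    digest.update(chunk)
            return digest.hexdigest()

        known_hashes = {"<hash already held by the repository>"}    # placeholder
        image = Path("acquired_device.img")                          # hypothetical image
        if image.exists():
            status = ("already in repository" if sha256_of(image) in known_hashes
                      else "new image; transfer it")
            print(status)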

  17. Case for retrievable high-level nuclear waste disposal

    USGS Publications Warehouse

    Roseboom, Eugene H.

    1994-01-01

    Plans for the nation's first high-level nuclear waste repository have called for permanently closing and sealing the repository soon after it is filled. However, the hydrologic environment of the proposed site at Yucca Mountain, Nevada, should allow the repository to be kept open and the waste retrievable indefinitely. This would allow direct monitoring of the repository and maintain the options for future generations to improve upon the disposal methods or use the uranium in the spent fuel as an energy resource.

  18. Trustworthy Digital Repositories: Building Trust the Old Fashion Way, EARNING IT.

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Chandler, C. L.; Shepherd, A.; Rauch, S.; Groman, R. C.; Wiebe, P. H.; Glover, D. M.; Allison, M. D.; Copley, N. J.; Ake, H.; York, A.

    2016-12-01

    There are several drivers increasing the importance of high quality data management and curation in today's research process (e.g., OSTP PARR memo, journal publishers, funders, academic and private institutions), and proper management is necessary throughout the data lifecycle to enable reuse and reproducibility of results. Many digital data repositories are capable of satisfying the basic management needs of an investigator looking to share their data (i.e., publish data in the public domain), but repository services vary greatly and not all provide mature services that facilitate discovery, access, and reuse of research data. Domain-specific repositories play a vital role in the data curation process by working closely with investigators to create robust metadata, perform first order QC, and assemble and publish research data. In addition, they may employ technologies and services that enable increased discovery, access, and long-term archive. However, smaller domain facilities operate in varying states of capacity and curation ability. Within this repository environment, individual investigators (driven by publishers, funders, or institutions) need to find trustworthy repositories for their data; and funders need to direct investigators to quality repositories to ensure return on their investment. So, how can one determine the best home for valuable research data? Metrics can be applied to varying aspects of data curation, and many credentialing organizations offer services that assess and certify the trustworthiness of a given data management facility. Unfortunately, many of these certifications can be inaccessible to a small repository in cost, time, or scope. Are there alternatives? This presentation will discuss methods and approaches used by the Biological and Chemical Oceanography Data Management Office (BCO-DMO; a domain-specific, intermediate digital data repository) to demonstrate trustworthiness in the face of a daunting accreditation landscape.

  19. Numerical Modeling Tools for the Prediction of Solution Migration Applicable to Mining Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martell, M.; Vaughn, P.

    1999-01-06

    Mining has always had an important influence on the cultures and traditions of communities around the globe and throughout history. Today, because mining legislation places heavy emphasis on environmental protection, there is great interest in having a comprehensive understanding of ancient mining and mining sites. Multi-disciplinary approaches (e.g., Pb isotopes as tracers) are being used to explore the distribution of metals in natural environments. Another successful approach is to model solution migration numerically. A proven method to simulate solution migration in natural rock salt has been applied to project through time, for 10,000 years, the system performance and solution concentrations surrounding a proposed nuclear waste repository. This capability is readily adaptable to simulate solution migration around mining sites.

  20. SIMULANT DEVELOPMENT FOR SAVANNAH RIVER SITE HIGH LEVEL WASTE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, M; Russell Eibling, R; David Koopman, D

    2007-09-04

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site vitrifies High Level Waste (HLW) for repository interment. The process consists of three major steps: waste pretreatment, vitrification, and canister decontamination/sealing. The HLW consists of insoluble metal hydroxides (primarily iron, aluminum, magnesium, manganese, and uranium) and soluble sodium salts (carbonate, hydroxide, nitrite, nitrate, and sulfate). The HLW is processed in large batches through DWPF; DWPF has recently completed processing Sludge Batch 3 (SB3) and is currently processing Sludge Batch 4 (SB4). The composition of metal species in SB4 is shown in Table 1 as a function of the ratio of a metal to iron. Simulants remove radioactive species and renormalize the remaining species. Supernate composition is shown in Table 2.

  1. Benefits of International Collaboration on the International Space Station

    NASA Technical Reports Server (NTRS)

    Hasbrook, Pete; Robinson, Julie A.; Cohen, Luchino; Marcil, Isabelle; De Parolis, Lina; Hatton, Jason; Shirakawa, Masaki; Karabadzhak, Georgy; Sorokin, Igor V.; Valentini, Giovanni

    2017-01-01

    The International Space Station is a valuable platform for research in space, but the benefits are limited if research is conducted only by individual countries. Through the efforts of the ISS Program Science Forum, international science working groups, and interagency cooperation, international collaboration on the ISS has expanded as ISS utilization has matured. Members of science teams benefit from working with counterparts in other countries. Scientists and institutions bring years of experience and specialized expertise to collaborative investigations, leading to new perspectives and approaches to scientific challenges. Combining new ideas with historical results brings synergy and improves peer-reviewed scientific methods and results. World-class research facilities can be expensive and logistically complicated, jeopardizing their full utilization. Experiments that would be prohibitively expensive for a single country can be achieved through contributions of resources from two or more countries, such as crew time, up- and downmass, and experiment hardware. Cooperation also avoids duplication of experiments and hardware among agencies. Biomedical experiments can be completed earlier if astronauts or cosmonauts from multiple agencies participate. Countries responding to natural disasters benefit from ISS imagery assets, even if the country has no space agency of its own. Students around the world participate in ISS educational opportunities, and work with students in other countries, through open curriculum packages and through international competitions. Even experiments conducted by a single country can benefit scientists around the world, through specimen sharing programs and publicly accessible "open data" repositories. For ISS data, these repositories include GeneLab, the Physical Science Informatics System, and different Earth data systems. Scientists can conduct new research using ISS data without having to launch and execute their own experiments. Multilateral collections of research results publications, maintained by the ISS international partnership and accessible via nasa.gov, make ISS results available worldwide and encourage new users, ideas, and research.

  2. 10 CFR 960.4 - Postclosure guidelines.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Postclosure Guidelines § 960.4 Postclosure guidelines. The guidelines in this subpart specify the factors to be considered in evaluating and comparing sites on the basis of expected repository performance... NRC and EPA regulations. These requirements must be met by the repository system, which contains...

  3. 10 CFR 60.3 - License required.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES General... byproduct material at a geologic repository operations area except as authorized by a license issued by the Commission pursuant to this part. (b) DOE shall not commence construction of a geologic repository operations...

  4. Asset Reuse of Images from a Repository

    ERIC Educational Resources Information Center

    Herman, Deirdre

    2014-01-01

    According to Markus's theory of reuse, when digital repositories are deployed to collect and distribute organizational assets, they supposedly help ensure accountability, extend information exchange, and improve productivity. Such repositories require a large investment due to the continuing costs of hardware, software, user licenses, training,…

  5. 10 CFR 60.17 - Contents of site characterization plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... GEOLOGIC REPOSITORIES Licenses Preapplication Review § 60.17 Contents of site characterization plan. The... construction authorization for a geologic repository operations area; (4) Criteria, developed pursuant to... area for the location of a geologic repository; and (5) Any other information which the Commission, by...

  6. 10 CFR 960.4-2-2 - Geochemistry.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Postclosure Guidelines § 960.4-2-2 Geochemistry. (a) Qualifying condition. The present and... future, not affect or would favorably affect the ability of the geologic repository to isolate the waste... subjected to expected repository conditions, would remain unaltered or would alter to mineral assemblages...

  7. 17 CFR 49.3 - Procedures for registration.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... if the Commission finds that such swap data repository is appropriately organized, and has the...) SWAP DATA REPOSITORIES § 49.3 Procedures for registration. (a) Application procedures. (1) An applicant, person or entity desiring to be registered as a swap data repository shall file electronically an...

  8. Biological Web Service Repositories Review.

    PubMed

    Urdidiales-Nieto, David; Navas-Delgado, Ismael; Aldana-Montes, José F

    2017-05-01

    Web services play a key role in bioinformatics, enabling the integration of database access and analysis algorithms. However, Web service repositories do not usually publish information on the changes made to their registered Web services. Dynamism is directly related to changes in the repositories (services registered or unregistered) and at the service level (annotation changes). Thus, users, software clients, or workflow-based approaches lack enough relevant information to decide when they should review or re-execute a Web service or workflow to get updated or improved results. The dynamism of a repository could serve as a cue for workflow developers to re-check service availability and annotation changes in the services of interest to them. This paper presents a review of the most well-known Web service repositories in the life sciences, including an analysis of their dynamism. Freshness is introduced in this paper and is used as the measure of the dynamism of these repositories. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
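
    The record does not spell out how freshness is computed, so the following is only an assumed illustration: treating freshness as the fraction of repository entries that are added, removed, or re-annotated between two snapshots, normalized by the elapsed time.

        # Assumed illustration of a repository "freshness"/dynamism measure;
        # the paper's own definition may differ.

        def freshness(old_snapshot, new_snapshot, days_between):
            """Fraction of entries added, removed, or re-annotated, per day."""
            old_keys, new_keys = set(old_snapshot), set(new_snapshot)
            added = new_keys - old_keys
            removed = old_keys - new_keys
            reannotated = {k for k in old_keys & new_keys
                           if old_snapshot[k] != new_snapshot[k]}
            changed = len(added) + len(removed) + len(reannotated)
            return changed / max(len(old_keys | new_keys), 1) / days_between

        # Hypothetical snapshots: service name -> annotation fingerprint.
        march = {"blastp": "a1", "clustalw": "b7", "fetch_pdb": "c2"}
        june = {"blastp": "a1", "clustalw": "b9", "interpro": "d4"}
        print(freshness(march, june, days_between=92))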

  9. Finding relevant biomedical datasets: the UC San Diego solution for the bioCADDIE Retrieval Challenge

    PubMed Central

    Wei, Wei; Ji, Zhanglong; He, Yupeng; Zhang, Kai; Ha, Yuanchi; Li, Qi; Ohno-Machado, Lucila

    2018-01-01

    Abstract The number and diversity of biomedical datasets grew rapidly in the last decade. A large number of datasets are stored in various repositories, with different formats. Existing dataset retrieval systems lack the capability of cross-repository search. As a result, users spend time searching datasets in known repositories, and they typically do not find new repositories. The biomedical and healthcare data discovery index ecosystem (bioCADDIE) team organized a challenge to solicit new indexing and searching strategies for retrieving biomedical datasets across repositories. We describe the work of one team that built a retrieval pipeline and examined its performance. The pipeline used online resources to supplement dataset metadata, automatically generated queries from users’ free-text questions, produced high-quality retrieval results and achieved the highest inferred Normalized Discounted Cumulative Gain among competitors. The results showed that it is a promising solution for cross-database, cross-domain and cross-repository biomedical dataset retrieval. Database URL: https://github.com/w2wei/dataset_retrieval_pipeline PMID:29688374
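
    For readers unfamiliar with the evaluation metric named in this record, the sketch below computes standard NDCG for a single ranked list; it is only a reminder of the basic formula, not bioCADDIE's inferred variant, which additionally corrects for incomplete relevance judgments.

        # Standard NDCG for one ranked result list of graded relevance scores.
        import math

        def dcg(relevances):
            """Discounted cumulative gain of a ranked list of relevance grades."""
            return sum((2 ** rel - 1) / math.log2(rank + 2)
                       for rank, rel in enumerate(relevances))

        def ndcg(relevances):
            """DCG normalized by the ideal ordering of the same grades."""
            ideal = dcg(sorted(relevances, reverse=True))
            return dcg(relevances) / ideal if ideal > 0 else 0.0

        # Hypothetical grades (2 = relevant, 1 = partially relevant, 0 = not)
        # for the top five datasets returned for one query.
        print(ndcg([2, 0, 1, 2, 0]))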

  10. The Nevada initiative: A risk communication Fiasco

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flynn, J.; Slovic, P.; Mertz, C.K.

    The U.S. Congress has designated Yucca Mountain, Nevada, as the only potential site to be studied for the nation's first high-level nuclear waste repository. People in Nevada strongly oppose the program, managed by the U.S. Department of Energy. Survey research shows that the public believes there are great risks from a repository program, in contrast to a majority of scientists who feel the risks are acceptably small. Delays in the repository program, resulting in part from public opposition in Nevada, have concerned the nuclear power industry, which collects the fees for the federal repository program and believes it needs the repository as a final disposal facility for its high-level nuclear wastes. To assist the repository program, the American Nuclear Energy Council (ANEC), an industry group, sponsored a massive advertising campaign in Nevada. The campaign attempted to assure people that the risks of a repository were small and that the repository studies should proceed. The campaign failed because its managers misunderstood the issues underlying the controversy, attempted a covert manipulation of public opinion that was revealed, and, most importantly, lacked the public trust that was necessary to communicate credibly about the risks of a nuclear waste facility. This article describes the advertising campaign and its effects. The manner in which the ANEC campaign itself became a controversial public issue is reviewed. The advertising campaign is discussed as it relates to risk assessment and communication. 29 refs., 2 tabs.

  11. Collaborative Information Retrieval Method among Personal Repositories

    NASA Astrophysics Data System (ADS)

    Kamei, Koji; Yukawa, Takashi; Yoshida, Sen; Kuwabara, Kazuhiro

    In this paper, we describe a collaborative information retrieval method among personal repositories and an implementation of the method on a personal agent framework. We propose a framework for personal agents that aims to enable the sharing and exchange of information resources that are distributed unevenly among individuals. The kernel of the personal agent framework is an RDF (Resource Description Framework)-based information repository for storing, retrieving, and manipulating privately collected information, such as documents the user read and/or wrote, email he/she exchanged, web pages he/she browsed, etc. The repository also collects annotations to information resources that describe relationships among those resources, as well as records of interaction between the user and the resources. Since the information resources in a personal repository and their structure are personalized, information retrieval from other users' repositories is an important application of the personal agent. A vector space model with a personalized concept-base is employed as the information retrieval mechanism within a personal repository. Since a personalized concept-base is constructed from the information resources in a personal repository, it reflects its user's knowledge and interests. On the other hand, this leads to a problem when querying other users' personal repositories: simply forwarding the query does not produce desirable results. To solve this problem, we propose a query equalization scheme based on a relevance feedback method for collaborative information retrieval between personalized concept-bases. Finally, we describe an implementation of the collaborative information retrieval method and its user interface on the personal agent framework.
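
    The abstract names a relevance-feedback-based query equalization scheme but gives no formula, so the sketch below shows a classical Rocchio update as an illustrative stand-in: the query vector is re-weighted toward documents the sender's concept-base marks relevant and away from non-relevant ones before being issued to another repository.

        # Illustrative stand-in (classical Rocchio relevance feedback), not the
        # authors' exact query equalization scheme.
        import numpy as np

        def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
            """Re-weight a term-weight query vector using feedback documents."""
            q = alpha * query
            if len(relevant):
                q = q + beta * np.mean(relevant, axis=0)
            if len(nonrelevant):
                q = q - gamma * np.mean(nonrelevant, axis=0)
            return np.clip(q, 0.0, None)  # keep term weights non-negative

        # Hypothetical 4-term vocabulary shared by two personalized concept-bases.
        query = np.array([1.0, 0.0, 0.5, 0.0])
        relevant = np.array([[0.9, 0.1, 0.7, 0.0], [0.8, 0.0, 0.6, 0.1]])
        nonrelevant = np.array([[0.0, 0.9, 0.0, 0.8]])
        print(rocchio(query, relevant, nonrelevant))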

  12. Cs sorption to potential host rock of low-level radioactive waste repository in Taiwan: experiments and numerical fitting study.

    PubMed

    Wang, Tsing-Hai; Chen, Chin-Lung; Ou, Lu-Yen; Wei, Yuan-Yaw; Chang, Fu-Lin; Teng, Shi-Ping

    2011-09-15

    A reliable performance assessment of a radioactive waste repository depends on better knowledge of the interactions between nuclides and geological substances. Numerical fitting of experimental results with a surface complexation model enables us to interpret sorption behavior at the molecular scale and thus to build a solid basis for simulation studies. A lack of consensus on a standard set of assessment criteria (such as the determination of sorption site concentration and reaction formula) during numerical fitting, on the other hand, makes comparison between studies difficult. In this study we explored the sorption of cesium to argillite by conducting experiments under different pH and solid/liquid (s/l) ratios at two initial Cs concentrations (100 mg/L, 7.5 × 10⁻⁴ mol/L, and 0.01 mg/L, 7.5 × 10⁻⁸ mol/L). Numerical fitting was then performed, focusing on the assessment criteria and their consequences. It was found that both ion exchange and electrostatic interactions governed Cs sorption on argillite. At the higher initial Cs concentration, Cs sorption showed an increasing dependence on pH as the solid/liquid ratio was lowered. In contrast, at trace Cs levels, Cs sorption was neither s/l dependent nor pH sensitive. It is therefore proposed that an ion exchange mechanism dominates Cs sorption when the concentration of surface sorption sites exceeds that of Cs, whereas surface complexation accounts for Cs uptake under alkaline conditions. Numerical fitting was conducted using two different strategies to determine the concentration of surface sorption sites: the clay model (based on the cation exchange capacity plus surface titration results) and the iron oxide model (where the concentration of sorption sites is proportional to the surface area of the argillite). The clay model led to better fits than the iron oxide model, which is attributed to its more amenable sorption sites (two specific sorption sites along with a larger site density). Moreover, increasing the s/l ratio produces more sorption sites, which helps to suppress the impact of the heterogeneous surface on Cs sorption behavior under high-pH conditions. Copyright © 2011 Elsevier B.V. All rights reserved.
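
    The record describes fitting sorption data with a surface complexation model; as an illustration of the general fitting workflow only (a much simpler Langmuir isotherm with invented data, not the authors' model or measurements), a least-squares fit might look like this:

        # Workflow illustration only: least-squares fit of a Langmuir isotherm,
        # q = q_max * K * C / (1 + K * C), to hypothetical Cs sorption data.
        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(c_eq, q_max, k):
            """Sorbed amount (mol/kg) vs. equilibrium solution concentration (mol/L)."""
            return q_max * k * c_eq / (1.0 + k * c_eq)

        # Hypothetical equilibrium concentrations and measured sorbed amounts.
        c_eq = np.array([1e-8, 1e-7, 1e-6, 1e-5, 1e-4, 7.5e-4])
        q_obs = np.array([1.5e-5, 1.5e-4, 1.5e-3, 1.2e-2, 3.7e-2, 4.8e-2])

        (q_max_fit, k_fit), _ = curve_fit(langmuir, c_eq, q_obs, p0=[0.05, 1e4])
        print(f"q_max ~ {q_max_fit:.3g} mol/kg, K ~ {k_fit:.3g} L/mol")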

  13. Organizing Scientific Data Sets: Studying Similarities and Differences in Metadata and Subject Term Creation

    ERIC Educational Resources Information Center

    White, Hollie C.

    2012-01-01

    Background: According to Salo (2010), the metadata entered into repositories are "disorganized" and metadata schemes underlying repositories are "arcane". This creates a challenging repository environment in regards to personal information management (PIM) and knowledge organization systems (KOSs). This dissertation research is…

  14. Worth the Work

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2010-01-01

    Even in the age of Google, digital repositories can add tremendous value to an institution. Yet creating and maintaining these collections is no small task. Digital repository advocates will concede that the challenges in building and maintaining these collections can daunt even the most intrepid supporters. Three repository directors share their…

  15. Collaboration Nation: The Building of the Welsh Repository Network

    ERIC Educational Resources Information Center

    Knowles, Jacqueline

    2010-01-01

    Purpose: The purpose of this paper is to disseminate information about the Welsh Repository Network (WRN), innovative work being undertaken to build an integrated network of institutional digital repositories. A collaborative approach, in particular through the provision of centralised technical and organisational support, has demonstrated…

  16. 76 FR 81950 - Privacy Act; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... ``Consolidated Data Repository'' (09-90-1000). This system of records is being amended to include records... Repository'' (SORN 09-90-1000). OIG is adding record sources to the system. This system fulfills our..., and investigations of the Medicare and Medicaid programs. SYSTEM NAME: Consolidated Data Repository...

  17. 10 CFR 60.22 - Filing and distribution of application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... GEOLOGIC REPOSITORIES Licenses License Applications § 60.22 Filing and distribution of application. (a) An application for a construction authorization for a high-level radioactive waste repository at a geologic repository operations area, and an application for a license to receive and possess source, special nuclear...

  18. 10 CFR 960.4-2-5 - Erosion.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY... exhumation would not be expected to occur during the first one million years after repository closure. (c... the ability of the geologic repository to isolate the waste. (d) Disqualifying condition. The site...

  19. 10 CFR 960.3-2 - Siting process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Implementation Guidelines § 960.3-2 Siting process. The siting process begins with site screening... first repository before the enactment of the Act, and the identification of such sites was made after... identification of potentially acceptable sites for the second and subsequent repositories shall be conducted in...

  20. 76 FR 14057 - Notice of Inventory Completion: University of Wyoming, Anthropology Department, Human Remains...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ...: University of Wyoming, Anthropology Department, Human Remains Repository, Laramie, WY AGENCY: National Park... Anthropology Department, Human Remains Repository, Laramie, WY. The human remains and associated funerary... the human remains was made by University of Wyoming, Anthropology Department, Human Remains Repository...
