Sample records for repository technology program

  1. US/German Collaboration in Salt Repository Research, Design and Operation - 13243

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steininger, Walter; Hansen, Frank; Biurrun, Enrique

    2013-07-01

    Recent developments in the US and Germany [1-3] have precipitated renewed efforts in salt repository investigations and related studies. Both the German rock salt repository activities and the US waste management programs currently face challenges that may adversely affect their respective current and future state-of-the-art core capabilities in rock salt repository science and technology. The research agenda being pursued by our respective countries leverages collective efforts for the benefit of both programs. The topics addressed by the US/German salt repository collaborations align well with the findings and recommendations summarized in the January 2012 US Blue Ribbon Commission on America's Nuclear Future (BRC) report [4] and are consistent with the aspirations of the key topics of the Strategic Research Agenda of the Implementing Geological Disposal of Radioactive Waste Technology Platform (IGD-TP) [5]. Against this background, a revival of joint efforts in salt repository investigations after some years of hibernation has been undertaken to leverage collective efforts in salt repository research, design, operations, and related issues for the benefit of respective programs and to form a basis for providing an attractive, cost-effective insurance against the premature loss of virtually irreplaceable scientific expertise and institutional memory. (authors)

  2. Robotics Scoping Study to Evaluate Advances in Robotics Technologies that Support Enhanced Efficiencies for Yucca Mountain Repository Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T. Burgess; M. Noakes; P. Spampinato

    This paper presents an evaluation of robotics and remote handling technologies that have the potential to increase the efficiency of handling waste packages at the proposed Yucca Mountain High-Level Nuclear Waste Repository. It is expected that increased efficiency will reduce the cost of operations. The goal of this work was to identify technologies for consideration as potential projects that the U.S. Department of Energy Office of Civilian Radioactive Waste Management, Office of Science and Technology International Programs, could support in the near future, and to assess their "payback" value. The evaluation took into account the robotics and remote handling capabilities planned for incorporation into the current baseline design for the repository, for both surface and subsurface operations. The evaluation, completed at the end of fiscal year 2004, identified where significant advantages in operating efficiencies could accrue by implementing any given robotics technology or approach, and included a road map for a multiyear R&D program for improvements to remote handling technology that support operating enhancements.

  3. Repository-Based Software Engineering Program: Working Program Management Plan

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL) in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.

  4. The repository-based software engineering program: Redefining AdaNET as a mainstream NASA source

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Repository-based Software Engineering Program (RBSE) is described to inform and update senior NASA managers about the program. Background and historical perspective on software reuse and RBSE for NASA managers who may not be familiar with these topics are provided. The paper draws upon and updates information from the RBSE Concept Document, baselined by NASA Headquarters, Johnson Space Center, and the University of Houston - Clear Lake in April 1992. Several of NASA's software problems and what RBSE is now doing to address those problems are described. Also, next steps to be taken to derive greater benefit from this Congressionally-mandated program are provided. The section on next steps describes the need to work closely with other NASA software quality, technology transfer, and reuse activities and focuses on goals and objectives relative to this need. RBSE's role within NASA is addressed; however, there is also the potential for systematic transfer of technology outside of NASA in later stages of the RBSE program. This technology transfer is discussed briefly.

  5. A "Simple Query Interface" Adapter for the Discovery and Exchange of Learning Resources

    ERIC Educational Resources Information Center

    Massart, David

    2006-01-01

    Developed as part of CEN/ISSS Workshop on Learning Technology efforts to improve interoperability between learning resource repositories, the Simple Query Interface (SQI) is an Application Program Interface (API) for querying heterogeneous repositories of learning resource metadata. In the context of the ProLearn Network of Excellence, SQI is used…
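    The adapter idea behind SQI can be sketched in a few lines: one abstract query interface hides each repository's native query language. The method names and the dummy back end below are illustrative assumptions, not the normative CEN/ISSS SQI bindings.

```python
from abc import ABC, abstractmethod

class SimpleQueryInterface(ABC):
    """Hypothetical sketch of an SQI-style adapter (names are illustrative)."""

    @abstractmethod
    def set_query_language(self, session_id: str, language: str) -> None: ...

    @abstractmethod
    def synchronous_query(self, session_id: str, query: str, start: int) -> str:
        """Return learning-object metadata records matching the query."""

class DummyRepository(SimpleQueryInterface):
    """Stand-in for one heterogeneous repository behind the adapter."""

    def __init__(self):
        self.languages = {}
        self.records = {"metadata": ["lom-record-1", "lom-record-2"]}

    def set_query_language(self, session_id, language):
        self.languages[session_id] = language

    def synchronous_query(self, session_id, query, start):
        # A real adapter would translate `query` into the repository's
        # native query language here; we just look it up directly.
        return ",".join(self.records.get(query, [])[start:])

repo = DummyRepository()
repo.set_query_language("s1", "VSQL")
print(repo.synchronous_query("s1", "metadata", 0))  # lom-record-1,lom-record-2
```

    A federated search client can then target any number of repositories through this one interface, regardless of what query engine each runs internally.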

  6. National Programs | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The Frederick National Laboratory is a shared national resource that offers access to a suite of advanced biomedical technologies, provides selected science and technology services, and maintains vast repositories of research materials available

  7. National Programs | FNLCR Staging

    Cancer.gov

    The Frederick National Lab (FNL) is a shared national resource that offers access to a suite of advanced biomedical technologies, provides selected science and technology services, and maintains vast repositories of research materials available to bi

  8. The preliminary design and feasibility study of the spent fuel and high level waste repository in the Czech Republic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valvoda, Z.; Holub, J.; Kucerka, M.

    1996-12-31

    The Program of Development of the Spent Fuel and High Level Waste Repository in the Conditions of the Czech Republic began in 1993. During the first phase, the basic concept and structure of the Program were developed, and the basic design criteria and requirements were prepared. Under the conditions of the Czech Republic, only an underground repository in a deep geological formation is acceptable. The expected depth is between 500 and 1000 meters, with granite as the host rock. A preliminary variant design study, carried out in 1994, analyzed the radioactive waste and spent fuel flow from NPPs to the repository and the various transportation possibilities corresponding to the various concepts of spent fuel conditioning and transport to the underground structures. Conditioning and encapsulation of spent fuel and/or radioactive waste is proposed on the repository site. Underground disposal structures are proposed on one underground level. The repository will have reserve capacity for radioactive waste from NPP decommissioning and for waste not acceptable at other repositories. Vertical disposal of unshielded canisters in boreholes and/or horizontal disposal of shielded canisters is being studied. The year 2035 has been established as the baseline date for the start-up of repository operation, and a preliminary time schedule of the Project has been developed from this date. A method of calculating levelized and discounted costs over the repository lifetime was used for the economic calculations for each of the five selected variants. Preliminary expected parametric costs of the repository are about 0.1 Kc ($0.004) per MWh produced in the Czech NPPs. In 1995, the design and feasibility study went into more detail on the technical concept of repository construction and the proposed technologies, as well as on the operational phase of the repository.
The paper will describe results of the 1995 design work and will present the program of repository development for the next period.

  9. 77 FR 13135 - Agency Information Collection Activities: Submission for Review; Information Collection Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-05

    ... (DHS), Science and Technology, Protected Repository for the Defense of Infrastructure Against Cyber Threats (PREDICT) Program AGENCY: Science and Technology Directorate, DHS. ACTION: 30-Day notice and request for comment. SUMMARY: The Department of Homeland Security (DHS), Science & Technology (S&T...

  10. The Use of Underground Research Laboratories to Support Repository Development Programs. A Roadmap for the Underground Research Facilities Network.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacKinnon, Robert J.

    2015-10-26

    Under the auspices of the International Atomic Energy Agency (IAEA), nationally developed underground research laboratories (URLs) and associated research institutions are being offered for use by other nations. These facilities form an Underground Research Facilities (URF) Network for training in and demonstration of waste disposal technologies and the sharing of knowledge and experience related to geologic repository development, research, and engineering. In order to achieve its objectives, the URF Network regularly sponsors workshops and training events related to the knowledge base that is transferable between existing URL programs and to nations with an interest in developing a new URL. This report describes the role of URLs in the context of a general timeline for repository development. This description includes identification of key phases and activities that contribute to repository development as a repository program evolves from an early research and development phase to later phases such as construction, operations, and closure. This information is cast in the form of a matrix with the entries in this matrix forming the basis of the URF Network roadmap that will be used to identify and plan future workshops and training events.

  11. Developing the Tools for Geologic Repository Monitoring - Andra's Monitoring R and D Program - 12045

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buschaert, S.; Lesoille, S.; Bertrand, J.

    2012-07-01

    The French Safety Guide recommends that Andra develop a monitoring program to be implemented during repository construction and conducted until (and possibly after) closure, in order to confirm expected behavior and enhance knowledge of relevant processes. To achieve this, Andra has developed an overall monitoring strategy and identified specific technical objectives to inform disposal process management on evolutions relevant to both the long term safety and reversible, pre-closure management of the repository. Andra has launched an ambitious R and D program to ensure that reliable, durable, metrologically qualified and tested monitoring systems will be available at the time of repository construction in order to respond to monitoring objectives. After four years of a specific R and D program, first observations are described and recommendations are proposed. The results derived from 4 years of Andra's R and D program allow three main observations to be shared. First, while other industries also invest in monitoring equipment, their obvious emphasis will always be on their specific requirements and needs, thus often only providing a partial match with repository requirements. Examples can be found for all available sensors, which are generally not resistant to radiation. Second, the very close scrutiny anticipated for the geologic disposal process is likely to place an unprecedented emphasis on the quality of monitoring results. It therefore seems important to emphasize specific developments with the aim of providing metrologically qualified systems. Third, adapting existing technology to specific repository needs, and providing adequate proof of its worth, is a lengthy process.
In conclusion, it therefore seems prudent to plan ahead and to invest wisely in the adequate development of those monitoring tools that will likely be needed in the repository to respond to the implementers' and regulators' requirements, including those agreed and developed to respond to potential stakeholder expectations. (authors)

  12. Testimony of Dr. Raul A. Deju, Basalt Waste Isolation Project, before the Subcommittee on Energy Research and Production, Committee on Science and Technology, United States House of Representatives, March 2, 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    Status of the Basalt Waste Isolation Project is given. Three key concerns have been identified that need to be resolved to either confirm or eliminate the basalts as a potential nuclear waste repository host medium. They are: A thorough understanding of the groundwater hydrology beneath the Hanford Site is needed to assure that a repository in basalt will not contribute unacceptable amounts of contaminants to the accessible environment. Our ability to construct a repository shaft and a network of underground tunnels needs to be fully demonstrated through an exploratory shaft program. Our ability to ultimately seal a repository, such that its integrity and the isolation of the waste are guaranteed, needs to be demonstrated.

  13. The Geodetic Seamless Archive Centers Service Layer: A System Architecture for Federating Geodesy Data Repositories

    NASA Astrophysics Data System (ADS)

    McWhirter, J.; Boler, F. M.; Bock, Y.; Jamason, P.; Squibb, M. B.; Noll, C. E.; Blewitt, G.; Kreemer, C. W.

    2010-12-01

    Three geodesy Archive Centers, Scripps Orbit and Permanent Array Center (SOPAC), NASA's Crustal Dynamics Data Information System (CDDIS) and UNAVCO are engaged in a joint effort to define and develop a common Web Service Application Programming Interface (API) for accessing geodetic data holdings. This effort is funded by the NASA ROSES ACCESS Program to modernize the original GPS Seamless Archive Centers (GSAC) technology which was developed in the 1990s. A new web service interface, the GSAC-WS, is being developed to provide uniform and expanded mechanisms through which users can access our data repositories. In total, our respective archives hold tens of millions of files and contain a rich collection of site/station metadata. Though we serve similar user communities, we currently provide a range of different access methods, query services and metadata formats. This leads to a lack of consistency in the user's experience and a duplication of engineering efforts. The GSAC-WS API and its reference implementation in an underlying Java-based GSAC Service Layer (GSL) supports metadata and data queries into site/station oriented data archives. The general nature of this API makes it applicable to a broad range of data systems. The overall goals of this project include providing consistent and rich query interfaces for end users and client programs, the development of enabling technology to facilitate third party repositories in developing these web service capabilities and to enable the ability to perform data queries across a collection of federated GSAC-WS enabled repositories. A fundamental challenge faced in this project is to provide a common suite of query services across a heterogeneous collection of data yet enabling each repository to expose their specific metadata holdings. To address this challenge we are developing a "capabilities" based service where a repository can describe its specific query and metadata capabilities. 
Furthermore, the architecture of the GSL is based on a model-view paradigm that decouples the underlying data model semantics from particular representations of the data model. This will allow for the GSAC-WS enabled repositories to evolve their service offerings to incorporate new metadata definition formats (e.g., ISO-19115, FGDC, JSON, etc.) and new techniques for accessing their holdings. Building on the core GSAC-WS implementations the project is also developing a federated/distributed query service. This service will seamlessly integrate with the GSAC Service Layer and will support data and metadata queries across a collection of federated GSAC repositories.
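    The "capabilities" approach described above can be illustrated with a small sketch. The parameter names and capability sets are hypothetical, not the actual GSAC-WS vocabulary; the idea is that each repository advertises which query parameters it supports, and a federated client prunes its query to what each repository has declared.

```python
# Hypothetical capability declarations for two archives (illustrative only).
SOPAC_CAPABILITIES = {"site.code", "site.name", "bbox"}
CDDIS_CAPABILITIES = {"site.code", "date.range"}

def build_query(capabilities: set, requested: dict) -> dict:
    """Keep only the query parameters this repository supports."""
    return {k: v for k, v in requested.items() if k in capabilities}

# One federated request, trimmed differently per repository.
request = {"site.code": "P123", "bbox": "-120,30,-110,40", "date.range": "2010"}
print(build_query(SOPAC_CAPABILITIES, request))
print(build_query(CDDIS_CAPABILITIES, request))
```

    The same mechanism lets a new repository join the federation without every client being rewritten: it simply publishes its own capability set.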

  14. Grid Application Meta-Repository System: Repository Interconnectivity and Cross-domain Application Usage in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen

    Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture, which allows any Grid Application Repository (GAR) to be connected to the system independent of their underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services and other OAI (Open Archive Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VM) and therefore they can be run in their native environments and can easily be deployed on virtualized infrastructures allowing interoperability with new generation technologies such as cloud computing, application-on-demand, automatic service/application deployments and automatic VM generation.
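    The meta-repository pattern GAMRS describes can be sketched as per-repository adapters normalizing heterogeneous back ends behind one interface, so a single search spans all connected repositories. The adapter classes and data below are hypothetical illustrations, not the GAMRS implementation.

```python
class RepositoryAdapter:
    """Uniform interface every connected repository must provide."""
    def search(self, keyword: str) -> list:
        raise NotImplementedError

class GridAppAdapter(RepositoryAdapter):
    """Hypothetical adapter over a flat list of application names."""
    def __init__(self, apps): self.apps = apps
    def search(self, keyword):
        return [a for a in self.apps if keyword in a]

class OaiAdapter(RepositoryAdapter):
    """Hypothetical adapter over OAI-style metadata records."""
    def __init__(self, records): self.records = records
    def search(self, keyword):
        return [r["id"] for r in self.records if keyword in r["title"]]

class MetaRepository:
    """Fans one query out across all connected adapters."""
    def __init__(self): self.adapters = []
    def connect(self, adapter): self.adapters.append(adapter)
    def search(self, keyword):
        hits = []
        for adapter in self.adapters:
            hits.extend(adapter.search(keyword))
        return hits

meta = MetaRepository()
meta.connect(GridAppAdapter(["blast-vm", "montage-vm"]))
meta.connect(OaiAdapter([{"id": "app-7", "title": "blast pipeline"}]))
print(meta.search("blast"))  # ['blast-vm', 'app-7']
```

    Because each back end is wrapped rather than modified, a repository can be connected to the system independent of its underlying technology, which is the interoperability point the paper makes.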

  15. Final environmental impact statement. Management of commercially generated radioactive waste. Volume 3. Public comments hearing board report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-10-01

    This EIS analyzes the significant environmental impacts that could occur if various technologies for management and disposal of high-level and transuranic wastes from commercial nuclear power reactors were to be developed and implemented. This EIS will serve as the environmental input for the decision on which technology, or technologies, will be emphasized in further research and development activities in the commercial waste management program. The action proposed in this EIS is to (1) adopt a national strategy to develop mined geologic repositories for disposal of commercially generated high-level and transuranic radioactive waste (while continuing to examine subseabed and very deep hole disposal as potential backup technologies) and (2) conduct a R and D program to develop such facilities and the necessary technology to ensure the safe long-term containment and isolation of these wastes. The Department has considered in this statement: development of conventionally mined deep geologic repositories for disposal of spent fuel from nuclear power reactors and/or radioactive fuel reprocessing wastes; balanced development of several alternative disposal methods; and no waste disposal action. This volume contains written public comments and hearing board responses and reports offered on the draft statement.

  16. An overview of platforms for cloud based development.

    PubMed

    Fylaktopoulos, G; Goumas, G; Skolarikis, M; Sotiropoulos, A; Maglogiannis, I

    2016-01-01

    This paper provides an overview of state-of-the-art technologies for software development in cloud environments. The surveyed systems cover the whole spectrum of cloud-based development, including integrated programming environments, code repositories, software modeling, composition and documentation tools, and application management and orchestration. In this work we evaluate the existing cloud development ecosystem based on a wide number of characteristics, such as applicability (e.g., programming and database technologies supported), productivity enhancement (e.g., editor capabilities, debugging tools), support for collaboration (e.g., repository functionality, version control), and post-development application hosting, and we compare the surveyed systems. The survey shows that software engineering in the cloud era has taken its initial steps and has the potential to provide concrete implementation and execution environments for cloud-based applications. However, a number of important challenges need to be addressed for this approach to be viable. These challenges are discussed in the article, and the conclusion is drawn that, although several steps have been made, a compact and reliable solution does not yet exist.

  17. USAF Hearing Conservation Program, DOEHRS-HC Data Repository Annual Report: CY15

    DTIC Science & Technology

    2017-05-31

    AFRL-SA-WP-SR-2017-0014, USAF Hearing Conservation Program, DOEHRS-HC Data Repository Annual Report: CY15. Daniel A. Williams... Defense Occupational and Environmental Health Readiness System-Hearing Conservation Data Repository (DOEHRS-HC DR). Major command- and installation-level reports are available quarterly

  18. 76 FR 72426 - Agency Information Collection Activities: Submission for Review; Information Collection Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-23

    ... (DHS), Science and Technology, Protected Repository for the Defense of Infrastructure Against Cyber... the Defense of Infrastructure against Cyber Threats (PREDICT) program, and is a revision of a... operational data for use in cyber security research and development through the establishment of distributed...

  19. Office of Science and Technology & International Year End Report - 2005

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bodvarsson, G.S.

    2005-10-27

    Source Term, Materials Performance, Radionuclide Getters, Natural Barriers, and Advanced Technologies, a brief introduction in each section describes the overall organization and goals of each program area. All of these areas have great potential for improving our understanding of the safety performance of the proposed Yucca Mountain repository, as processes within these areas are generally very conservatively represented in the Total System Performance Assessment. In addition, some of the technology thrust areas in particular may enhance system efficiency and reduce risk to workers. Thus, rather modest effort in the S&T Program could lead to large savings in the lifetime repository total cost and significantly enhanced understanding of the behavior of the proposed Yucca Mountain repository, without safety being compromised, and in some instances being enhanced. An overall strength of the S&T Program is the significant amount of integration that has already been achieved after two years of research. As an example (illustrated in Figure 1), our understanding of the behavior of the total waste isolation system has been enhanced through integration of the Source Term, Materials Performance, and Natural Barriers Thrust areas. All three thrust areas contribute to the integration of different processes in the in-drift environment. These processes include seepage into the drift, dust accumulation on the waste package, brine formation and precipitation on the waste package, mass transfer through the fuel cladding, changes in the seepage-water chemical composition, and transport of released radionuclides through the invert and natural barriers. During FY2005, each of our program areas assembled a team of external experts to conduct an independent review of their respective projects, research directions, and emphasis. In addition, the S&T Program as a whole was independently reviewed by the S&T Programmatic Evaluation Panel. 
As a result of these reviews, adjustments to the S&T Program will be implemented in FY2006 to ensure that the Program is properly aligned with OCRWM's priorities. Also during FY2005, several programmatic documents were published, including the Science and Technology Program Strategic Plan, the Science and Technology Program Management Plan, and the Science and Technology Program Plan. These and other communication products are available on the OCRWM web site under the Science and Technology section (http://www.ocrwm.doe.gov/osti/index.shtml).

  20. New Features of the re3data Registry of Research Data Repositories

    NASA Astrophysics Data System (ADS)

    Elger, K.; Pampel, H.; Vierkant, P.; Witt, M.

    2016-12-01

    re3data is a registry of research data repositories that lists over 1,600 repositories from around the world, making it the largest and most comprehensive catalog of data repositories on the web. The registry offers researchers, funding agencies, libraries, and publishers a comprehensive overview of the heterogeneous landscape of data repositories. The repositories are described following the "Metadata Schema for the Description of Research Data Repositories". re3data summarises the properties of each repository in a user-friendly icon system, helping users easily identify an adequate repository for the storage of their datasets. The re3data entries are curated by an international, multi-disciplinary editorial board. An application programming interface (API) enables other information systems to list and fetch metadata for integration and interoperability. Funders like the European Commission (2015) and publishers like Springer Nature (2016) recommend the use of re3data.org in their policies. The original re3data project partners are the GFZ German Research Centre for Geosciences, the Humboldt-Universität zu Berlin, the Purdue University Libraries, and the Karlsruhe Institute of Technology (KIT). Since 2015, re3data has been operated as a service of DataCite, a global non-profit organisation that provides persistent identifiers (DOIs) for research data. At the 2016 AGU Fall Meeting we will describe the current status of re3data, give an overview of the major developments and new features, and present our plans to increase the quality of the re3data entries.
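    As a rough illustration of how an API like re3data's lets other systems fetch and integrate repository metadata, the sketch below parses an abbreviated, hypothetical XML payload; the element names and identifier are assumptions, so consult re3data.org for the actual endpoints and response schema.

```python
import xml.etree.ElementTree as ET

# Abbreviated, hypothetical response payload (not a real API capture).
SAMPLE_RESPONSE = """<list>
  <repository>
    <id>r3d100000001</id>
    <name>Example Data Repository</name>
  </repository>
</list>"""

def parse_repositories(xml_text: str) -> list:
    """Extract (id, name) pairs from a repository-list XML document."""
    root = ET.fromstring(xml_text)
    return [
        {"id": repo.findtext("id"), "name": repo.findtext("name")}
        for repo in root.findall("repository")
    ]

print(parse_repositories(SAMPLE_RESPONSE))
```

    An information system could run such a harvest periodically to keep a local mirror of repository descriptions for integration and interoperability.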

  1. Combat Stories Map: A Historical Repository and After Action Tool for Capturing, Storing, and Analyzing Georeferenced Individual Combat Narratives

    DTIC Science & Technology

    2016-06-01

    Despite the proliferation of technology and near-global Internet accessibility, a web-based program incorporating interactive maps to record personal combat experiences does not exist. The Combat Stories Map addresses this deficiency. The Combat Stories Map is a web-based Geographic Information System specifically designed...

  2. The National Geological and Geophysical Data Preservation Program

    NASA Astrophysics Data System (ADS)

    Dickinson, T. L.; Steinmetz, J. C.; Gundersen, L. C.; Pierce, B. S.

    2006-12-01

    The ability to preserve and maintain geoscience data and collections has not kept pace with the growing need for accessible digital information and the technology to make it so. The Nation has lost valuable and unique geologic records and is in danger of losing much more. Many federal and state geological repositories are currently at their capacity for maintaining and storing data or samples. Some repositories are gaining additional, but temporary and substandard space, using transport containers or offsite warehouses where access is limited and storage conditions are poor. Over the past several years, there has been an increasing focus on the state of scientific collections in the United States. For example, the National Geological and Geophysical Data Preservation Program (NGGDPP) Act was passed as part of the Energy Policy Act of 2005, authorizing $30 million in funding for each of five years. The Act directs the U.S. Geological Survey to administer this program that includes a National Digital Catalog and Federal assistance to support our nation's repositories. Implementation of the Program awaits federal appropriations. The NGGDPP is envisioned as a national network of cooperating geoscience materials and data repositories that are operated independently yet guided by unified standards, procedures, and protocols for metadata. The holdings will be widely accessible through a common and mirrored Internet-based catalog (National Digital Catalog). The National Digital Catalog will tie the observations and analyses to the physical materials they come from. Our Nation's geological and geophysical data are invaluable and in some instances irreplaceable due to the destruction of outcrops, urbanization and restricted access. These data will enable the next generation of scientific research and education, enable more effective and efficient research, and may have future economic benefits through the discovery of new oil and gas accumulations, and mineral deposits.

  3. International Approaches for Nuclear Waste Disposal in Geological Formations: Geological Challenges in Radioactive Waste Isolation—Fifth Worldwide Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faybishenko, Boris; Birkholzer, Jens; Sassani, David

    The overall objective of the Fifth Worldwide Review (WWR-5) is to document the current state-of-the-art of major developments in a number of nations throughout the World pursuing geological disposal programs, and to summarize challenging problems and experience that have been obtained in siting, preparing and reviewing cases for the operational and long-term safety of proposed and operating nuclear waste repositories. The scope of the Review is to address current specific technical issues and challenges in safety case development along with the interplay of technical feasibility, siting, engineering design issues, and operational and post-closure safety. In particular, the chapters included in the report present the following types of information: the current status of the deep geological repository programs for high level nuclear waste and low- and intermediate level nuclear waste in each country, concepts of siting and radioactive waste and spent nuclear fuel management in different countries (with the emphasis of nuclear waste disposal under different climatic conditions and different geological formations), progress in repository site selection and site characterization, technology development, buffer/backfill materials studies and testing, support activities, programs, and projects, international cooperation, and future plans, as well as regulatory issues and transboundary problems.

  4. Poster Puzzler Solution: Chill Out | Poster

    Cancer.gov

    A winner has emerged in the most recent Poster Puzzler contest! Congratulations are in order for Rose Bradley, secretary III, Cancer Research Technology Program. The current Poster Puzzler image shows the refrigerant condensers for the two-story freezers in the Building 1073 repository, which are used to store samples at -20°C. Put simply, the condensers act like the outdoor

  5. Defense Technical Information Center (DTIC) - Its role in the USAF Scientific and Technical Information Program

    NASA Technical Reports Server (NTRS)

    Kuhn, Allan D.

    1991-01-01

    The Defense Technical Information Center (DTIC), the central repository for DOD scientific and technical information concerning studies and research and engineering efforts, is discussed. The present makeup of DTIC is described and its functions in producing technical reports and technical report bibliographies are examined. DTIC's outreach services are reviewed, as are its DTIC information and technology transfer programs. DTIC's plans for the year 2000 and its relation to the mission of the U.S. Air Force, including the Air Force's STINFO program, are addressed.

  6. OWLing Clinical Data Repositories With the Ontology Web Language

    PubMed Central

    Pastor, Xavier; Lozano, Esther

    2014-01-01

    Background The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and the rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools to answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. Objective The objective was to design and implement a framework for the development of clinical data repositories, capable of facing the continuous change in the biomedicine domain and minimizing the technical knowledge required from final users. Methods We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with a solid relational storage and a Web graphical user interface. Results Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All required information to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. Conclusions OntoCRF is a complete framework to build data repositories with a solid relational storage. 
Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems and does not require highly skilled technical staff, which facilitates the engineering of clinical software systems. PMID:25599697

  7. OWLing Clinical Data Repositories With the Ontology Web Language.

    PubMed

    Lozano-Rubí, Raimundo; Pastor, Xavier; Lozano, Esther

    2014-08-01

    The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools that answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. The objective was to design and implement a framework for the development of clinical data repositories capable of facing continuous change in the biomedicine domain and minimizing the technical knowledge required from final users. We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with solid relational storage and a Web graphical user interface. Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All information required to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. OntoCRF is a complete framework for building data repositories with solid relational storage. Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems and does not require highly skilled technical staff, which facilitates the engineering of clinical software systems.
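    The ontology-driven pattern described in the two records above can be sketched in a few lines: the form definition lives in a declarative model rather than in code, so changing the model changes the generated interface without reprogramming. This is an illustrative sketch only, not OntoCRF's actual implementation; the class and property names are invented.

```python
# Illustrative sketch of ontology-driven form generation (not OntoCRF's
# actual code): form fields are derived from a declarative class model,
# so editing the model changes the interface without reprogramming.

# A toy "ontology": a class with typed properties (names are invented).
ONTOLOGY = {
    "Patient": {
        "name": {"type": "string", "label": "Full name"},
        "birth_date": {"type": "date", "label": "Date of birth"},
        "diagnosis": {"type": "string", "label": "Primary diagnosis"},
    }
}

def build_form(class_name: str) -> list[dict]:
    """Derive form-field descriptors from the ontology class definition."""
    props = ONTOLOGY[class_name]
    return [
        {"field": prop, "widget": meta["type"], "label": meta["label"]}
        for prop, meta in props.items()
    ]

form = build_form("Patient")
for field in form:
    print(f'{field["label"]}: <{field["widget"]} input>')
```

Adding a property to the `ONTOLOGY` entry immediately yields an extra form field, which is the essential point the records make about avoiding database redesign and reprogramming.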

  8. Personalized reminiscence therapy M-health application for patients living with dementia: Innovating using open source code repository.

    PubMed

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

    Dementia is known to be an illness that brings marked disability among elderly individuals. At times, patients living with dementia also experience non-cognitive symptoms, including hallucinations, delusional beliefs, emotional lability, sexualized behaviours, and aggression. According to the National Institute of Clinical Excellence (NICE) guidelines, non-pharmacological techniques are typically the first-line option prior to the consideration of adjuvant pharmacological options. Reminiscence and music therapy are thus viable options. Lazar et al. [3] previously performed a systematic review of the use of technology to deliver reminiscence-based therapy to individuals living with dementia and highlighted that technology does have benefits in the delivery of reminiscence therapy. To date, however, there has been a paucity of M-health innovations in this area, and most current innovations are not personalized for each person living with dementia. Prior research has highlighted the utility of open source repositories in bioinformatics research. The authors explain how they made use of an open source code repository in the development of a personalized M-health reminiscence therapy innovation for patients living with dementia. The availability of open source code repositories has changed the way healthcare professionals and developers build smartphone applications today. Conventionally, a long iterative process is needed to develop a native application, mainly because of the need for native programming and coding, especially if the application requires interactive or personalizable features. Such repositories enable rapid and cost-effective application development. Moreover, developers are able to innovate further, as less time is spent in the iterative process.

  9. Semantic Repositories for eGovernment Initiatives: Integrating Knowledge and Services

    NASA Astrophysics Data System (ADS)

    Palmonari, Matteo; Viscusi, Gianluigi

    In recent years, public sector investments in eGovernment initiatives have depended on making existing governmental ICT systems and infrastructures more reliable. Furthermore, we are witnessing a change in the focus of public sector management, from the disaggregation, competition, and performance measurement typical of the New Public Management (NPM) to new models of governance aiming for the reintegration of services under a new perspective on bureaucracy, namely a holistic approach to policy making that exploits the extensive digitalization of administrative operations. In this scenario, major challenges relate to supporting effective access to information both at the front-end level, by means of highly modular and customizable content provision, and at the back-end level, by means of information integration initiatives. Repositories of information about data and services that exploit semantic models and technologies can support these goals by bridging the gap between data-level representations and the human-level knowledge involved in accessing information and in searching for services. Moreover, semantic repository technologies can reach a new level of automation for the different tasks involved in interoperability programs, related both to data integration techniques and to service-oriented computing approaches. In this chapter, we discuss the above topics by referring to techniques and experiences in which repositories based on conceptual models and ontologies are used at different levels in eGovernment initiatives: at the back-end level, to produce a comprehensive view of the information managed in public administrations' (PA) information systems, and at the front-end level, to support effective service delivery.

  10. Preservation of Earth Science Data History with Digital Content Repository Technology

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Pan, J.; Shrestha, B.; Cook, R. B.

    2011-12-01

    An increasing need for derived and on-demand data products in Earth Science research makes digital content more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, this increasing need presents additional challenges in managing data processing history information and delivering such information to end users. For example, the North American Carbon Program (NACP) Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) chose a modified SYNMAP land cover data set as one of the input drivers for participating terrestrial biospheric models. The global 1 km resolution SYNMAP data set was created by harmonizing three remote sensing-based land cover products: GLCC, GLC2000, and the MODIS land cover product. The original SYNMAP land cover data was aggregated into half and quarter degree resolution and then enhanced with more detailed grassland and cropland types. Currently, there is no effective mechanism to convey this data processing information to the different modeling teams so that they can determine whether a data product meets their needs; the process still relies heavily on offline human interaction. The NASA-sponsored ORNL DAAC has leveraged contemporary digital object repository technology to promote the representation, management, and delivery of data processing history and provenance information. Within a digital object repository, different data products are managed as objects, with metadata as attributes and content delivery and management services as dissemination methods. Derivation relationships among data products can be semantically referenced between digital objects. Within the repository, data users can easily track a derived data product back to its origin, explore metadata and documents about each intermediate data product, and discover the processing details involved in each derivation step.
Coupled with the Drupal Web Content Management System, the digital repository interface was enhanced to provide an intuitive graphic representation of the data processing history. Each data product is also associated with a formal metadata record in the FGDC standard; the main fields of the FGDC record are indexed for search and displayed as attributes of the data product. These features enable data users to better understand and consume a data product. The representation of data processing history in a digital repository can further promote long-term data preservation. Lineage information is a key element in making digital data understandable and usable long into the future. Derivation references can be set up between digital objects not only within a single digital repository, but also across multiple distributed digital repositories. Along with emerging identification mechanisms, such as the Digital Object Identifier (DOI), a flexible distributed digital repository network can be set up to better preserve digital content. In this presentation, we describe how digital content repository technology can be used to manage, preserve, and deliver digital data processing history information in the Earth Science research domain, with selected data archived in the ORNL DAAC and the Model and Synthesis Thematic Data Center (MAST-DC) as testing targets.
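    The derivation tracking described above can be sketched as a simple lineage walk: each digital object records the objects it was derived from, and a query follows those references back to the origin. A minimal sketch, using the SYNMAP example from the record (the object identifiers are invented for illustration):

```python
# Minimal sketch of lineage tracking between digital objects: each object
# records what it was derived from, so a derived product can be traced
# back to its origins. Object identifiers are illustrative only.
DERIVED_FROM = {
    "SYNMAP_half_degree": ["SYNMAP_1km"],
    "SYNMAP_1km": ["GLCC", "GLC2000", "MODIS_land_cover"],
}

def lineage(product: str) -> list[str]:
    """Return all ancestors of a data product, nearest first."""
    ancestors = []
    queue = list(DERIVED_FROM.get(product, []))
    while queue:
        parent = queue.pop(0)       # breadth-first walk of derivation refs
        ancestors.append(parent)
        queue.extend(DERIVED_FROM.get(parent, []))
    return ancestors

print(lineage("SYNMAP_half_degree"))
# ['SYNMAP_1km', 'GLCC', 'GLC2000', 'MODIS_land_cover']
```

In a real repository the derivation references would be metadata attributes on the digital objects (resolvable across repositories, e.g. via DOIs) rather than an in-memory dictionary, but the traversal is the same.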

  11. The Nevada Initiative: A Risk Communication Fiasco

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flynn, J.; Slovic, P.; Mertz, C.K.

    The U.S. Congress has designated Yucca Mountain, Nevada as the only potential site to be studied for the nation's first high-level nuclear waste repository. People in Nevada strongly oppose the program, managed by the U.S. Department of Energy. Survey research shows that the public believes there are great risks from a repository program, in contrast to a majority of scientists who feel the risks are acceptably small. Delays in the repository program, resulting in part from public opposition in Nevada, have concerned the nuclear power industry, which collects the fees for the federal repository program and believes it needs the repository as a final disposal facility for its high-level nuclear wastes. To assist the repository program, the American Nuclear Energy Council (ANEC), an industry group, sponsored a massive advertising campaign in Nevada. The campaign attempted to assure people that the risks of a repository were small and that the repository studies should proceed. The campaign failed because its managers misunderstood the issues underlying the controversy, attempted a covert manipulation of public opinion that was revealed, and, most importantly, lacked the public trust necessary to communicate credibly about the risks of a nuclear waste facility. This article describes the advertising campaign and its effects. The manner in which the ANEC campaign itself became a controversial public issue is reviewed. The advertising campaign is discussed as it relates to risk assessment and communication. 29 refs., 2 tabs.

  12. Best Practices in NASA's Astrophysics Education and Public Outreach Projects

    NASA Astrophysics Data System (ADS)

    Hasan, H.; Smith, D.

    2015-11-01

    NASA's Astrophysics Education and Public Outreach (EPO) program has partnered scientists and educators since its inception almost twenty years ago, leading to authentic STEM experiences and products widely used by the education and outreach community. We present examples of best practices and representative projects. Keys to success include effective use of unique mission science/technology, attention to audience needs, coordination of effort, robust partnerships and publicly accessible repositories of EPO products. Projects are broadly targeted towards audiences in formal education, informal education, and community engagement. All NASA programs are evaluated for quality and impact. New technology is incorporated to engage young students being raised in the digital age. All projects focus on conveying the excitement of scientific discoveries from NASA's Astrophysics missions, advancing scientific literacy, and engaging students in science and technology careers.

  13. 17 CFR 49.10 - Acceptance of data.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... technological protocols established by a swap data repository shall provide for the receipt of swap creation data, swap continuation data, real-time public reporting data, and all other data and information... swap data repository shall adopt policies and procedures, including technological protocols, which...

  14. The NASA Scientific and Technical Information (STI) Program's Implementation of Open Archives Initiation (OAI) for Data Interoperability and Data Exchange

    NASA Technical Reports Server (NTRS)

    Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.

    2002-01-01

    Interoperability and data exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public. More information is to be made available through web-enabled technologies, and programs such as NASA's Scientific and Technical Information (STI) Program Office are tasked with finding more effective ways to disseminate information to the public. The NASA STI Program is an agency-wide program charged with gathering, organizing, storing, and disseminating NASA-produced information for research and public use. The program is investigating the use of a new protocol called the Open Archives Initiative (OAI) as a means to improve data interoperability and data collection. OAI promotes the use of the OAI harvesting protocol as a simple way to share data among repositories. The STI Program is implementing OAI in two separate initiatives. In collaboration with the Air Force, the Department of Energy, and Old Dominion University, the NASA STI Program has funded research on implementing OAI to exchange data among the three organizations. The second initiative is the deployment of OAI for the NASA technical report server (TRS) environment, which is comprised of distributed technical report servers with a centralized search interface. This paper focuses on the implementation of OAI to promote interoperability among diverse data repositories.
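    The OAI harvesting protocol mentioned above (OAI-PMH) is deliberately simple: a harvester issues plain HTTP GET requests carrying one of a small set of verbs such as Identify or ListRecords. A minimal sketch of building such a request; the repository endpoint is a hypothetical placeholder, while oai_dc (unqualified Dublin Core) is the metadata format every OAI-PMH repository is required to support:

```python
from urllib.parse import urlencode

def oai_request(base_url: str, verb: str, **params: str) -> str:
    """Build an OAI-PMH request URL (the protocol is plain HTTP GET)."""
    query = urlencode({"verb": verb, **params})
    return f"{base_url}?{query}"

# Hypothetical repository endpoint, for illustration only.
url = oai_request(
    "https://example.org/oai",
    "ListRecords",
    metadataPrefix="oai_dc",
)
print(url)
# https://example.org/oai?verb=ListRecords&metadataPrefix=oai_dc
```

Fetching that URL returns an XML document of records; large result sets are paged with a resumptionToken parameter, passed back through the same URL-building step.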

  15. AdaNET phase 0 support for the AdaNET Dynamic Software Inventory (DSI) management system prototype. Catalog of available reusable software components

    NASA Technical Reports Server (NTRS)

    Hanley, Lionel

    1989-01-01

    The Ada Software Repository is a public-domain collection of Ada software and information. It is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and it has been available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories, organized by topic; their names and a brief overview of their topics are provided. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.

  16. Final environmental impact statement. Management of commercially generated radioactive waste. Volume 1 of 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-10-01

    This EIS analyzes the significant environmental impacts that could occur if various technologies for management and disposal of high-level and transuranic wastes from commercial nuclear power reactors were to be developed and implemented. This EIS will serve as the environmental input for the decision on which technology, or technologies, will be emphasized in further research and development activities in the commercial waste management program. The action proposed in this EIS is to (1) adopt a national strategy to develop mined geologic repositories for disposal of commercially generated high-level and transuranic radioactive waste (while continuing to examine subseabed and very deep hole disposal as potential backup technologies) and (2) conduct an R and D program to develop such facilities and the necessary technology to ensure the safe long-term containment and isolation of these wastes. The Department has considered in this statement: development of conventionally mined deep geologic repositories for disposal of spent fuel from nuclear power reactors and/or radioactive fuel reprocessing wastes; balanced development of several alternative disposal methods; and no waste disposal action. This EIS reflects the public review of and comments offered on the draft statement. Included are descriptions of the characteristics of nuclear waste, the alternative disposal methods under consideration, and the potential environmental impacts and costs of implementing these methods. Because of the programmatic nature of this document and the preliminary nature of certain design elements assumed in assessing the environmental consequences of the various alternatives, this study has been based on generic, rather than specific, systems. At such time as specific facilities are identified for particular sites, statements addressing site-specific aspects will be prepared for public review and comment.

  17. Factors Affecting Faculty Acceptance and Use of Institutional Repositories in Thailand

    ERIC Educational Resources Information Center

    Ammarukleart, Sujira

    2017-01-01

    Institutional repositories have been introduced as an innovative and alternative technology for scholarly communication and have received considerable attention from scholars across disciplines and around the globe. While some universities in Thailand have developed and implemented institutional repositories for nearly a decade, knowledge of the…

  18. Assessing repository technology. Where do we go from here?

    NASA Technical Reports Server (NTRS)

    Eichmann, David

    1992-01-01

    Three sample information retrieval systems, archie, autoLib, and Wide Area Information Service (WAIS), are compared with regard to their expressiveness and usefulness, first in the general context of information retrieval, and then as prospective software reuse repositories. While the representational capabilities of these systems are limited, they provide a useful foundation for future repository efforts, particularly from the perspective of repository distribution and coherent user interface design.

  19. Assessing repository technology: Where do we go from here?

    NASA Technical Reports Server (NTRS)

    Eichmann, David A.

    1992-01-01

    Three sample information retrieval systems, archie, autoLib, and Wide Area Information Service (WAIS), are compared with regard to their expressiveness and usefulness, first in the general context of information retrieval, and then as prospective software reuse repositories. While the representational capabilities of these systems are limited, they provide a useful foundation for future repository efforts, particularly from the perspective of repository distribution and coherent user interface design.

  20. NeuroVault.org: A repository for sharing unthresholded statistical maps, parcellations, and atlases of the human brain.

    PubMed

    Gorgolewski, Krzysztof J; Varoquaux, Gael; Rivera, Gabriel; Schwartz, Yannick; Sochat, Vanessa V; Ghosh, Satrajit S; Maumet, Camille; Nichols, Thomas E; Poline, Jean-Baptiste; Yarkoni, Tal; Margulies, Daniel S; Poldrack, Russell A

    2016-01-01

    NeuroVault.org is dedicated to storing outputs of analyses in the form of statistical maps, parcellations and atlases, a unique strategy that contrasts with most neuroimaging repositories that store raw acquisition data or stereotaxic coordinates. Such maps are indispensable for performing meta-analyses, validating novel methodology, and deciding on precise outlines for regions of interest (ROIs). NeuroVault is open to maps derived from both healthy and clinical populations, as well as from various imaging modalities (sMRI, fMRI, EEG, MEG, PET, etc.). The repository uses modern web technologies such as interactive web-based visualization, cognitive decoding, and comparison with other maps to provide researchers with efficient, intuitive tools to improve the understanding of their results. Each dataset and map is assigned a permanent Universal Resource Locator (URL), and all of the data is accessible through a REST Application Programming Interface (API). Additionally, the repository supports the NIDM-Results standard and has the ability to parse outputs from popular FSL and SPM software packages to automatically extract relevant metadata. This ease of use, modern web-integration, and pioneering functionality holds promise to improve the workflow for making inferences about and sharing whole-brain statistical maps. Copyright © 2015 Elsevier Inc. All rights reserved.
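    Because each NeuroVault map has a permanent URL and the data are exposed through a REST API, programmatic access reduces to ordinary HTTP requests against predictable resource URLs. A minimal sketch of constructing such URLs; the endpoint paths here follow common REST conventions and are assumptions for illustration, not taken from NeuroVault's API documentation:

```python
from urllib.parse import urljoin

# Sketch of addressing repository resources through a REST API. The
# resource paths below are assumed REST-style conventions, not verified
# against NeuroVault's actual API documentation.
API_BASE = "https://neurovault.org/api/"

def resource_url(kind: str, resource_id: int) -> str:
    """Build the URL for a single API resource, e.g. an image or collection."""
    return urljoin(API_BASE, f"{kind}/{resource_id}/")

print(resource_url("images", 42))
# https://neurovault.org/api/images/42/
```

An HTTP GET on such a URL would return the resource's metadata (typically JSON), which is what makes the repository's maps scriptable for meta-analysis pipelines.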

  1. International Approaches for Nuclear Waste Disposal in Geological Formations: Report on Fifth Worldwide Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faybishenko, Boris; Birkholzer, Jens; Persoff, Peter

    2016-09-01

    The goal of the Fifth Worldwide Review is to document evolution in the state of the art of approaches for nuclear waste disposal in geological formations since the Fourth Worldwide Review, which was released in 2006. The ten years since the previous Worldwide Review have seen major developments in a number of nations throughout the world pursuing geological disposal programs, both in preparing and in reviewing safety cases for the operational and long-term safety of proposed and operating repositories. The countries that are approaching implementation of geological disposal will increasingly focus on the feasibility of safely constructing and operating their repositories in the short and long term on the basis of existing regulations. The WWR-5 also addresses a number of specific technical issues in safety case development, along with the interplay among stakeholder concerns, technical feasibility, engineering design issues, and operational and post-closure safety. Preparation and publication of the Fifth Worldwide Review on nuclear waste disposal facilitates assessing lessons learned and developing future cooperation between the countries. The Report provides scientific and technical experience on preparing for and developing the scientific and technical bases for nuclear waste disposal in deep geologic repositories, in terms of requirements, societal expectations, and the adequacy of cases for long-term repository safety. The chapters include potential issues that may arise as repository programs mature, and identify techniques that demonstrate the safety cases and aid in promoting and gaining societal confidence.
The report will also be used to exchange experience with other fields of industry and technology in which concepts similar to the design and safety cases are applied, as well as to facilitate public perception and understanding of the safety of the disposal approaches relative to risks that may increase over long time frames in the absence of a successful implementation of final dispositioning.

  2. Nuclear energy related capabilities at Sandia National Laboratories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pickering, Susan Y.

    2014-02-01

    Sandia National Laboratories' technology solutions are relied upon to address national and global threats to peace and freedom. Through science and technology, people, infrastructure, and partnerships, part of Sandia's mission is to meet national needs in the areas of energy, climate, and infrastructure security. Within this mission to ensure clean, abundant, and affordable energy and water are the Nuclear Energy and Fuel Cycle Programs, which have a broad range of capabilities, including both physical facilities and intellectual expertise. These resources are brought to bear upon the key scientific and engineering challenges facing the nation and can be made available to address the research needs of others. Sandia can support the safe, secure, reliable, and sustainable use of nuclear power worldwide by incorporating state-of-the-art technologies in safety, security, nonproliferation, transportation, modeling, repository science, and system demonstrations.

  3. Beyond Academic Evidence: Innovative Uses of Technology Within e-Portfolios in a Doctor of Nursing Practice Program.

    PubMed

    Haverkamp, Jacqueline J; Vogt, Marjorie

    2015-01-01

    Portfolios have been used in higher education for the past several years for assessment of student learning and growth and serve as the basis for summative and formative evaluations. While there is some information in the literature on how undergraduate and graduate medical, nursing, and allied health students might use portfolios to showcase acquired knowledge and skills, there is a dearth of information on the use of e-Portfolios with students in doctor of nursing practice programs. There are also limited findings regarding the creative use of technology (that includes infographics and other multimedia tools) to enhance learning outcomes (Stephens & Parr, 2013). This article presents engaging and meaningful ways technology can be used within e-Portfolios. Thus, e-Portfolios become more than a repository for academic evidence; they become unique stories that reflect the breadth and depth of students' learner-centered outcomes. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Principles of Product Quality Control of German Radioactive Waste Forms from the Reprocessing of Spent Fuel: Vitrification, Compaction and Numerical Simulation - 12529

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tietze-Jaensch, Holger; Schneider, Stephan; Aksyutina, Yuliya

    2012-07-01

    The German product quality control is responsible, inter alia, for the control of two radioactive waste forms of heat-generating waste: a) homogeneous vitrified HLW and b) heterogeneous compacted hulls, end-pieces, and technological metallic waste. In either case, significantly different metrology is employed at the site of the conditioning plant for the obligatory nuclide inventory declaration. To facilitate independent evaluation and checking of the accompanying documentation, numerical simulations are carried out. The physical and chemical properties of radioactive waste residues are used to assess data consistency and uncertainty margins, as well as to predict the long-term behavior of the radioactive waste. This is relevant for repository acceptance and safety considerations. Our new numerical approach follows a bottom-up simulation starting from the burn-up behavior of the fuel elements in the reactor core. The output of these burn-up calculations is then coupled with a program that simulates the material separation in the subsequent dissolution and extraction processes, normalized to the mass balance. Follow-up simulations of the separated reprocessing lines of a) the vitrification of highly active liquid and b) the compaction of residual intermediate-active metallic hulls remaining after fuel pellet dissolution, end-pieces, and technological waste allow calculating expectation values for the various repository-relevant properties of either waste stream. The principles of the German product quality control of radioactive waste residues from spent fuel reprocessing have been introduced and explained, namely for heat-generating homogeneous vitrified HLW and heterogeneous compacted metallic MLW. The advantages of a complementary numerical property simulation have been made clear and examples of its benefits are presented.
We have compiled a new program suite to calculate the physical and radio-chemical properties of common nuclear waste residues. The immediate benefit is the independent assessment of radioactive inventory declarations and much-facilitated product quality control of waste residues that need to be returned to Germany and subjected to German HLW repository requirements. Wherever possible, internationally accepted standard programs are used and embedded. The innovative coupling of burn-up calculations (SCALE) with neutron and gamma transport codes (MCNP-X) allows application to virtual waste properties: if-then-else scenarios of hypothetical waste material compositions and distributions provide valuable information on long-term nuclide property propagation under repository conditions over a very long time span. Benchmarking the program with real residue data demonstrates the power and remarkable accuracy of this numerical approach, boosting confidence in the aforementioned applications, namely the proof tool set for on-the-spot production quality checking, data evaluation, and independent verification. Moreover, using the numerical bottom-up approach helps to avoid the accumulation of fictitious activities that may gradually build up in a repository from so-called conservative or penalizing nuclide inventory declarations. The radioactive waste properties and the hydrolytic and chemical stability can be predicted, the interaction with invasive chemicals can be assessed, and propagation scenarios can be developed from reliable and sound data and HLW properties. Hence, the appropriate design of a future HLW repository can be based upon predictable and quality-assured waste characteristics. (authors)

  5. Personality over Policy: A Comparative History of the Founding and Early Development of Four Significant American Manuscript Repositories of Business, Industry, and Technology

    ERIC Educational Resources Information Center

    Nordberg, Erik C.

    2017-01-01

    This dissertation compares and contrasts the founding and early manuscript collecting activities of four publicly accessible American archival repositories known for their extensive holdings in business, industrial, and technological history: the Baker Library at Harvard University in Boston, Massachusetts; the Hagley Library and Museum in…

  6. Design and Development of an Institutional Repository at the Indian Institute of Technology Kharagpur

    ERIC Educational Resources Information Center

    Sutradhar, B.

    2006-01-01

    Purpose: To describe how an institutional repository (IR) was set up, using open source software, at the Indian Institute of Technology (IIT) in Kharagpur. Members of the IIT can publish their research documents in the IR for online access as well as digital preservation. Material in this IR includes instructional materials, records, data sets,…

  7. Perspectives of Future R and D on HLW Disposal in Germany

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steininger, W.J.

    2008-07-01

    The 5th Energy Research Program of the Federal Government, 'Innovation and New Technology', is the general framework for R and D activities in radioactive waste disposal. The Federal Ministry of Economics and Technology (BMWi), the Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (BMU), and the Federal Ministry of Education and Research (BMBF) apply the Research Program according to their respective responsibilities and competences. With regard to the Government's obligation to provide repositories for HLW (spent fuel and vitrified HAW), basic and applied R and D is needed in order to make adequate knowledge available to implementers, decision makers, and stakeholders in general. Non-site-specific R and D projects are funded by BMWi on the basis of its Research Concept. In the first stage (1998-2001), most issues were focused on R and D activities related to HLW disposal in rock salt. By that time the R and D program had to be revised and some prioritization was demanded due to changes in politics. In the current version (2001-2006), emphasis was placed on non-saline rocks. The current Research Concept of BMWi is presently undergoing revision, evaluation, and discussion, inter alia, by experts from several German research institutions. This activity is of special importance against the background of streamlining and focusing the research activities on future demands, priorities, and perspectives with regard to the salt concept and the option of disposing of HLW in argillaceous media. Because the status of knowledge on disposal in rock salt is well advanced, it is necessary to take stock of the current state of the art. In this framework some key projects are currently being carried out. The results may contribute to future decisions to be made in Germany with respect to HLW disposal. The first project deals with the development of an advanced safety concept for a HLW repository in rock salt.
The second project (also carried out in the frame of the 6th Framework Programme of the European Commission) aims at completing and optimizing the direct disposal concept for spent fuel by a full-scale demonstration of the technology of emplacement in vertical boreholes. The third project is devoted to the development of a reference concept for disposing of HLW in a deep geological repository in clay in Germany. In the following, a brief overview is given of the achievements, the projects, and ideas about the consequences for HLW disposal in Germany. (author)

  8. Rocky Flats Environmental Technology Site Ecological Monitoring Program 1995 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-05-31

    The Ecological Monitoring Program (EcMP) was established at the Rocky Flats Environmental Technology Site (Site) in September 1992. At that time, EcMP staff developed a Program Plan that was peer-reviewed by scientists from western universities before submittal to DOE RFFO in January 1993. The intent of the program is to measure several quantitative variables at different ecological scales in order to characterize the Rocky Flats ecosystem. This information is necessary to document ecological conditions at the Site in impacted and nonimpacted areas and to determine whether Site practices have had ecological impacts, either positive or negative. This information can be used by managers interested in future use scenarios and CERCLA activities. Others interested in impact analysis may also find the information useful. In addition, these measurements are entered into a database which will serve as a long-term information repository documenting long-term trends and potential future changes to the Site, both natural and anthropogenic.

  9. 76 FR 81950 - Privacy Act; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... ``Consolidated Data Repository'' (09-90-1000). This system of records is being amended to include records... Repository'' (SORN 09-90-1000). OIG is adding record sources to the system. This system fulfills our..., and investigations of the Medicare and Medicaid programs. SYSTEM NAME: Consolidated Data Repository...

  10. Microsoft Repository Version 2 and the Open Information Model.

    ERIC Educational Resources Information Center

    Bernstein, Philip A.; Bergstraesser, Thomas; Carlson, Jason; Pal, Shankar; Sanders, Paul; Shutt, David

    1999-01-01

    Describes the programming interface and implementation of the repository engine and the Open Information Model for Microsoft Repository, an object-oriented meta-data management facility that ships in Microsoft Visual Studio and Microsoft SQL Server. Discusses Microsoft's component object model, object manipulation, queries, and information…

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harmon, K.M.; Lakey, L.T.; Leigh, I.W.

    Worldwide activities related to nuclear fuel cycle and radioactive waste management programs are summarized. Several trends have developed in waste management strategy: All countries having to dispose of reprocessing wastes plan on conversion of the high-level waste (HLW) stream to a borosilicate glass and eventual emplacement of the glass logs, suitably packaged, in a deep geologic repository. Countries that must deal with plutonium-contaminated waste emphasize plutonium recovery, volume reduction, and fixation in cement or bitumen in their treatment plans and expect to use deep geologic repositories for final disposal. Commercially available, classical engineering processes are being used worldwide to treat and immobilize low- and intermediate-level wastes (LLW, ILW); disposal in surface structures, shallow-land burial, and deep-underground repositories, such as played-out mines, is being done widely with no obvious technical problems. Many countries have established extensive programs to prepare for construction and operation of geologic repositories. Geologic media being studied fall into three main classes: argillites (clay or shale); crystalline rock (granite, basalt, gneiss or gabbro); and evaporites (salt formations). Most nations plan to allow 30 years or longer between discharge of fuel from the reactor and emplacement of HLW or spent fuel in a repository to permit thermal and radioactive decay. Most repository designs are based on the mined-gallery concept, placing waste or spent fuel packages into shallow holes in the floor of the gallery. Many countries have established extensive and costly programs of site evaluation, repository development, and safety assessment. Two other waste management problems are the subject of major R and D programs in several countries: stabilization of uranium mill tailings piles; and immobilization or disposal of contaminated nuclear facilities, namely reactors, fuel cycle plants, and R and D laboratories.

  12. Strategies from a nationwide health information technology implementation: the VA CART story.

    PubMed

    Box, Tamára L; McDonell, Mary; Helfrich, Christian D; Jesse, Robert L; Fihn, Stephan D; Rumsfeld, John S

    2010-01-01

    The VA Cardiovascular Assessment, Reporting, and Tracking (CART) system is a customized electronic medical record system which provides standardized report generation for cardiac catheterization procedures, serves as a national data repository, and is the centerpiece of a national quality improvement program. Like many health information technology projects, CART implementation did not proceed without some barriers and resistance. We describe the nationwide implementation of CART at the 77 VA hospitals which perform cardiac catheterizations in three phases: (1) strategic collaborations; (2) installation; and (3) adoption. Throughout implementation, success required a careful balance of technical, clinical, and organizational factors. We offer strategies developed through CART implementation which are broadly applicable to technology projects aimed at improving the quality, reliability, and efficiency of health care.

  13. Evolution of a Digital Repository: One Institution's Experience

    ERIC Educational Resources Information Center

    Owen, Terry M.

    2011-01-01

    In this article, the development of a digital repository is examined, specifically how the focus on acquiring content for the repository has transitioned from faculty-published research to include the gray literature produced by the research centers on campus, including unpublished technical reports and undergraduate research from honors programs.…

  14. Registered File Support for Critical Operations Files at (Space Infrared Telescope Facility) SIRTF

    NASA Technical Reports Server (NTRS)

    Turek, G.; Handley, Tom; Jacobson, J.; Rector, J.

    2001-01-01

    The SIRTF Science Center's (SSC) Science Operations System (SOS) has to contend with nearly one hundred critical operations files via comprehensive file management services. The management is accomplished via the registered file system (otherwise known as TFS), which manages these files in a registered file repository composed of a virtual file system accessible via a TFS server and a file registration database. The TFS server provides controlled, reliable, and secure file transfer and storage by registering all file transactions and metadata in the file registration database. An API is provided for application programs to communicate with TFS servers and the repository. A command-line client implementing this API has been developed as a client tool. This paper describes the architecture and current implementation and, more importantly, the evolution of these services based on evolving community use cases and emerging information system technology.
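The registration idea described above, every file transaction recorded with its metadata in a database, can be sketched in a few lines. This is a toy stand-in, not the actual TFS schema or API; the table layout and function names are invented:

```python
# Minimal sketch of a "registered file" service in the spirit of the TFS
# described above: every file stored in the repository gets a row in a
# registration database (checksum + metadata). Table and column names are
# our own invention, not the actual SIRTF/TFS schema.
import hashlib
import sqlite3
import time

def make_registry(conn):
    conn.execute("""CREATE TABLE IF NOT EXISTS registry (
        name TEXT PRIMARY KEY, sha256 TEXT, size INTEGER, registered REAL)""")

def register(conn, name, payload: bytes):
    """Record a file's checksum, size, and registration time."""
    digest = hashlib.sha256(payload).hexdigest()
    conn.execute("INSERT OR REPLACE INTO registry VALUES (?, ?, ?, ?)",
                 (name, digest, len(payload), time.time()))
    return digest

def verify(conn, name, payload: bytes):
    """Check a payload against its registered checksum."""
    row = conn.execute("SELECT sha256 FROM registry WHERE name = ?",
                       (name,)).fetchone()
    return row is not None and row[0] == hashlib.sha256(payload).hexdigest()

conn = sqlite3.connect(":memory:")
make_registry(conn)
register(conn, "ops/sequence_001.cmd", b"PARK TELESCOPE\n")
assert verify(conn, "ops/sequence_001.cmd", b"PARK TELESCOPE\n")
assert not verify(conn, "ops/sequence_001.cmd", b"tampered")
```

The checksum-on-registration step is what makes later transfers auditable: any retrieved copy can be verified against the registry rather than trusted blindly.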

  15. ASV3 dial-in interface recommendation for the Repository Based Software Engineering (RBSE) program

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The purpose of this report is to provide insight into the approach and design of the Cooperative User Interface (CUI). The CUI is being developed based on Hypercard technology and will provide the same look and feel as is provided by the NASA Electronic Library System (NELS) X-Window interface. The interaction between the user and ASCII-LIB is presented as well as the set of Hypercard Cards with which the user will work.

  16. Enhancing Ocean Research Data Access

    NASA Astrophysics Data System (ADS)

    Chandler, Cynthia; Groman, Robert; Shepherd, Adam; Allison, Molly; Arko, Robert; Chen, Yu; Fox, Peter; Glover, David; Hitzler, Pascal; Leadbetter, Adam; Narock, Thomas; West, Patrick; Wiebe, Peter

    2014-05-01

    The Biological and Chemical Oceanography Data Management Office (BCO-DMO) works in partnership with ocean science investigators to publish data from research projects funded by the Biological and Chemical Oceanography Sections and the Office of Polar Programs Antarctic Organisms & Ecosystems Program at the U.S. National Science Foundation. Since 2006, researchers have been contributing data to the BCO-DMO data system, and it has developed into a rich repository of data from ocean, coastal and Great Lakes research programs. While the ultimate goal of the BCO-DMO is to ensure preservation of NSF funded project data and to provide open access to those data, achievement of those goals is attained through a series of related phases that benefits from active collaboration and cooperation with a large community of research scientists as well as curators of data and information at complementary data repositories. The BCO-DMO is just one of many intermediate data management centers created to facilitate long-term preservation of data and improve access to ocean research data. Through partnerships with other data management professionals and active involvement in local and global initiatives, BCO-DMO staff members are working to enhance access to ocean research data available from the online BCO-DMO data system. Continuing efforts in use of controlled vocabulary terms, development of ontology design patterns and publication of content as Linked Open Data are contributing to improved discovery and availability of BCO-DMO curated data and increased interoperability of related content available from distributed repositories. We will demonstrate how Semantic Web technologies (e.g. RDF/XML, SKOS, OWL and SPARQL) have been integrated into BCO-DMO data access and delivery systems to better serve the ocean research community and to contribute to an expanding global knowledge network.
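As a toy illustration of the triple-based data model behind the Linked Open Data approach mentioned above (real deployments would use an RDF store and SPARQL; the URIs and parameter names here are invented for the example):

```python
# Toy illustration of the Linked Open Data idea the record describes:
# content expressed as subject-predicate-object triples that can then be
# queried by pattern. Real deployments use RDF stores and SPARQL; this
# pure-Python stand-in (with invented URIs) just shows the data model.
SKOS_LABEL = "http://www.w3.org/2004/02/skos/core#prefLabel"

triples = {
    ("http://example.org/param/chl_a", SKOS_LABEL, "chlorophyll a"),
    ("http://example.org/param/chl_a", "http://example.org/units", "mg/m^3"),
    ("http://example.org/param/no3", SKOS_LABEL, "nitrate"),
}

def query(s=None, p=None, o=None):
    """Match triples against an optional (s, p, o) pattern, SPARQL-style."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

# All preferred labels, analogous to: SELECT ?o WHERE { ?s skos:prefLabel ?o }
labels = [o for _, _, o in query(p=SKOS_LABEL)]
```

Using a shared vocabulary URI (here, SKOS `prefLabel`) rather than an ad hoc column name is what lets distributed repositories interpret each other's content.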

  17. Semantic framework for mapping object-oriented model to semantic web languages

    PubMed Central

    Ježek, Petr; Mouček, Roman

    2015-01-01

    The article discusses two main approaches to building semantic structures for electrophysiological metadata: the use of conventional data structures, repositories, and programming languages on the one hand, and the use of formal representations of ontologies known from knowledge representation, such as description logics or semantic web languages, on the other. Although knowledge engineering offers languages with richer semantic means of expression and technologically advanced approaches, conventional data structures and repositories remain popular among developers, administrators, and users because of their simplicity, overall intelligibility, and lower demands on technical equipment. The choice of conventional data resources and repositories, however, raises the question of how and where to add semantics that cannot be naturally expressed with them. As one possible solution, this semantics can be added into the structures of the programming language that accesses and processes the underlying data. To support this idea we introduced a software prototype that enables its users to add semantically richer expressions into Java object-oriented code. This approach does not burden users with additional demands on the programming environment, since reflective Java annotations were used as an entry point for these expressions. Moreover, the additional semantics need not be written by the programmer directly in the code; it can be collected from non-programmers using a graphical user interface. The mapping that allows the transformation of the semantically enriched Java code into the Semantic Web language OWL was proposed and implemented in a library named the Semantic Framework. This approach was validated by the integration of the Semantic Framework into the EEG/ERP Portal and by the subsequent registration of the EEG/ERP Portal in the Neuroscience Information Framework. PMID:25762923
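The annotation-driven mapping the article describes is Java-specific; as a loose Python analogue (decorators standing in for reflective annotations, with all URIs and names invented for illustration), the idea of attaching semantics to ordinary classes and serializing them as OWL looks roughly like this:

```python
# Loose Python analogue of the article's Java-annotation approach: semantic
# metadata is attached to an ordinary class via decorators and later
# serialized as OWL/Turtle text. URIs and names are invented; this is not
# the Semantic Framework's actual API.
def owl_class(uri):
    def wrap(cls):
        cls._owl_uri = uri
        return cls
    return wrap

def owl_property(**props):
    def wrap(cls):
        cls._owl_props = props
        return cls
    return wrap

@owl_class("http://example.org/eeg#Experiment")
@owl_property(hasSamplingRate="http://example.org/eeg#hasSamplingRate")
class Experiment:
    def __init__(self, sampling_rate_hz):
        self.hasSamplingRate = sampling_rate_hz

def to_turtle(cls):
    """Emit Turtle-style OWL statements from the attached metadata."""
    lines = [f"<{cls._owl_uri}> a owl:Class ."]
    for attr, prop_uri in cls._owl_props.items():
        lines.append(f'<{prop_uri}> a owl:DatatypeProperty ; '
                     f'rdfs:label "{attr}" ; rdfs:domain <{cls._owl_uri}> .')
    return "\n".join(lines)

ttl = to_turtle(Experiment)
```

The key point carried over from the article is that the host-language code stays ordinary: the semantics live in declarative metadata that a separate mapper turns into OWL.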

  18. International Collaboration Activities on Engineered Barrier Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jove-Colon, Carlos F.

    The Used Fuel Disposition Campaign (UFDC) within the DOE Fuel Cycle Technologies (FCT) program has been engaging in international collaborations between repository R&D programs for high-level waste (HLW) disposal to leverage gathered knowledge and laboratory/field data on near- and far-field processes from experiments at underground research laboratories (URLs). Heater test experiments at URLs provide a unique opportunity to study, under realistic conditions, the thermal effects of heat-generating nuclear waste in subsurface repository environments. Various configurations of these experiments have been carried out at various URLs according to the disposal design concepts of the hosting country's repository program. The FEBEX (Full-scale Engineered Barrier Experiment in Crystalline Host Rock) project is a large-scale heater test experiment originated by the Spanish radioactive waste management agency (Empresa Nacional de Residuos Radiactivos S.A. - ENRESA) at the Grimsel Test Site (GTS) URL in Switzerland. The project was subsequently managed by CIEMAT. FEBEX-DP is a concerted effort of various international partners working on the evaluation of sensor data and the characterization of samples obtained during the course of this field test and its subsequent dismantling. The main purpose of these field-scale experiments is to evaluate the feasibility of creating an engineered barrier system (EBS) with a horizontal configuration according to the Spanish concept of deep geological disposal of high-level radioactive waste in crystalline rock. Another key aspect of this project is to improve the knowledge of coupled processes, such as the thermal-hydro-mechanical (THM) and thermal-hydro-chemical (THC) processes operating in the near-field environment. The focus of this effort is on model development and the validation of predictions through model implementation in computational tools that simulate coupled THM and THC processes.

  19. Semantic framework for mapping object-oriented model to semantic web languages.

    PubMed

    Ježek, Petr; Mouček, Roman

    2015-01-01

    The article discusses two main approaches to building semantic structures for electrophysiological metadata: the use of conventional data structures, repositories, and programming languages on the one hand, and the use of formal representations of ontologies known from knowledge representation, such as description logics or semantic web languages, on the other. Although knowledge engineering offers languages with richer semantic means of expression and technologically advanced approaches, conventional data structures and repositories remain popular among developers, administrators, and users because of their simplicity, overall intelligibility, and lower demands on technical equipment. The choice of conventional data resources and repositories, however, raises the question of how and where to add semantics that cannot be naturally expressed with them. As one possible solution, this semantics can be added into the structures of the programming language that accesses and processes the underlying data. To support this idea we introduced a software prototype that enables its users to add semantically richer expressions into Java object-oriented code. This approach does not burden users with additional demands on the programming environment, since reflective Java annotations were used as an entry point for these expressions. Moreover, the additional semantics need not be written by the programmer directly in the code; it can be collected from non-programmers using a graphical user interface. The mapping that allows the transformation of the semantically enriched Java code into the Semantic Web language OWL was proposed and implemented in a library named the Semantic Framework. This approach was validated by the integration of the Semantic Framework into the EEG/ERP Portal and by the subsequent registration of the EEG/ERP Portal in the Neuroscience Information Framework.

  20. 75 FR 66110 - Guidelines for Use of Stored Specimens and Access to Ancillary Data and Proposed Cost Schedule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-27

    ... repository of datasets from completed studies, biospecimens, and ancillary data. The Division intends to make... Sharing Policy. The Division has established an internal committee, the Biospecimen Repository Access and Data Sharing Committee (BRADSC), to oversee the repository access and data sharing program. The purpose...

  1. 75 FR 73095 - Privacy Act of 1974; Report of New System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-29

    ... Repository'' System No. 09-70-0587. The final rule for the Medicare and Medicaid EHR Incentive Program... primary purpose of this system, called the National Level Repository or NLR, is to collect, maintain, and... Maintenance of Data in the System The National Level Repository (NLR) contains information on eligible...

  2. The Geant4 physics validation repository

    NASA Astrophysics Data System (ADS)

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-01

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API, and a web application. The functionality of these components and the technology choices we made are also described.
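A repository like the one described, a relational database holding both experimental reference data and per-version simulation results, can be caricatured with a two-table schema. The layout and values below are our own guess for illustration, not the actual Geant4 repository schema:

```python
# Rough sketch of the kind of relational schema such a validation repository
# might use: one table of experimental reference points, one of simulation
# results keyed to a code version. Table layout and numbers are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE experiment (observable TEXT, energy_mev REAL, value REAL);
CREATE TABLE simulation (version TEXT, observable TEXT, energy_mev REAL, value REAL);
""")
conn.execute("INSERT INTO experiment VALUES ('cross_section', 100.0, 1.23)")
conn.execute("INSERT INTO simulation VALUES ('10.7', 'cross_section', 100.0, 1.20)")

# Regression-style check: relative deviation of simulation from experiment
# at matching observable and energy.
version, rel_dev = conn.execute("""
    SELECT s.version, ABS(s.value - e.value) / e.value
    FROM simulation s JOIN experiment e
      ON s.observable = e.observable AND s.energy_mev = e.energy_mev
""").fetchone()
```

Keeping reference data and results in one relational store is what makes regression testing a query rather than a bespoke script: comparing a new code version against experiment is a join.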

  3. Developments of AMS at the TANDAR accelerator

    NASA Astrophysics Data System (ADS)

    Fernández Niello, J. O.; Abriola, D.; Alvarez, D. E.; Capurro, O. A.; di Tada, M.; Etchegoyen, A.; Ferrero, A. M. J.; Martí, G. V.; Pacheco, A. J.; Testoni, J. E.; Korschinek, G.

    1996-08-01

    Man-made long-lived radioisotopes have been produced as a result of different nuclear technologies. The study of accidental spillages and the determination of radioisotope concentrations in nuclear waste prior to final storage in a repository are subjects of great interest in connection with this activity. The accelerator mass spectrometry (AMS) technique is a powerful tool to measure long-lived isotopes at abundance ratios as low as 10⁻¹² to 10⁻¹⁵ in small samples. Applications to the Argentine nuclear program like those mentioned above, as well as applications to archaeology, hydrology and biomedical research, are considered in an AMS program using the TANDAR 20 UD electrostatic accelerator at Buenos Aires. In this work we present the status of the program and a description of the facility.

  4. Concept document of the repository-based software engineering program: A constructive appraisal

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A constructive appraisal of the Concept Document of the Repository-Based Software Engineering Program is provided. The Concept Document is designed to provide an overview of the Repository-Based Software Engineering (RBSE) Program. The Document should be brief and provide the context for reading subsequent requirements and product specifications. That is, all requirements to be developed should be traceable to the Concept Document. Applied Expertise's analysis of the Document was directed toward assuring that: (1) the Executive Summary provides a clear, concise, and comprehensive overview of the Concept (rewrite as necessary); (2) the sections of the Document make best use of the NASA 'Data Item Description' for concept documents; (3) the information contained in the Document provides a foundation for subsequent requirements; and (4) the document adequately: identifies the problem being addressed; articulates RBSE's specific role; specifies the unique aspects of the program; and identifies the nature and extent of the program's users.

  5. Collaborative Learning Utilizing a Domain-Based Shared Data Repository to Enhance Learning Outcomes

    ERIC Educational Resources Information Center

    Lubliner, David; Widmeyer, George; Deek, Fadi P.

    2009-01-01

    The objective of this study was to determine whether there was a quantifiable improvement in learning outcomes by integrating course materials in a 4-year baccalaureate program, utilizing a knowledge repository with a conceptual map that spans a discipline. Two new models were developed to provide the framework for this knowledge repository. A…

  6. A proposed application programming interface for a physical volume repository

    NASA Technical Reports Server (NTRS)

    Jones, Merritt; Williams, Joel; Wrenn, Richard

    1996-01-01

    The IEEE Storage System Standards Working Group (SSSWG) has developed the Reference Model for Open Storage Systems Interconnection, Mass Storage System Reference Model Version 5. This document provides the framework for a series of standards for application and user interfaces to open storage systems. More recently, the SSSWG has been developing Application Programming Interfaces (APIs) for the individual components defined by the model. The API for the Physical Volume Repository is the most fully developed, but work is also being done on APIs for the Physical Volume Library and for the Mover. The SSSWG meets every other month, and meetings are open to all interested parties. The Physical Volume Repository (PVR) is responsible for managing the storage of removable media cartridges and for mounting and dismounting these cartridges onto drives. This paper describes a model which defines a Physical Volume Repository and gives a brief summary of the Application Programming Interface (API) which the SSSWG is proposing as the standard interface for the PVR.
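The PVR's core responsibility, mounting and dismounting removable cartridges onto drives, can be sketched as a small state machine. The class and method names below are ours, chosen for illustration, not the SSSWG's published API:

```python
# Sketch of the kind of operations a Physical Volume Repository API covers:
# tracking shelved cartridges, free drives, and mounts. Names are invented,
# not the IEEE SSSWG's actual interface.
class PhysicalVolumeRepository:
    def __init__(self, cartridges, drives):
        self.shelved = set(cartridges)   # cartridges sitting in the repository
        self.free_drives = set(drives)
        self.mounted = {}                # cartridge -> drive

    def mount(self, cartridge):
        """Move a shelved cartridge onto a free drive; return the drive."""
        if cartridge not in self.shelved or not self.free_drives:
            raise RuntimeError("cartridge unavailable or no free drive")
        drive = self.free_drives.pop()
        self.shelved.remove(cartridge)
        self.mounted[cartridge] = drive
        return drive

    def dismount(self, cartridge):
        """Return a mounted cartridge to the shelf and free its drive."""
        drive = self.mounted.pop(cartridge)
        self.free_drives.add(drive)
        self.shelved.add(cartridge)

pvr = PhysicalVolumeRepository({"VOL001", "VOL002"}, {"drive0"})
pvr.mount("VOL001")
pvr.dismount("VOL001")
```

A standardized interface at this level is what lets a storage system swap one robotic library for another without touching the layers above.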

  7. The Geant4 physics validation repository

    DOE PAGES

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-23

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API, and a web application. Finally, the functionality of these components and the technology choices we made are also described.

  8. Fifth NASA Goddard Conference on Mass Storage Systems and Technologies.. Volume 1

    NASA Technical Reports Server (NTRS)

    Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)

    1996-01-01

    This document contains copies of those technical papers received in time for publication prior to the Fifth Goddard Conference on Mass Storage Systems and Technologies. As one of an ongoing series, this conference continues to serve as a unique medium for the exchange of information on topics relating to the ingestion and management of substantial amounts of data and the attendant problems involved. This year's discussion topics include storage architecture, database management, data distribution, file system performance and modeling, and optical recording technology. There will also be a paper on Application Programming Interfaces (API) for a Physical Volume Repository (PVR) defined in Version 5 of the Institute of Electrical and Electronics Engineers (IEEE) Reference Model (RM). In addition, there are papers on specific archives and storage products.

  9. The Listening and Spoken Language Data Repository: Design and Project Overview

    ERIC Educational Resources Information Center

    Bradham, Tamala S.; Fonnesbeck, Christopher; Toll, Alice; Hecht, Barbara F.

    2018-01-01

    Purpose: The purpose of the Listening and Spoken Language Data Repository (LSL-DR) was to address a critical need for a systemwide outcome data-monitoring program for the development of listening and spoken language skills in highly specialized educational programs for children with hearing loss highlighted in Goal 3b of the 2007 Joint Committee…

  10. 17 CFR 49.10 - Acceptance of data.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... protocols established by a swap data repository shall provide for the receipt of swap creation data, swap continuation data, real-time public reporting data, and all other data and information required to be reported... repository shall adopt policies and procedures, including technological protocols, which provide for...

  11. 17 CFR 49.10 - Acceptance of data.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... protocols established by a swap data repository shall provide for the receipt of swap creation data, swap continuation data, real-time public reporting data, and all other data and information required to be reported... repository shall adopt policies and procedures, including technological protocols, which provide for...

  12. Space Telecommunications Radio System (STRS) Application Repository Design and Analysis

    NASA Technical Reports Server (NTRS)

    Handler, Louis M.

    2013-01-01

    The Space Telecommunications Radio System (STRS) Application Repository Design and Analysis document describes the STRS application repository for software-defined radio (SDR) applications intended to be compliant with the STRS Architecture Standard. The document provides information about the submission of artifacts to the STRS application repository, provides information to potential users of that information, and helps the systems engineer understand the requirements, concepts, and approach of the STRS application repository. The STRS application repository is intended to capture knowledge, documents, and other artifacts for each waveform application or other application outside of its project so that when the project ends, the knowledge is retained. The document describes the transmission of technology from mission to mission, capturing lessons learned that are used for continuous improvement across projects and supporting NASA Procedural Requirements (NPRs) for performing software engineering projects and NASA's release process.

  13. Geospatial Analysis Tool Kit for Regional Climate Datasets (GATOR) : An Open-source Tool to Compute Climate Statistic GIS Layers from Argonne Climate Modeling Results

    DTIC Science & Technology

    2017-08-01

    This large repository of climate model results for North America (Wang and Kotamarthi 2013, 2014, 2015) is stored in Network Common Data Form (NetCDF...Network Common Data Form (NetCDF). UCAR/Unidata Program Center, Boulder, CO. Available at: http://www.unidata.ucar.edu/software/netcdf. Accessed on 6/20...emissions diverge from each other regarding fossil fuel use, technology, and other socioeconomic factors. As a result, the estimated emissions for each of

  14. Cost Reduction Through the Use of Additive Manufacturing (3d Printing) and Collaborative Product Life Cycle Management Technologies to Enhance the Navy’s Maintenance Programs

    DTIC Science & Technology

    2013-09-01

    the order from the DLA; convenes a meeting with tech librarians, engineers, machinists, quality assurance (QA) inspectors, and mechanics to assess...created, begins the in-house process. 3. Research of Technical Drawings The tech librarian reviews the applicable repository for any tech drawings...applicable to Widget A. If none are found, the tech librarian contacts the OEM and other D-Level activities to find out whether the tech drawing is out

  15. Transuranic inventory reduction in repository by partitioning and transmutation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, C.H.; Kazimi, M.S.

    1992-01-01

    The promise of a new reprocessing technology and the issuance of Environmental Protection Agency (EPA) and U.S. Nuclear Regulatory Commission regulations concerning a geologic repository rekindle interest in the partitioning and transmutation of transuranic (TRU) elements from discharged reactor fuel as a high-level waste management option. This paper investigates the TRU repository inventory reduction capability of the proposed advanced liquid metal reactors (ALMRs) and integral fast reactors (IFRs) as well as plutonium-recycled light water reactors (LWRs).

  16. Mont Terri Underground Rock Laboratory, Switzerland-Research Program And Key Results

    NASA Astrophysics Data System (ADS)

    Nussbaum, C. O.; Bossart, P. J.

    2012-12-01

    Argillaceous formations generally act as aquitards because of their low hydraulic conductivities. This property, together with the large retention capacity of clays for cationic contaminants and the potential for self-sealing, has brought clay formations into focus as potential host rocks for the geological disposal of radioactive waste. Excavated in the Opalinus Clay formation, the Mont Terri underground rock laboratory in the Jura Mountains of NW Switzerland is an important international test site for researching clay formations. Research is carried out in the underground facility, which is located adjacent to the security gallery of the Mont Terri motorway tunnel. Fifteen partners from European countries, USA, Canada and Japan participate in the project. The objectives of the research program are to analyze the hydrogeological, geochemical and rock mechanical properties of the Opalinus Clay, to determine the changes induced by the excavation of galleries and by heating of the rock formation, to test sealing and container emplacement techniques and to evaluate and improve suitable investigation techniques. For the safety of deep geological disposal, it is of key importance to understand the processes occurring in the undisturbed argillaceous environment, as well as the processes in a disturbed system, during the operation of the repository. The objectives are related to: 1. Understanding processes and mechanisms in undisturbed clays and 2. Experiments related to repository-induced perturbations. Experiments of the first group are dedicated to: i) Improvement of drilling and excavation technologies and sampling methods; ii) Estimation of hydrogeological, rock mechanical and geochemical parameters of the undisturbed Opalinus Clay. 
Upscaling of parameters from laboratory to in situ scale; iii) Geochemistry of porewater and natural gases; evolution of porewater over time scales; iv) Assessment of long-term hydraulic transients associated with erosion and thermal scenarios and v) Evaluation of diffusion and retention parameters for long-lived radionuclides. Experiments related to repository-induced perturbations are focused on: i) Influence of rock liner on the disposal system and the buffering potential of the host rock; ii) Self-sealing processes in the excavation damaged zone; iii) Hydro-mechanical coupled processes (e.g. stress redistributions and pore pressure evolution during excavation); iv) Thermo-hydro-mechanical-chemical coupled processes (e.g. heating of bentonite and host rock) and v) Gas-induced transport of radionuclides in porewater and along interfaces in the engineered barrier system. A third research direction is to demonstrate the feasibility of repository construction and long-term safety after repository closure. Demonstration experiments can contribute to improving the reliability of the scientific basis for the safety assessment of future geological repositories, particularly if they are performed on a large scale and with a long duration. These experiments include the construction and installation of engineered barriers on a 1:1 scale: i) Horizontal emplacement of canisters; ii) Evaluation of the corrosion of container materials; repository re-saturation; iii) Sealing of boreholes and repository access tunnels and iv) Long-term monitoring of the repository. References Bossart, P. & Thury, M. (2008): Mont Terri Rock Laboratory. Project, Programme 1996 to 2007 and Results. - Rep. Swiss Geol. Surv. 3.

  17. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    DTIC Science & Technology

    2016-10-01

    above for publication in a peer-reviewed journal 4. Expand the release of the Clinical Scenario Repository (CSR), also known as “Good Ideas for...the Clinical Scenario Repository (CSR). The CSR pilot with the American Society of Anesthesiologists (ASA) Committee on Patient Safety and Education

  18. Embracing the Future: Embedding Digital Repositories in the University of London. Technical Report

    ERIC Educational Resources Information Center

    Hoorens, Stijn; van Dijk, Lidia Villalba; van Stolk, Christian

    2008-01-01

    Digital repositories can help Higher Education Institutions (HEIs) to develop coherent and coordinated approaches to capture, identify, store and retrieve intellectual assets such as datasets, course material and research papers. With the advances of technology, an increasing number of Higher Education Institutions are implementing digital…

  19. Repository-based software engineering program

    NASA Technical Reports Server (NTRS)

    Wilson, James

    1992-01-01

    The activities performed during September 1992 in support of Tasks 01 and 02 of the Repository-Based Software Engineering Program are outlined. The recommendations and implementation strategy defined at the September 9-10 meeting of the Reuse Acquisition Action Team (RAAT) are attached along with the viewgraphs and reference information presented at the Institute for Defense Analyses brief on legal and patent issues related to software reuse.

  20. Coupled Biological-Geomechanical-Geochemical Effects of the Disturbed Rock Zone on the Performance of the Waste Isolation Pilot Plant

    NASA Astrophysics Data System (ADS)

    Dunagan, S. C.; Herrick, C. G.; Lee, M. Y.

    2008-12-01

    The Waste Isolation Pilot Plant (WIPP) is located at a depth of 655 m in bedded salt in southeastern New Mexico and is operated by the U.S. Department of Energy as a deep underground disposal facility for transuranic (TRU) waste. The WIPP must comply with the EPA's environmental regulations that require a probabilistic risk analysis of releases of radionuclides due to inadvertent human intrusion into the repository at some time during the 10,000-year regulatory period. Sandia National Laboratories conducts performance assessments (PAs) of the WIPP using a system of computer codes representing the evolution of the underground repository and the emplaced TRU waste in order to demonstrate compliance. One of the important features modeled in a PA is the disturbed rock zone (DRZ) surrounding the emplacement rooms in the repository. The extent and permeability of the DRZ play a significant role in the potential radionuclide release scenarios. We evaluated the phenomena occurring in the repository that affect the DRZ and their potential effects on the extent and permeability of the DRZ. Furthermore, we examined the DRZ's role in determining the performance of the repository. Pressure in the completely sealed repository will be increased by creep closure of the salt and degradation of TRU waste contents by microbial activity in the repository. An increased pressure in the repository will reduce the extent and permeability of the DRZ. The reduced DRZ extent and permeability will decrease the amount of brine that is available to interact with the waste. Furthermore, the potential for radionuclide release from the repository is dependent on the amount of brine that enters the repository. As a result of these coupled biological-geomechanical-geochemical phenomena, the extent and permeability of the DRZ have a significant impact on the potential radionuclide releases from the repository and, in turn, the repository performance. 
Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy.
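The coupled feedback described in this abstract (gas generation and creep closure raise room pressure; higher pressure heals the DRZ; a tighter DRZ admits less brine; less brine slows microbial degradation) can be caricatured in a few lines. The functional forms and constants below are invented purely for illustration; the WIPP performance assessment uses detailed process-level codes, not this toy loop.

```python
# Purely illustrative toy of the coupled pressure-DRZ-brine feedback.
# All functional forms and constants are invented; only the qualitative
# direction of each coupling follows the abstract.

def step(pressure, brine):
    gas_gen = 0.05 * brine                # microbial gas generation scales with available brine
    pressure = pressure + gas_gen + 0.1   # creep closure adds pressure every step
    permeability = 1.0 / (1.0 + pressure) # DRZ heals (permeability drops) as pressure rises
    brine = permeability * 10.0           # brine inflow scales with DRZ permeability
    return pressure, brine

p, b = 0.0, 10.0
for _ in range(50):
    p, b = step(p, b)

# Pressure climbs while brine availability is throttled, mirroring the
# qualitative behavior the abstract describes.
print(round(p, 2), round(b, 2))
```

The point of the sketch is only that the couplings are self-limiting on brine: each increment of pressure reduces the permeability that feeds the next step's brine term.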

  1. Efficiently Maintaining a National Resource of Historical and Contemporary Biological Collections: The NHLBI Biorepository Model.

    PubMed

    Shea, Katheryn E; Wagner, Elizabeth L; Marchesani, Leah; Meagher, Kevin; Giffen, Carol

    2017-02-01

    Reducing costs by improving storage efficiency has been a focus of the National Heart, Lung, and Blood Institute (NHLBI) Biologic Specimen Repository (Biorepository) and Biologic Specimen and Data Repositories Information Coordinating Center (BioLINCC) programs for several years. Study specimen profiles were compiled using the BioLINCC collection catalog. Cost assessments and calculations of the return on investment to consolidate or reduce a collection were developed and implemented. Over the course of 8 months, the NHLBI Biorepository evaluated 35 collections that consisted of 1.8 million biospecimens. A total of 23 collections were selected for consolidation, with a total of 1.2 million specimens located in 21,355 storage boxes. The consolidation resulted in a savings of 4055 boxes of various sizes and 10.2 mechanical freezers (∼275 cubic feet) worth of space. As storage costs in a biorepository increase over time, the development and use of information technology tools to assess the potential advantage and feasibility of vial consolidation can reduce maintenance expenses.
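The return-on-investment arithmetic behind a consolidation decision like the one above can be sketched in a few lines. Only the box counts (21,355 before, 4055 saved) come from the abstract; the per-freezer capacity and annual operating cost are invented placeholders, not NHLBI figures.

```python
# Illustrative sketch of a vial-consolidation return-on-investment check.
# Box counts come from the abstract; freezer capacity and cost are assumed.

def consolidation_savings(boxes_before, boxes_saved, boxes_per_freezer,
                          annual_cost_per_freezer):
    """Estimate freezers and annual storage cost freed by consolidation."""
    freezers_freed = boxes_saved / boxes_per_freezer
    return {
        "freezers_freed": round(freezers_freed, 1),
        "remaining_boxes": boxes_before - boxes_saved,
        "annual_savings": freezers_freed * annual_cost_per_freezer,
    }

result = consolidation_savings(
    boxes_before=21_355,              # from the abstract
    boxes_saved=4_055,                # from the abstract
    boxes_per_freezer=400,            # assumed mechanical-freezer capacity
    annual_cost_per_freezer=1_500.0,  # assumed operating cost (USD/yr)
)
print(result)  # freezers_freed comes out near the abstract's 10.2 under these assumptions
```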

  2. Cancer Epidemiology Data Repository (CEDR)

    Cancer.gov

    In an effort to broaden access and facilitate efficient data sharing, the Epidemiology and Genomics Research Program (EGRP) has created the Cancer Epidemiology Data Repository (CEDR), a centralized, controlled-access database where investigators can deposit individual-level, de-identified observational cancer datasets.

  3. OWL-based reasoning methods for validating archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: reference model and archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in the achievement of semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual model-based architecture in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations, and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, the two largest publicly available repositories, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around 1/5 of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in the two repositories. This result reinforces the need for making serious efforts to improve archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.
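One error class the abstract names, incorrect restrictions over the reference model, amounts to checking that a specialized archetype only narrows the interval constraints it inherits. The toy sketch below illustrates that check on occurrence intervals; it is not the Archeck tool (which reasons over OWL ontologies), and the node ids and intervals are invented.

```python
# Toy illustration of one archetype-validation check: a specialization
# must only *narrow* inherited (lo, hi) occurrence constraints. Node ids
# and intervals are invented; real validation reasons over OWL ontologies.

def narrows(parent, child):
    """True if the child interval (lo, hi) lies within the parent interval."""
    return parent[0] <= child[0] and child[1] <= parent[1]

def validate_specialization(parent_constraints, child_constraints):
    """Return node ids whose specialized constraint widens the parent's."""
    errors = []
    for node, child_iv in child_constraints.items():
        parent_iv = parent_constraints.get(node)
        if parent_iv is not None and not narrows(parent_iv, child_iv):
            errors.append(node)
    return errors

parent = {"at0001": (0, 1), "at0002": (1, 5)}
child = {"at0001": (0, 1), "at0002": (0, 5)}   # lower bound widened: invalid
print(validate_specialization(parent, child))   # -> ['at0002']
```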

  4. Data repositories for medical education research: issues and recommendations.

    PubMed

    Schwartz, Alan; Pappas, Cleo; Sandlow, Leslie J

    2010-05-01

    The authors explore issues surrounding digital repositories with the twofold intention of clarifying their creation, structure, content, and use, and of considering the implementation of a global digital repository for medical education research data sets: an online site where medical education researchers would be encouraged to deposit their data in order to facilitate its reuse and reanalysis by other researchers. By motivating data sharing and reuse, investigators, medical schools, and other stakeholders might see substantial benefits to their own endeavors and to the progress of the field of medical education. The authors review digital repositories in medicine, social sciences, and education, describe the contents and scope of repositories, and present extant examples. The authors describe the potential benefits of a medical education data repository and report results of a survey of the Society for Directors of Research in Medical Education, in which participants responded to questions about data sharing and a potential data repository. Respondents strongly endorsed data sharing, with the caveat that principal investigators should choose whether or not to share the data they collect. A large majority believed that a repository would benefit their unit and the field of medical education. Few reported using existing repositories. Finally, the authors consider challenges to the establishment of such a repository, including taxonomic organization, intellectual property concerns, human subjects protection, technological infrastructure, and evaluation standards. The authors conclude with recommendations for how a medical education data repository could be successfully developed.

  5. Quality Assurance for Digital Learning Object Repositories: Issues for the Metadata Creation Process

    ERIC Educational Resources Information Center

    Currier, Sarah; Barton, Jane; O'Beirne, Ronan; Ryan, Ben

    2004-01-01

    Metadata enables users to find the resources they require, therefore it is an important component of any digital learning object repository. Much work has already been done within the learning technology community to assure metadata quality, focused on the development of metadata standards, specifications and vocabularies and their implementation…

  6. Working paper : the ITS cost data repository at Mitretek Systems

    DOT National Transportation Integrated Search

    1998-11-30

    Mitretek Systems has been tasked by the Intelligent Transportation Systems (ITS) Joint Program Office (JPO) to collect available information on ITS costs and maintain the information in a cost database, which serves as the ITS Cost Data Repository. T...

  7. Army Hearing Program Talking Points Calendar Year 2015

    DTIC Science & Technology

    2016-12-14

    outside the range of normal hearing sensitivity (greater than 25 dB), CY15 data.  Data: DOEHRS-HC Data Repository, Soldiers who had a DD2215 or...1.  Data: Defense Occupational and Environmental Health Readiness System-Hearing Conservation (DOEHRS-HC) Data Repository, CY15—Army Profile...Soldiers have a hearing loss that required a fit-for-duty (Readiness) evaluation:  An H-3 Hearing Profile.  Data: DOEHRS-HC Data Repository

  8. An ontology based information system for the management of institutional repository's collections

    NASA Astrophysics Data System (ADS)

    Tsolakidis, A.; Kakoulidis, P.; Skourlas, C.

    2015-02-01

    In this paper we discuss a simple methodological approach to creating and customizing institutional repositories for the domain of technological education. The use of the open-source software platform DSpace is proposed to build the repository application and provide access to digital resources including research papers, dissertations, administrative documents, educational material, etc. The use of OWL ontologies is also proposed for indexing and accessing the various heterogeneous items stored in the repository. Customization and operation of a platform for the selection and use of terms, or parts, of similar existing OWL ontologies is also described. This platform could be based on the open-source software Protégé, which supports OWL, is widely used, and also supports visualization, SPARQL queries, etc. The combined use of the OWL platform and the DSpace repository forms a basis for creating customized ontologies, accommodating the semantic metadata of items, and facilitating searching.
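The ontology-backed indexing idea can be illustrated with a minimal in-memory sketch: repository items tagged with ontology terms, and queries expanded through a subclass hierarchy so that a search for a broad term also finds items tagged with its subclasses. The hierarchy, term names, and item ids below are invented; the paper's actual setup uses DSpace with OWL ontologies curated in Protégé.

```python
# Minimal sketch of ontology-term indexing over repository items.
# Terms and items are invented; a real system would load an OWL ontology.

# term -> parent term: a tiny is-a hierarchy, as OWL subclass axioms encode
SUBCLASS_OF = {
    "Dissertation": "EducationalMaterial",
    "LectureNotes": "EducationalMaterial",
    "EducationalMaterial": "Item",
    "ResearchPaper": "Item",
}

ITEMS = [
    ("thesis-042", "Dissertation"),
    ("notes-ml-1", "LectureNotes"),
    ("paper-2015-07", "ResearchPaper"),
]

def ancestors(term):
    """Yield the term and every ancestor up the hierarchy."""
    while term is not None:
        yield term
        term = SUBCLASS_OF.get(term)

def search(term):
    """Return item ids tagged with the term or any subclass of it."""
    return [item for item, tag in ITEMS if term in ancestors(tag)]

print(search("EducationalMaterial"))  # -> ['thesis-042', 'notes-ml-1']
```

A SPARQL engine over an OWL graph generalizes this same subclass-closure lookup, which is what makes ontology indexing more useful than flat keyword tags.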

  9. Disposal of high-level nuclear waste above the water table in arid regions

    USGS Publications Warehouse

    Roseboom, Eugene H.

    1983-01-01

    Locating a repository in the unsaturated zone of arid regions eliminates or simplifies many of the technological problems involved in designing a repository for operation below the water table and predicting its performance. It also offers possible accessibility and ease of monitoring throughout the operational period and possible retrieval of waste long after. The risks inherent in such a repository appear to be no greater than in one located in the saturated zone; in fact, many aspects of such a repository's performance will be much easier to predict and the uncertainties will be reduced correspondingly. A major new concern would be whether future climatic changes could produce significant consequences due to possible rise of the water table or increased flux of water through the repository. If spent fuel were used as a waste form, a second new concern would be the rates of escape of gaseous iodine-129 and carbon-14 to the atmosphere.

  10. The Galileo Teacher Training Programme

    NASA Astrophysics Data System (ADS)

    Doran, Rosa

    The Galileo Teacher Training Program is a global effort to empower teachers all over the world to embark on a new trend in science teaching, using new technologies and real research methods to teach curriculum content. The GTTP goal is to create a worldwide network of "Galileo Ambassadors", promoters of GTTP training sessions, and a legion of "Galileo Teachers", educators engaged in the use of innovative resources, sharing experiences, and supporting their peers worldwide. Through workshops, online training tools and resources, the products and techniques promoted by this program can be adapted to reach locations with few resources of their own, as well as network-connected areas that can take advantage of access to robotic, optical and radio telescopes, webcams, astronomy exercises, cross-disciplinary resources, image processing and digital universes (web and desktop planetariums). Promoters of GTTP are expert astronomy educators connected to universities or EPO institutions who facilitate the consolidation of active support to newcomers and act as a 24-hour helpdesk for teachers all over the world. GTTP will also engage in the creation of a repository of astronomy education resources and science research projects, ViRoS (Virtual Repository of resources and Science Projects), in order to simplify the task of educators willing to enrich classroom activities.

  11. YUCCA MOUNTAIN PROJECT - A BRIEFING --

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NA

    2003-08-05

    This report has the following articles: Nuclear waste--a long-term national problem; Spent nuclear fuel; High-level radioactive waste; Radioactivity and the environment; Current storage methods; Disposal options; U.S. policy on nuclear waste; The focus on Yucca Mountain; The purpose and scope of the Yucca Mountain Project; The approach for permanently disposing of waste; The scientific studies at Yucca Mountain; The proposed design for a repository at Yucca Mountain; Natural and engineered barriers would work together to isolate waste; Meticulous science and technology to protect people and the environment; Licensing a repository; Transporting waste to a permanent repository; The Environmental Impact Statement for a repository; Current status of the Yucca Mountain Project; and Further information available on the Internet.

  12. Integration and Cooperation in the Next Golden Age of Human Space Flight Data Repositories: Tools for Retrospective Analysis and Future Planning

    NASA Technical Reports Server (NTRS)

    Thomas, D.; Fitts, M.; Wear, M.; VanBaalen, M.

    2011-01-01

    As NASA transitions from the Space Shuttle era into the next phase of space exploration, the need to ensure the capture, analysis, and application of its research and medical data is of greater urgency than at any other previous time. In this era of limited resources and challenging schedules, the Human Research Program (HRP) based at NASA's Johnson Space Center (JSC) recognizes the need to extract the greatest possible amount of information from the data already captured, as well as to focus current and future research funding on addressing the HRP goal to provide human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. To this end, the Science Management Office and the Medical Informatics and Health Care Systems Branch within the HRP and the Space Medicine Division have been working to make both research data and clinical data more accessible to the user community. The Life Sciences Data Archive (LSDA), the research repository housing data and information regarding the physiologic effects of microgravity, and the Lifetime Surveillance of Astronaut Health Repository (LSAH-R), the clinical repository housing astronaut data, have joined forces to achieve this goal. The task of both repositories is to acquire, preserve, and distribute data and information both within the NASA community and to the science community at large. This is accomplished via the LSDA's public website (http://lsda.jsc.nasa.gov), which allows access to experiment descriptions including hardware, datasets, key personnel, and mission descriptions, and provides a mechanism for researchers to request additional data, research and clinical, that is not accessible from the public website. This will make the work of NASA and its partners available to the wider science community, both domestic and international. 
The desired outcome is the use of these data for knowledge discovery, retrospective analysis, and planning of future research studies.

  13. Fifth NASA Goddard Conference on Mass Storage Systems and Technologies. Volume 2

    NASA Technical Reports Server (NTRS)

    Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)

    1996-01-01

    This document contains copies of those technical papers received in time for publication prior to the Fifth Goddard Conference on Mass Storage Systems and Technologies held September 17 - 19, 1996, at the University of Maryland, University Conference Center in College Park, Maryland. As one of an ongoing series, this conference continues to serve as a unique medium for the exchange of information on topics relating to the ingestion and management of substantial amounts of data and the attendant problems involved. This year's discussion topics include storage architecture, database management, data distribution, file system performance and modeling, and optical recording technology. There will also be a paper on Application Programming Interfaces (API) for a Physical Volume Repository (PVR) defined in Version 5 of the Institute of Electrical and Electronics Engineers (IEEE) Reference Model (RM). In addition, there are papers on specific archives and storage products.

  14. Metrology for decommissioning nuclear facilities: Partial outcomes of joint research project within the European Metrology Research Program.

    PubMed

    Suran, Jiri; Kovar, Petr; Smoldasova, Jana; Solc, Jaroslav; Van Ammel, Raf; Garcia Miranda, Maria; Russell, Ben; Arnold, Dirk; Zapata-García, Daniel; Boden, Sven; Rogiers, Bart; Sand, Johan; Peräjärvi, Kari; Holm, Philip; Hay, Bruno; Failleau, Guillaume; Plumeri, Stephane; Laurent Beck, Yves; Grisa, Tomas

    2018-04-01

    Decommissioning of nuclear facilities incurs high costs regarding the accurate characterisation and correct disposal of the decommissioned materials. Therefore, there is a need for the implementation of new and traceable measurement technologies to select the appropriate release or disposal route of radioactive wastes. This paper addresses some of the innovative outcomes of the project "Metrology for Decommissioning Nuclear Facilities" related to mapping of contamination inside nuclear facilities, waste clearance measurement, Raman distributed temperature sensing for long term repository integrity monitoring and validation of radiochemical procedures. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. High Integrity Can Design Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaber, E.L.

    1998-08-01

    The National Spent Nuclear Fuel Program is chartered with facilitating the disposition of DOE-owned spent nuclear fuel (SNF) to allow disposal at a geologic repository. This is done through coordination with the repository program and by assisting DOE site owners of SNF with needed information, standardized requirements, packaging approaches, etc. The High Integrity Can (HIC) will be manufactured to provide a substitute or barrier enhancement for normal fuel geometry and cladding. The can would be nested inside the DOE standardized canister, which is designed to interface with the repository waste package. The HIC approach may provide the following benefits over typical canning approaches for DOE SNF. (a) It allows ready calculation and management of criticality issues for miscellaneous SNF. (b) It segments and further isolates damaged or otherwise problem materials from normal SNF in the repository package. (c) It provides a very long-term corrosion barrier. (d) It provides an extra internal pressure barrier for particulates, gaseous fission products, hydrogen, and water vapor. (e) It delays any potential release of fission products to the repository environment. (f) It maintains an additional level of fuel geometry control during design basis accidents, rock-fall, and seismic events. (g) When seal welded, it could provide the additional containment required for shipments involving plutonium content in excess of 20 Ci (10 CFR 71.63(b)) if integrated with an appropriate cask design. Long-term corrosion protection is central to the HIC concept. The material selected for the HIC (Hastelloy C-22) has undergone extensive testing for repository service. The most severe theoretical interactions between iron, repository water containing chlorides, and other repository construction materials have been tested. These expected chemical species have not been shown capable of corroding the selected HIC material. 
Therefore, the HIC should provide a significant barrier to DOE SNF dispersal long after most commercial SNF has degraded and begun moving into the repository environment.

  16. Implementing and Sustaining Data Lifecycle best Practices: a Framework for Researchers and Repositories

    NASA Astrophysics Data System (ADS)

    Stall, S.

    2016-02-01

    Emerging data management mandates, in conjunction with cross-domain international interoperability, are posing new challenges for researchers and repositories. Domain repositories are serving in a critical, growing role, monitoring and leading data management standards and capability within their own repository and working on mappings between repositories internationally. Leading research institutions and companies will also be important as they develop and expand data curation efforts. This landscape poses a number of challenges for developing and ensuring the use of best practices in curating research data, enabling discovery, elevating quality across diverse repositories, and helping researchers collect and organize data through the full data lifecycle. This multidimensional challenge will continue to grow in complexity. The American Geophysical Union (AGU) is developing two programs to help researchers and data repositories develop and elevate best practices and address these challenges. The goal is to provide tools for researchers and repositories, whether domain, institutional, or other, that improve performance throughout the data lifecycle across the Earth and space science community. For scientists and researchers, AGU is developing courses around handling data that can lead toward a certification in geoscience data management. Course materials will cover metadata management and collection, data analysis, integration of data, and data presentation. The course topics are being finalized by the advisory board, with the first one planned to be available later this year. AGU is also developing a program aimed at helping data repositories, large and small, domain-specific to general, assess and improve data management practices. AGU has partnered with the CMMI Institute to adapt their Data Management Maturity (DMM) framework for the Earth and space sciences. 
A data management assessment using the DMM involves identifying accomplishments and weaknesses compared to leading practices for data management. Recommendations can help improve quality and consistency across the community, which will facilitate reuse throughout the data lifecycle. Through the governance, quality, and architecture process areas, the assessment can measure the ability of data to be discoverable and interoperable.

  17. Scientific information repository assisting reflectance spectrometry in legal medicine.

    PubMed

    Belenki, Liudmila; Sterzik, Vera; Bohnert, Michael; Zimmermann, Klaus; Liehr, Andreas W

    2012-06-01

    Reflectance spectrometry is a fast and reliable method for the characterization of human skin if the spectra are analyzed with respect to a physical model describing the optical properties of human skin. For a field study performed at the Institute of Legal Medicine and the Freiburg Materials Research Center of the University of Freiburg, a scientific information repository has been developed, which is a variant of an electronic laboratory notebook and assists in the acquisition, management, and high-throughput analysis of reflectance spectra in heterogeneous research environments. At the core of the repository is a database management system hosting the master data. It is filled with primary data via a graphical user interface (GUI) programmed in Java, which also enables the user to browse the database and access the results of data analysis. The latter is carried out via Matlab, Python, and C programs, which retrieve the primary data from the scientific information repository, perform the analysis, and store the results in the database for further usage.
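The repository pattern described here, a database of master data that analysis programs query and write results back to, can be sketched with the standard library's sqlite3 module. The schema, column names, and values below are invented for illustration; the actual system uses its own database management system, a Java GUI, and Matlab/Python/C analysis codes.

```python
# Sketch of a spectra repository: store master data, run an analysis pass,
# persist derived results. Schema and values are invented for illustration.

import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE spectra (
        id INTEGER PRIMARY KEY,
        subject TEXT, site TEXT, acquired TEXT,
        wavelengths TEXT, reflectance TEXT   -- JSON-encoded arrays
    );
    CREATE TABLE results (
        spectrum_id INTEGER REFERENCES spectra(id),
        mean_reflectance REAL
    );
""")

# Acquisition step: store one (toy) spectrum as primary data.
conn.execute(
    "INSERT INTO spectra (subject, site, acquired, wavelengths, reflectance) "
    "VALUES (?, ?, ?, ?, ?)",
    ("S01", "forearm", "2011-03-14",
     json.dumps([450, 550, 650]), json.dumps([0.31, 0.42, 0.55])),
)

# Analysis step: retrieve primary data, compute, store the derived quantity.
for sid, refl_json in conn.execute("SELECT id, reflectance FROM spectra"):
    refl = json.loads(refl_json)
    conn.execute("INSERT INTO results VALUES (?, ?)", (sid, sum(refl) / len(refl)))

mean, = conn.execute("SELECT mean_reflectance FROM results").fetchone()
print(round(mean, 4))  # -> 0.4267
```

Keeping results in the same store as the primary data is what lets such a repository serve high-throughput reanalysis: the analysis codes only ever talk to the database, never to instrument files.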

  18. Assessment of Effectiveness of Geologic Isolation Systems: REFERENCE SITE INITIAL ASSESSMENT FOR A SALT DOME REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harwell, M. A.; Brandstetter, A.; Benson, G. L.

    1982-06-01

    As a methodology demonstration for the Office of Nuclear Waste Isolation (ONWI), the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program conducted an initial reference site analysis of the long-term effectiveness of a salt dome repository. The Hainesville Salt Dome in Texas was chosen to be representative of the Gulf Coast interior salt domes; however, the Hainesville Site has been eliminated as a possible nuclear waste repository site. The data used for this exercise are not adequate for an actual assessment, nor have all the parametric analyses been made that would adequately characterize the response of the geosystem surrounding the repository. Additionally, because this was the first exercise of the complete AEGIS and Waste Rock Interaction Technology (WRIT) methodology, this report provides the initial opportunity for the methodology, specifically applied to a site, to be reviewed by the community outside the AEGIS. The scenario evaluation, as a part of the methodology demonstration, involved consideration of a large variety of potentially disruptive phenomena, which alone or in concert could lead to a breach in a salt dome repository and to a subsequent transport of the radionuclides to the environment. Without waste- and repository-induced effects, no plausible natural geologic events or processes which would compromise the repository integrity could be envisioned over the one-million-year time frame after closure. Near-field (waste- and repository-induced) effects were excluded from consideration in this analysis, but they can be added in future analyses when that methodology development is more complete. The potential for consequential human intrusion into salt domes within a million-year time frame led to the consideration of a solution mining intrusion scenario. The AEGIS staff developed a specific human intrusion scenario at 100 years and 1000 years post-closure, which is one of a whole suite of possible scenarios. 
This scenario resulted in the delivery of radionuclide-contaminated brine to the surface, where a portion was diverted to culinary salt for direct ingestion by the existing population. Consequence analyses indicated calculated human doses that would be highly deleterious. Additional analyses indicated that doses well above background would occur from such a scenario even if it occurred a million years into the future. The way to preclude such an intrusion is continued control over the repository site, either through direct institutional control or through the effective passive transfer of information. A secondary aspect of the specific human intrusion scenario involved a breach through the side of the salt dome, through which radionuclides migrated via the ground-water system to the accessible environment. This provided a demonstration of the geotransport methodology that AEGIS can use in actual site evaluations, as well as the WRIT program's capabilities with respect to defining the source term and retardation rates of the radionuclides in the repository. This reference site analysis was initially published as a Working Document in December 1979. That version was distributed for a formal peer review by individuals and organizations not involved in its development. The present report represents a revision, based in part on the responses received from the external reviewers. Summaries of the comments from the reviewers and responses to these comments by the AEGIS staff are presented. The exercise of the AEGIS methodology was successful in demonstrating the methodology and, thus, in providing a basis for substantive peer review, in terms of further development of the AEGIS site-applications capability and in terms of providing insight into the potential for consequential human intrusion into a salt dome repository.

  19. Assessment of Effectiveness of Geologic Isolation Systems: REFERENCE SITE INITIAL ASSESSMENT FOR A SALT DOME REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harwell, M. A.; Brandstetter, A.; Benson, G. L.

    1982-06-01

    As a methodology demonstration for the Office of Nuclear Waste Isolation (ONWI), the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program conducted an initial reference site analysis of the long-term effectiveness of a salt dome repository. The Hainesville Salt Dome in Texas was chosen to be representative of the Gulf Coast interior salt domes; however, the Hainesville Site has been eliminated as a possible nuclear waste repository site. The data used for this exercise are not adequate for an actual assessment, nor have all the parametric analyses been made that would adequately characterize the response of the geosystem surrounding the repository. Additionally, because this was the first exercise of the complete AEGIS and Waste-Rock Interaction Technology (WRIT) methodology, this report provides the initial opportunity for the methodology, specifically applied to a site, to be reviewed by the community outside the AEGIS. The scenario evaluation, as a part of the methodology demonstration, involved consideration of a large variety of potentially disruptive phenomena, which alone or in concert could lead to a breach in a salt dome repository and to a subsequent transport of the radionuclides to the environment. Without waste- and repository-induced effects, no plausible natural geologic events or processes which would compromise the repository integrity could be envisioned over the one-million-year time frame after closure. Near-field (waste- and repository-induced) effects were excluded from consideration in this analysis, but they can be added in future analyses when that methodology development is more complete. The potential for consequential human intrusion into salt domes within a million-year time frame led to the consideration of a solution mining intrusion scenario. The AEGIS staff developed a specific human intrusion scenario at 100 years and 1000 years post-closure, which is one of a whole suite of possible scenarios.
This scenario resulted in the delivery of radionuclide-contaminated brine to the surface, where a portion was diverted to culinary salt for direct ingestion by the existing population. Consequence analyses indicated calculated human doses that would be highly deleterious. Additional analyses indicated that doses well above background would occur from such a scenario, even if it occurred a million years into the future. The way to preclude such an intrusion is for continued control over the repository site, either through direct institutional control or through the effective passive transfer of information. A secondary aspect of the specific human intrusion scenario involved a breach through the side of the salt dome, through which radionuclides migrated via the ground-water system to the accessible environment. This provided a demonstration of the geotransport methodology that AEGIS can use in actual site evaluations, as well as the WRIT program's capabilities with respect to defining the source term and retardation rates of the radionuclides in the repository. This reference site analysis was initially published as a Working Document in December 1979. That version was distributed for a formal peer review by individuals and organizations not involved in its development. The present report represents a revision, based in part on the responses received from the external reviewers. Summaries of the comments from the reviewers and responses to these comments by the AEGIS staff are presented. The exercise of the AEGIS methodology was successful in demonstrating the methodology and, thus, in providing a basis for substantive peer review, in terms of further development of the AEGIS site-applications capability and in terms of providing insight into the potential for consequential human intrusion into a salt dome repository.

  20. Credentialing Data Scientists: A Domain Repository Perspective

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Furukawa, H.

    2015-12-01

    A career in data science can have many paths: data curation, data analysis, metadata modeling - all of these in different commercial or scientific applications. Can a certification as 'data scientist' provide the guarantee that an applicant or candidate for a data science position has just the right skills? How valuable is a 'generic' certification as data scientist for an employer looking to fill a data science position? Credentials that are more specific and discipline-oriented may be more valuable to both the employer and the job candidate. One employment sector for data scientists is the data repositories that provide discipline-specific data services for science communities. Data science positions within domain repositories include a wide range of responsibilities in support of the full data life cycle - from data preservation and curation, to development of data models, ontologies, and user interfaces, to data analysis and visualization tools, to community education and outreach - and they require a substantial degree of discipline-specific knowledge of scientific data acquisition and analysis workflows, data quality measures, and data cultures. Can there be certification programs for domain-specific data scientists that help build the urgently needed workforce for the repositories? The American Geophysical Union has recently started an initiative to develop a program for data science continuing education and data science professional certification for the Earth and space sciences. An Editorial Board has been charged to identify and develop curricula and content for these programs and to provide input and feedback in the implementation of the program. This presentation will report on the progress of this initiative and evaluate its utility for the needs of domain repositories in the Earth and space sciences.

  1. U.S. Department of Energy's initiatives for proliferation prevention program: solidification technologies for radioactive waste treatment in Russia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pokhitonov, Y.; Kelley, D.

    Large amounts of liquid radioactive waste have existed in the U.S. and Russia since the 1950s as a result of the Cold War. Comprehensive action to treat and dispose of waste products has been lacking due to factors including insufficient funding, ineffective or unproven technologies, and low priority among governments. Today the U.S. and Russian governments seek new, more reliable methods to treat liquid waste, in particular the legacy waste streams. A primary objective of waste generators and regulators is to find economical and proven technologies that can provide long-term stability for repository storage. In 2001, the V.G. Khlopin Radium Institute (Khlopin), St. Petersburg, Russia, and Pacific Nuclear Solutions (PNS), Indianapolis, Indiana, began extensive research and test programs to determine the validity of polymer technology for the absorption and immobilization of standard and complex waste streams. Over 60 liquid compositions have been tested, including extensive irradiation tests to verify polymer stability and possible degradation. With conclusive scientific evidence of the polymer's effectiveness in treating liquid waste, both parties have decided to enter the Russian market and offer the solidification technology to nuclear sites for waste treatment and disposal. In conjunction with these efforts, the U.S. Department of Energy (DOE) will join Khlopin and PNS to explore opportunities for direct application of the polymers at predetermined sites and to conduct research for new product development. Under DOE's 'Initiatives for Proliferation Prevention' (IPP) program, funding will be provided to the Russian participants over a three-year period to implement the program plan. This paper will present details of U.S. DOE's IPP program, the project structure and its objectives both short- and long-term, training programs for scientists, polymer tests and applications for LLW, ILW and HLW, and new product development initiatives. (authors)

  2. 10 CFR 60.51 - License amendment for permanent closure.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...

  3. 10 CFR 60.51 - License amendment for permanent closure.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...

  4. 10 CFR 60.51 - License amendment for permanent closure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...

  5. 10 CFR 60.51 - License amendment for permanent closure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...

  6. 10 CFR 60.51 - License amendment for permanent closure.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... description of the program for post-permanent closure monitoring of the geologic repository. (2) A detailed... postclosure controlled area and geologic repository operations area by monuments that have been designed... tests, experiments, and any other analyses relating to backfill of excavated areas, shaft sealing, waste...

  7. The impact of using mobile-enabled devices on patient engagement in remote monitoring programs.

    PubMed

    Agboola, Stephen; Havasy, Rob; Myint-U, Khinlei; Kvedar, Joseph; Jethwani, Kamal

    2013-05-01

    Different types of data transmission technologies are used in remote monitoring (RM) programs. This study reports on a retrospective analysis of how participants engage, based on the type of data transfer technology used in a blood pressure (BP) RM program, and its potential impact on RM program design and outcomes. Thirty patients, aged 23-84 years (62 ± 14 years), who had completed at least 2 months in the program and were not participating in any other clinical trial were identified from the Remote Monitoring Data Repository. Half of these patients used wireless-based data transfer devices [wireless-based device (WBD)] while the other half used telephone modem-based data transfer devices [modem-based device (MBD)]. Participants were matched by practice and age. Engagement indices, which include frequency of BP measurements, frequency of data uploads, time to first BP measurement, and time to first data upload, were compared in both groups using the Wilcoxon-Mann-Whitney two-sample rank-sum test. Help desk call data were analyzed by the chi-square test. The frequency of BP measurements and data uploads was significantly higher in the WBD group versus the MBD group [median = 0.66 versus 0.2 measurements/day (p = .01) and 0.46 versus 0.01 uploads/day (p < .001), respectively]. Time to first upload was significantly lower in the WBD group (median = 4 versus 7 days; p = .02), but time to first BP measurement did not differ between the two groups (median = 2 versus 1 day; p = .98). Wireless transmission ensures instantaneous transmission of readings, giving clinicians timely data on which to intervene. Our findings suggest that mobile-enabled wireless technologies can positively impact patient engagement, outcomes, and operational workflow in RM programs. © 2013 Diabetes Technology Society.
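    The engagement comparison above hinges on the Wilcoxon-Mann-Whitney rank-sum test. As an illustration only - the per-day measurement values, group sizes, and variable names below are hypothetical, and the p-value uses the large-sample normal approximation rather than the study's actual computation - the comparison might be sketched as:

    ```python
    from statistics import NormalDist

    def rank_sum_test(a, b):
        """Two-sided Wilcoxon-Mann-Whitney rank-sum test (normal approximation).
        Returns (U statistic for sample a, approximate two-sided p-value)."""
        combined = sorted(a + b)
        # Assign average ranks to tied values.
        ranks = {}
        i = 0
        while i < len(combined):
            j = i
            while j < len(combined) and combined[j] == combined[i]:
                j += 1
            ranks[combined[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
            i = j
        r_a = sum(ranks[x] for x in a)            # rank sum of sample a
        n1, n2 = len(a), len(b)
        u = r_a - n1 * (n1 + 1) / 2               # Mann-Whitney U for sample a
        mu = n1 * n2 / 2
        sigma = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5  # no tie correction
        z = (u - mu) / sigma
        p = 2 * (1 - NormalDist().cdf(abs(z)))
        return u, p

    # Hypothetical BP measurements/day for two small matched groups
    wbd = [0.9, 0.7, 0.66, 0.8, 0.75]
    mbd = [0.2, 0.15, 0.3, 0.1, 0.25]
    u, p = rank_sum_test(wbd, mbd)
    ```

    For groups this small an exact test would normally be preferred; the normal approximation keeps the sketch self-contained.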

  8. The G-Portal Digital Repository as a Potentially Disruptive Pedagogical Innovation

    ERIC Educational Resources Information Center

    Hedberg, John G.; Chang, Chew-Hung

    2007-01-01

    Christensen defined a disruptive innovation or technology as one that eventually takes over the existing dominant technology in the market, despite the fact that the disruptive technology is both radically different to the leading technology and often initially performs worse than the leading technology according to existing measures of…

  9. Establishment of Peripheral Nerve Injury Data Repository to Monitor and Support Population Health Decisions

    DTIC Science & Technology

    2017-07-01

    Award Number: W81XWH-16-0-DM167033. Title: Establishment of Peripheral Nerve Injury Data Repository to Monitor and Support Population Health Decisions. ... patient enrollment. Collected data will be utilized to 1) describe the outcomes of various PNI and 2) suggest outcomes that support population health

  10. A Safety Case Approach for Deep Geologic Disposal of DOE HLW and DOE SNF in Bedded Salt - 13350

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevougian, S. David; MacKinnon, Robert J.; Leigh, Christi D.

    2013-07-01

    The primary objective of this study is to investigate the feasibility and utility of developing a defensible safety case for disposal of United States Department of Energy (U.S. DOE) high-level waste (HLW) and DOE spent nuclear fuel (SNF) in a conceptual deep geologic repository that is assumed to be located in a bedded salt formation of the Delaware Basin [1]. A safety case is a formal compilation of evidence, analyses, and arguments that substantiate and demonstrate the safety of a proposed or conceptual repository. We conclude that a strong initial safety case for potential licensing can be readily compiled by capitalizing on the extensive technical basis that exists from prior work on the Waste Isolation Pilot Plant (WIPP), other U.S. repository development programs, and the work published through international efforts in salt repository programs such as in Germany. The potential benefits of developing a safety case include leveraging previous investments in WIPP to reduce future new repository costs, enhancing the ability to effectively plan for a repository and its licensing, and possibly expediting a schedule for a repository. A safety case will provide the necessary structure for organizing and synthesizing existing salt repository science and identifying any issues and gaps pertaining to safe disposal of DOE HLW and DOE SNF in bedded salt. The safety case synthesis will help DOE to plan its future R&D activities for investigating salt disposal using a risk-informed approach that prioritizes test activities that include laboratory, field, and underground investigations. It should be emphasized that the DOE has not made any decisions regarding the disposition of DOE HLW and DOE SNF. Furthermore, the safety case discussed herein is not intended to either site a repository in the Delaware Basin or preclude siting in other media at other locations.
Rather, this study simply presents an approach for accelerated development of a safety case for a potential DOE HLW and DOE SNF repository using the currently available technical basis for bedded salt. This approach includes a summary of the regulatory environment relevant to disposal of DOE HLW and DOE SNF in a deep geologic repository, the key elements of a safety case, the evolution of the safety case through the successive phases of repository development and licensing, and the existing technical basis that could be used to substantiate the safety of a geologic repository if it were to be sited in the Delaware Basin. We also discuss the potential role of an underground research laboratory (URL). (authors)

  11. Life Sciences Data Archives (LSDA) in the Post-Shuttle Era

    NASA Technical Reports Server (NTRS)

    Fitts, Mary A.; Johnson-Throop, Kathy; Havelka, Jacque; Thomas, Diedre

    2010-01-01

    Now, more than ever before, NASA is realizing the value and importance of their intellectual assets. Principles of knowledge management - the systematic use and reuse of information, experience, and expertise to achieve a specific goal - are being applied throughout the agency. LSDA is also applying these solutions, which rely on a combination of content and collaboration technologies, to enable research teams to create, capture, share, and harness knowledge to do the things they do well, even better. In the early days of spaceflight, space life sciences data were collected and stored in numerous databases, formats, media types and geographical locations. These data were largely unknown/unavailable to the research community. The Biomedical Informatics and Health Care Systems Branch of the Space Life Sciences Directorate at JSC and the Data Archive Project at ARC, with funding from the Human Research Program through the Exploration Medical Capability Element, are fulfilling these requirements through the systematic population of the Life Sciences Data Archive. This project constitutes a formal system for the acquisition, archival and distribution of data for HRP-related experiments and investigations. The general goal of the archive is to acquire, preserve, and distribute these data and be responsive to inquiries from the science communities. Information about experiments and data, as well as non-attributable human data and data from other species, are available on our public Web site http://lsda.jsc.nasa.gov. The Web site also includes a repository for biospecimens, and a utilization process. NASA has undertaken an initiative to develop a Shuttle Data Archive repository. The Shuttle program is nearing its end in 2010, and it is critical that the medical and research data related to the Shuttle program be captured, retained, and usable for research, lessons learned, and future mission planning.
Communities of practice are groups of people who share a concern or a passion for something they do, and learn how to do it better as they interact regularly. LSDA works with the HRP community of practice to ensure that we are preserving the relevant research and data they need in the LSDA repository. An evidence-based approach to risk management is required in space life sciences. Evidence changes over time. LSDA has a pilot project with Collexis, a new type of Web-based search engine. Collexis differentiates itself from full-text search engines by making use of thesauri for information retrieval. The high-quality search is based on semantics that have been defined in a life sciences ontology. Additionally, Collexis' matching technology is unique, allowing discovery of partially matching documents. Users do not have to construct a complicated (Boolean) search query, but can simply enter a free text search without the risk of getting "no results". Collexis may address these issues by virtue of its retrieval and discovery capabilities across multiple repositories.

  12. Developing an Automatic Crawling System for Populating a Digital Repository of Professional Development Resources: A Pilot Study

    ERIC Educational Resources Information Center

    Park, Jung-ran; Yang, Chris; Tosaka, Yuji; Ping, Qing; Mimouni, Houda El

    2016-01-01

    This study is a part of the larger project that develops a sustainable digital repository of professional development resources on emerging data standards and technologies for data organization and management in libraries. Toward that end, the project team developed an automated workflow to crawl for, monitor, and classify relevant web objects…

  13. Loose, Falling Characters and Sentences: The Persistence of the OCR Problem in Digital Repository E-Books

    ERIC Educational Resources Information Center

    Kichuk, Diana

    2015-01-01

    The electronic conversion of scanned image files to readable text using optical character recognition (OCR) software and the subsequent migration of raw OCR text to e-book text file formats are key remediation or media conversion technologies used in digital repository e-book production. Despite real progress, the OCR problem of reliability and…

  14. Workshop on development of radionuclide getters for the Yucca Mountain waste repository: proceedings.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Robert Charles; Lukens, Wayne W.

    The proposed Yucca Mountain repository, located in southern Nevada, is to be the first facility for permanent disposal of spent reactor fuel and high-level radioactive waste in the United States. Total Systems Performance Assessment (TSPA) analysis has indicated that among the major radionuclides contributing to dose are technetium, iodine, and neptunium, all of which are highly mobile in the environment. Containment of these radionuclides within the repository is a priority for the Yucca Mountain Project (YMP). These proceedings review current research and technology efforts for sequestration of the radionuclides, with a focus on technetium, iodine, and neptunium. This workshop also covered issues concerning the Yucca Mountain environment and getter characteristics required for potential placement into the repository.

  15. Measurement and Analysis of P2P IPTV Program Resource

    PubMed Central

    Chen, Xingshu; Wang, Haizhou; Zhang, Qi

    2014-01-01

    With the rapid development of P2P technology, P2P IPTV applications have received more and more attention, and program resource distribution is very important to these applications. In order to collect IPTV program resources, a distributed multi-protocol crawler is proposed; the crawler collected more than 13 million records of IPTV programs from 2009 to 2012. In addition, the distribution of IPTV programs is independent and incompact, resulting in chaotic program names, which obstructs searching for and organizing programs. Thus, we focus on characteristic analysis of program resources, including the distributions of the length of program names, the entropy of the character types, and the hierarchy depth of programs. These analyses reveal the disorderly naming conventions of P2P IPTV programs. The analysis results can help to purify and extract useful information from chaotic names for better retrieval and accelerate automatic sorting of programs and establishment of an IPTV repository. In order to represent the popularity of programs and to predict user behavior and the popularity of hot programs over a period, we also put forward an analytical model of hot programs. PMID:24772008
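    One of the name characteristics analyzed above is the entropy of the character types in a program name. The sketch below shows one plausible way to compute such an entropy; the character categories and sample names are assumptions for illustration, not the paper's actual categorization:

    ```python
    import math
    from collections import Counter

    def char_type(ch):
        """Coarse character categories (an assumed categorization, for illustration)."""
        if ch.isdigit():
            return "digit"
        if ch.isalpha():
            # Separate CJK ideographs from other letters.
            return "cjk" if "\u4e00" <= ch <= "\u9fff" else "letter"
        if ch.isspace():
            return "space"
        return "symbol"

    def type_entropy(name):
        """Shannon entropy (bits) of the character-type distribution of a name."""
        counts = Counter(char_type(c) for c in name)
        total = sum(counts.values())
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    # Hypothetical program names: a uniform name has zero entropy,
    # while a name mixing letters, digits, symbols, and CJK scores higher.
    clean = type_entropy("News")
    noisy = type_entropy("CCTV-1 新闻")
    ```

    A high type entropy flags names that mix scripts, digits, and punctuation, which is one way a crawler could rank candidates for name purification.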

  16. Repository-Based Software Engineering (RBSE) program

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Support of a software engineering program was provided in the following areas: client/customer liaison; research representation/outreach; and program support management. Additionally, a list of deliverables is presented.

  17. Recent technology products from Space Human Factors research

    NASA Technical Reports Server (NTRS)

    Jenkins, James P.

    1991-01-01

    The goals of the NASA Space Human Factors program and the research carried out concerning human factors are discussed with emphasis given to the development of human performance models, data, and tools. The major products from this program are described, which include the Laser Anthropometric Mapping System; a model of the human body for evaluating the kinematics and dynamics of human motion and strength in microgravity environment; an operational experience data base for verifying and validating the data repository of manned space flights; the Operational Experience Database Taxonomy; and a human-computer interaction laboratory whose products are the display softaware and requirements and the guideline documents and standards for applications on human-computer interaction. Special attention is given to the 'Convoltron', a prototype version of a signal processor for synthesizing the head-related transfer functions.

  18. Building a diabetes screening population data repository using electronic medical records.

    PubMed

    Tuan, Wen-Jan; Sheehy, Ann M; Smith, Maureen A

    2011-05-01

    There has been a rapid advancement of information technology in the area of clinical and population health data management since 2000. However, with the fast growth of electronic medical records (EMRs) and the increasing complexity of information systems, it has become challenging for researchers to effectively access, locate, extract, and analyze information critical to their research. This article introduces an outpatient encounter data framework designed to construct an EMR-based population data repository for diabetes screening research. The outpatient encounter data framework is developed on a hybrid data structure of entity-attribute-value models, dimensional models, and relational models. This design preserves a small number of subject-specific tables essential to key clinical constructs in the data repository. It enables atomic information to be maintained in a transparent and meaningful way to researchers and health care practitioners who need to access data and still achieve the same performance level as conventional data warehouse models. A six-layer information processing strategy is developed to extract and transform EMRs to the research data repository. The data structure also complies with both Health Insurance Portability and Accountability Act regulations and the institutional review board's requirements. Although developed for diabetes screening research, the design of the outpatient encounter data framework is suitable for other types of health service research. It may also provide organizations a tool to improve health care quality and efficiency, consistent with the "meaningful use" objectives of the Health Information Technology for Economic and Clinical Health Act. © 2011 Diabetes Technology Society.
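    A hybrid entity-attribute-value (EAV) design like the one described can be sketched with an in-memory SQLite database. The table and attribute names below are hypothetical, chosen only to illustrate how atomic attribute-value rows can coexist with a subject-specific table and be pivoted back into analysis-ready columns:

    ```python
    import sqlite3

    # Minimal EAV sketch: a conventional entity table plus a narrow
    # attribute-value table holding the atomic clinical facts.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE encounter (id INTEGER PRIMARY KEY, patient_id TEXT, seen_on TEXT);
        CREATE TABLE encounter_eav (
            encounter_id INTEGER REFERENCES encounter(id),
            attribute    TEXT,
            value        TEXT
        );
    """)
    con.execute("INSERT INTO encounter VALUES (1, 'P001', '2010-03-14')")
    con.executemany(
        "INSERT INTO encounter_eav VALUES (?, ?, ?)",
        [(1, "bmi", "31.2"), (1, "hba1c", "6.9"), (1, "smoker", "no")],
    )

    # Pivot the EAV rows back into one column per attribute for analysis.
    row = con.execute("""
        SELECT e.patient_id,
               MAX(CASE WHEN a.attribute = 'bmi'   THEN a.value END) AS bmi,
               MAX(CASE WHEN a.attribute = 'hba1c' THEN a.value END) AS hba1c
        FROM encounter e JOIN encounter_eav a ON a.encounter_id = e.id
        GROUP BY e.id
    """).fetchone()
    ```

    The pivot query is where the hybrid design pays off: new attributes need no schema change, yet researchers still get familiar row-per-encounter output.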

  19. Evaluation of Five Sedimentary Rocks Other Than Salt for Geologic Repository Siting Purposes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croff, A.G.; Lomenick, T.F.; Lowrie, R.S.

    The US Department of Energy (DOE), in order to increase the diversity of rock types under consideration by the geologic disposal program, initiated the Sedimentary Rock Program (SERP), whose immediate objective is to evaluate five types of sedimentary rock - sandstone, chalk, carbonate rocks (limestone and dolostone), anhydrock, and shale - to determine their potential for siting a geologic repository. The evaluation of these five rock types, together with the ongoing salt studies, effectively results in the consideration of all types of relatively impermeable sedimentary rock for repository purposes. The results of this evaluation are expressed in terms of a ranking of the five rock types with respect to their potential to serve as a geologic repository host rock. This comparative evaluation was conducted on a non-site-specific basis, by use of generic information together with rock evaluation criteria (RECs) derived from the DOE siting guidelines for geologic repositories (CFR 1984). An information base relevant to rock evaluation using these RECs was developed in hydrology, geochemistry, rock characteristics (rock occurrences, thermal response, rock mechanics), natural resources, and rock dissolution. Evaluation against postclosure and preclosure RECs yielded a ranking of the five subject rocks with respect to their potential as repository host rocks. Shale was determined to be the most preferred of the five rock types, with sandstone a distant second, the carbonate rocks and anhydrock a more distant third, and chalk a relatively close fourth.

  20. Organizing Diverse, Distributed Project Information

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    2003-01-01

    SemanticOrganizer is a software application designed to organize and integrate information generated within a distributed organization or as part of a project that involves multiple, geographically dispersed collaborators. SemanticOrganizer incorporates the capabilities of database storage, document sharing, hypermedia navigation, and semantic interlinking into a system that can be customized to satisfy the specific information-management needs of different user communities. The program provides a centralized repository of information that is both secure and accessible to project collaborators via the World Wide Web. SemanticOrganizer's repository can be used to collect diverse information (including forms, documents, notes, data, spreadsheets, images, and sounds) from computers at collaborators' work sites. The program organizes the information using a unique network-structured conceptual framework, wherein each node represents a data record that contains not only the original information but also metadata (in effect, standardized data that characterize the information). Links among nodes express semantic relationships among the data records. The program features a Web interface through which users enter, interlink, and/or search for information in the repository. By use of this repository, the collaborators have immediate access to the most recent project information, as well as to archived information. A key advantage of SemanticOrganizer is its ability to interlink information together in a natural fashion using customized terminology and concepts that are familiar to a user community.
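    The network-structured framework described above - data records carrying metadata and semantically labeled links - can be sketched in a few lines. The class, relation names, and sample records here are invented for illustration and do not reflect SemanticOrganizer's actual implementation:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        """A repository record: original content identified by name, plus
        descriptive metadata and semantically labeled links to other records."""
        name: str
        metadata: dict = field(default_factory=dict)
        links: list = field(default_factory=list)  # (relation, target Node) pairs

        def link(self, relation, target):
            self.links.append((relation, target))

    # Hypothetical project records, for illustration only.
    sample = Node("soil-sample-42", {"type": "image", "site": "field-A"})
    report = Node("trip-report", {"type": "document"})
    report.link("describes", sample)
    sample.link("described-by", report)

    # Navigate the semantic links from one record to its neighbors.
    neighbors = [(rel, t.name) for rel, t in report.links]
    ```

    Customizing the set of relation names per user community is what lets such a graph speak each project's own terminology while keeping one storage model.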

  1. Using Linked Open Data and Semantic Integration to Search Across Geoscience Repositories

    NASA Astrophysics Data System (ADS)

    Mickle, A.; Raymond, L. M.; Shepherd, A.; Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Cheatham, M.; Fils, D.; Hitzler, P.; Janowicz, K.; Jones, M.; Krisnadhi, A.; Lehnert, K. A.; Narock, T.; Schildhauer, M.; Wiebe, P. H.

    2014-12-01

    The MBLWHOI Library is a partner in the OceanLink project, an NSF EarthCube Building Block, applying semantic technologies to enable knowledge discovery, sharing and integration. OceanLink is testing ontology design patterns that link together: two data repositories, Rolling Deck to Repository (R2R), Biological and Chemical Oceanography Data Management Office (BCO-DMO); the MBLWHOI Library Institutional Repository (IR) Woods Hole Open Access Server (WHOAS); National Science Foundation (NSF) funded awards; and American Geophysical Union (AGU) conference presentations. The Library is collaborating with scientific users, data managers, DSpace engineers, experts in ontology design patterns, and user interface developers to make WHOAS, a DSpace repository, linked open data enabled. The goal is to allow searching across repositories without any of the information providers having to change how they manage their collections. The tools developed for DSpace will be made available to the community of users. There are 257 registered DSpace repositories in the United States and over 1700 worldwide. Outcomes include: Integration of DSpace with OpenRDF Sesame triple store to provide a SPARQL endpoint for the storage and query of RDF representations of DSpace resources, Mapping of DSpace resources to the OceanLink ontology, and a DSpace "data" add-on to provide a resolvable linked open data representation of DSpace resources.
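    The abstract describes exposing DSpace resources through an OpenRDF Sesame SPARQL endpoint. A minimal sketch of querying such an endpoint over the standard SPARQL 1.1 Protocol follows; the endpoint URL and the Dublin Core title predicate are illustrative assumptions (the OceanLink ontology itself is not reproduced here), and the query result is parsed from the standard SPARQL JSON results format.

    ```python
    import json
    import urllib.parse

    # Hypothetical Sesame repository endpoint; the real WHOAS URL is not given here.
    ENDPOINT = "http://example.org/openrdf-sesame/repositories/whoas"

    QUERY = """
    PREFIX dcterms: <http://purl.org/dc/terms/>
    SELECT ?item ?title WHERE {
      ?item dcterms:title ?title .
      FILTER regex(?title, "ocean", "i")
    }
    """

    def request_url(endpoint, query):
        # SPARQL 1.1 Protocol: a query travels as a URL-encoded 'query' parameter.
        return endpoint + "?" + urllib.parse.urlencode({"query": query})

    def titles(sparql_json):
        # Parse the standard SPARQL 1.1 JSON results format.
        doc = json.loads(sparql_json)
        return [b["title"]["value"] for b in doc["results"]["bindings"]]

    # A canned response stands in for an actual HTTP round-trip.
    sample = '{"results": {"bindings": [{"title": {"value": "Ocean currents"}}]}}'
    print(titles(sample))  # ['Ocean currents']
    ```

    The point of the protocol-level approach is exactly what the abstract claims: any repository that speaks SPARQL can be searched this way without changing how its collections are managed.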

  2. The Advanced Technology Operations System: ATOS

    NASA Technical Reports Server (NTRS)

    Kaufeler, J.-F.; Laue, H. A.; Poulter, K.; Smith, H.

    1993-01-01

    Mission control systems supporting new space missions face ever-increasing requirements in terms of functionality, performance, reliability and efficiency. Modern data processing technology is providing the means to meet these requirements in new systems under development. During the past few years the European Space Operations Centre (ESOC) of the European Space Agency (ESA) has carried out a number of projects to demonstrate the feasibility of using advanced software technology, in particular, knowledge based systems, to support mission operations. A number of advances must be achieved before these techniques can be moved towards operational use in future missions, namely, integration of the applications into a single system framework and generalization of the applications so that they are mission independent. In order to achieve this goal, ESA initiated the Advanced Technology Operations System (ATOS) program, which will develop the infrastructure to support advanced software technology in mission operations, and provide applications modules to initially support: Mission Preparation, Mission Planning, Computer Assisted Operations, and Advanced Training. The first phase of the ATOS program is tasked with the goal of designing and prototyping the necessary system infrastructure to support the rest of the program. The major components of the ATOS architecture are presented. This architecture relies on the concept of a Mission Information Base (MIB) as the repository for all information and knowledge which will be used by the advanced application modules in future mission control systems. The MIB is being designed to exploit the latest in database and knowledge representation technology in an open and distributed system. In conclusion, the technological and implementation challenges expected to be encountered, as well as the future plans and time scale of the project, are presented.

  3. The Center for HIV/AIDS Vaccine Immunology (CHAVI) Multi-site Quality Assurance Program for Cryopreserved Human Peripheral Blood Mononuclear Cells

    PubMed Central

    Sarzotti-Kelsoe, Marcella; Needham, Leila K.; Rountree, Wes; Bainbridge, John; Gray, Clive M.; Fiscus, Susan A.; Ferrari, Guido; Stevens, Wendy S.; Stager, Susan L.; Binz, Whitney; Louzao, Raul; Long, Kristy O.; Mokgotho, Pauline; Moodley, Niranjini; Mackay, Melanie; Kerkau, Melissa; McMillion, Takesha; Kirchherr, Jennifer; Soderberg, Kelly A.; Haynes, Barton F.; Denny, Thomas N.

    2014-01-01

    The Center for HIV/AIDS Vaccine Immunology (CHAVI) consortium was established to determine the host and virus factors associated with HIV transmission, infection and containment of virus replication, with the goal of advancing the development of an HIV protective vaccine. Studies to meet this goal required the use of cryopreserved Peripheral Blood Mononuclear Cell (PBMC) specimens, and therefore it was imperative that a quality assurance (QA) oversight program be developed to monitor PBMC samples obtained from study participants at multiple international sites. Nine site-affiliated laboratories in Africa and the USA collected and processed PBMCs, and cryopreserved PBMCs were shipped to CHAVI repositories in Africa and the USA for long-term storage. A three-stage program was designed, based on Good Clinical Laboratory Practices (GCLP), to monitor PBMC integrity at each step of this process. The first stage evaluated the integrity of fresh PBMCs for initial viability, overall yield, and processing time at the site-affiliated laboratories (Stage 1); for the second stage, the repositories determined post-thaw viability and cell recovery of cryopreserved PBMCs received from the site-affiliated laboratories (Stage 2); the third stage assessed long-term specimen storage at each repository (Stage 3). Overall, the CHAVI PBMC QA oversight program results highlight the relative importance of each of these stages to the ultimate goal of preserving specimen integrity from peripheral blood collection to long-term repository storage. PMID:24910414

  4. Software aspects of the Geant4 validation repository

    NASA Astrophysics Data System (ADS)

    Dotti, Andrea; Wenzel, Hans; Elvira, Daniel; Genser, Krzysztof; Yarba, Julia; Carminati, Federico; Folger, Gunter; Konstantinov, Dmitri; Pokorski, Witold; Ribon, Alberto

    2017-10-01

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
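    The article states that DoSSiER records can be retrieved programmatically in JSON or XML exchange formats, but it does not specify the routes or schema. The sketch below is therefore entirely illustrative: the base URL, record path, and field names are placeholders, and format selection is shown via the common Accept-header convention.

    ```python
    import json
    import urllib.request

    # Placeholder base URL; the actual DoSSiER service routes are not documented here.
    BASE = "https://example.org/dossier/api"

    def record_request(record_id, fmt="json"):
        # Content negotiation: the Accept header selects JSON or XML exchange format.
        accept = "application/json" if fmt == "json" else "application/xml"
        return urllib.request.Request(f"{BASE}/records/{record_id}",
                                      headers={"Accept": accept})

    def summarize(record_json):
        # Hypothetical record schema: experiment name vs. simulation version.
        rec = json.loads(record_json)
        return f"{rec['experiment']} vs {rec['simulation']}"

    req = record_request(42)
    print(req.get_header("Accept"))  # application/json
    print(summarize('{"experiment": "HARP", "simulation": "Geant4 10.3"}'))
    ```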

  5. Software Aspects of the Geant4 Validation Repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dotti, Andrea; Wenzel, Hans; Elvira, Daniel

    2016-01-01

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  6. Thermoelastic analysis of spent fuel and high level radioactive waste repositories in salt. A semi-analytical solution. [JUDITH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    St. John, C.M.

    1977-04-01

    An underground repository containing heat generating, High Level Waste or Spent Unreprocessed Fuel may be approximated as a finite number of heat sources distributed across the plane of the repository. The resulting temperature, displacement and stress changes may be calculated using analytical solutions, providing linear thermoelasticity is assumed. This report documents a computer program based on this approach and gives results that form the basis for a comparison between the effects of disposing of High Level Waste and Spent Unreprocessed Fuel.
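    The superposition approach the report documents can be sketched numerically. Because linear thermoelasticity is assumed, temperature changes from individual canisters simply add. The sketch below uses the standard constant-power point-source solution in an infinite conducting medium, dT = Q/(4*pi*K*r) * erfc(r/(2*sqrt(alpha*t))); the report's actual semi-analytical solution, heat-decay treatment, and salt properties are not reproduced here, and the property values and panel layout are illustrative only.

    ```python
    import math

    # Illustrative thermal properties for rock salt (not the report's data).
    K = 5.0        # thermal conductivity, W/(m K)
    ALPHA = 3e-6   # thermal diffusivity, m^2/s

    def point_source_dT(Q, r, t):
        # Temperature rise at distance r, time t, from a constant-power point source.
        return Q / (4.0 * math.pi * K * r) * math.erfc(r / (2.0 * math.sqrt(ALPHA * t)))

    def repository_dT(sources, x, y, z, t):
        # sources: list of (Q_watts, sx, sy, sz); linearity lets us superpose.
        total = 0.0
        for Q, sx, sy, sz in sources:
            r = math.dist((x, y, z), (sx, sy, sz))
            total += point_source_dT(Q, r, t)
        return total

    # A 3 x 3 panel of 1-kW canisters on 20-m spacing, observed 10 m above center.
    panel = [(1000.0, 20.0 * i, 20.0 * j, 0.0)
             for i in range(-1, 2) for j in range(-1, 2)]
    ten_years = 10 * 365.25 * 24 * 3600
    print(round(repository_dT(panel, 0.0, 0.0, 10.0, ten_years), 2))
    ```

    Displacements and stresses follow from the same superposition idea, with the point-source thermoelastic kernels replacing the temperature kernel.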

  7. A science and technology initiative within the office of civilian radioactive waste management

    USGS Publications Warehouse

    Budnitz, R.J.; Kiess, T.E.; Peters, M.; Duncan, D.

    2003-01-01

    In 2002, by following a national decision-making process that had been specified in the 1982 Nuclear Waste Policy Act, Yucca Mountain (YM) was designated as the site for the nation's geologic repository for commercial spent nuclear fuel (SNF). The U.S. Department of Energy's (DOE's) Office of Civilian Radioactive Waste Management (OCRWM) must now obtain regulatory approval to construct and operate a repository there, and to develop transportation and infrastructure needed to support operations. The OCRWM has also recently begun a separate Science and Technology (S&T) initiative, whose purposes, beginnings, current projects, and future plans are described here.

  8. Managing the nation's nuclear waste. Site descriptions: Cypress Creek, Davis Canyon, Deaf Smith, Hanford Reference, Lavender Canyon, Richton Dome, Swisher, Vacherie Dome, and Yucca Mountain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1985-12-31

    In 1982, the Congress enacted the Nuclear Waste Policy Act (Public Law 97-425), which established a comprehensive national program directed toward siting, constructing, and operating geologic repositories for the permanent disposal of high-level radioactive waste. In February 1983, the United States Department of Energy (DOE) identified the nine referenced repository locations as potentially acceptable sites for a mined geologic repository. These sites have been evaluated in accordance with the DOE's General Guidelines for the Recommendation of Sites for Nuclear Waste Repositories. The DOE findings and determinations are based on the evaluations contained in the draft Environmental Assessments (EA). A final EA will be prepared after considering the comments received on the draft EA. The purpose of this document is to provide the public with specific site information on each potential repository location.

  9. Maximizing the Utility of the Serum Repository With Current Technologies and Recommendations to Meet Future Needs: Report of the Technical Panel.

    PubMed

    Baird, Coleen P

    2015-10-01

    The Department of Defense Serum Repository (DoDSR) of the Armed Forces Health Surveillance Center (AFHSC), Silver Spring, Maryland, has over 55 million specimens. Over 80% of these specimens are linked to individual health data. In response to Congressional and Department of Defense (DoD) concern about toxic exposures of deployed Service members and rapidly developing laboratory capabilities that may identify those exposed, the AFHSC hosted two panels in 2013. The first, the Needs Panel, focused on assessing the needs of the DoD that may be met using the current DoDSR and an enhanced repository. The second panel, the Technical Panel, focused on identifying the emerging laboratory technologies that are or will be available to DoD public health workers and researchers. This report summarizes the recommendations of the Technical Panel, to include identified gaps in the ability of the current DoDSR to address questions of interest to the DoD, the availability of laboratory technology to address these needs, and the types and quality of specimens required from Service members possibly exposed. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  10. Towards a semantic PACS: Using Semantic Web technology to represent imaging data.

    PubMed

    Van Soest, Johan; Lustberg, Tim; Grittner, Detlef; Marshall, M Scott; Persoon, Lucas; Nijsten, Bas; Feltens, Peter; Dekker, Andre

    2014-01-01

    The DICOM standard is ubiquitous within medicine. However, improved DICOM semantics would significantly enhance search operations. Furthermore, databases of current PACS systems are not flexible enough for the demands of image analysis research. In this paper, we investigated whether Semantic Web technology can be used to store and represent metadata of DICOM image files, as well as to link additional computational results to image metadata. To this end, we developed a proof of concept containing two applications: one to store commonly used DICOM metadata in an RDF repository, and one to calculate imaging biomarkers based on DICOM images and store the biomarker values in an RDF repository. This enabled us to search for all patients with a gross tumor volume calculated to be larger than 50 cc. We have shown that we can successfully store the DICOM metadata in an RDF repository and are refining our proof of concept with regard to volume naming, value representation, and the applications themselves.
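    The gross-tumor-volume search described above can be illustrated without an RDF store: triples are just (subject, predicate, object) statements, and the query mirrors a SPARQL FILTER. The predicate names and identifiers below are hypothetical, not the paper's actual vocabulary, and a real deployment would run the SPARQL shown in the comment against the repository instead of filtering a Python list.

    ```python
    # Toy triple store standing in for the paper's RDF repository.
    triples = [
        ("patient:1", "has_volume", "gtv:1"),
        ("gtv:1", "volume_cc", 72.5),
        ("patient:2", "has_volume", "gtv:2"),
        ("gtv:2", "volume_cc", 31.0),
    ]

    def patients_with_gtv_over(threshold_cc):
        # Equivalent SPARQL sketch:
        #   SELECT ?p WHERE { ?p :has_volume ?v . ?v :volume_cc ?cc .
        #                     FILTER(?cc > 50) }
        volumes = {s: o for s, p, o in triples if p == "volume_cc"}
        return [s for s, p, o in triples
                if p == "has_volume" and volumes.get(o, 0.0) > threshold_cc]

    print(patients_with_gtv_over(50.0))  # ['patient:1']
    ```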

  11. 77 FR 73669 - Response to Comments Received for the “The Menlo Report: Ethical Principles Guiding Information...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-11

    ... Technology Research'' (``The Menlo Report'') for the Department of Homeland Security (DHS), Science and Technology, Cyber Security Division (CSD), Protected Repository for the Defense of Infrastructure Against Cyber Threats (PREDICT) Project AGENCY: Science and Technology Directorate, DHS. ACTION: Response...

  12. The visualization and availability of experimental research data at Elsevier

    NASA Astrophysics Data System (ADS)

    Keall, Bethan

    2014-05-01

    In the digital age, the visualization and availability of experimental research data is an increasingly prominent aspect of the research process and of the scientific output that researchers generate. We expect that the importance of data will continue to grow, driven by technological advancements, requirements from funding bodies to make research data available, and a developing research data infrastructure that is supported by data repositories, science publishers, and other stakeholders. Elsevier is actively contributing to these efforts, for example by setting up bidirectional links between online articles on ScienceDirect and relevant data sets on trusted data repositories. A key aspect of Elsevier's "Article of the Future" program, these links enrich the online article and make it easier for researchers to find relevant data and articles and help place data in the right context for re-use. Recently, we have set up such links with some of the leading data repositories in Earth Sciences, including the British Geological Survey, Integrated Earth Data Applications, the UK Natural Environment Research Council, and the Oak Ridge National Laboratory DAAC. Building on these links, Elsevier has also developed a number of data integration and visualization tools, such as an interactive map viewer that displays the locations of relevant data from PANGAEA next to articles on ScienceDirect. In this presentation we will give an overview of these and other capabilities of the Article of the Future, focusing on how they help advance communication of research in the digital age.

  13. Rolling Deck to Repository (R2R): Products and Services for the U.S. Research Fleet Community

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Smith, S. R.; Stocks, K. I.

    2016-02-01

    The Rolling Deck to Repository (R2R) program is working to ensure open access to environmental sensor data routinely acquired by the U.S. academic research fleet. Currently 25 vessels deliver 7 TB/year of data to R2R from a suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. R2R ensures these data are preserved in trusted repositories, discoverable via standard protocols, and adequately documented for reuse. R2R has recently expanded to include the vessels Sikuliaq, operated by the University of Alaska; Falkor, operated by the Schmidt Ocean Institute; and Ronald H. Brown and Okeanos Explorer, operated by NOAA. R2R maintains a master catalog of U.S. research cruises, currently holding over 4,670 expeditions including vessel and cruise identifiers, start/end dates and ports, project titles and funding awards, science parties, dataset inventories with instrument types and file formats, data quality assessments, and links to related content at other repositories. Standard post-field cruise products are published including shiptrack navigation, near-real-time MET/TSG data, underway geophysical profiles, and CTD profiles. Software tools available to users include the R2R Event Logger and the R2R Nav Manager. A Digital Object Identifier (DOI) is published for each cruise, original field sensor dataset, standard post-field product, and document (e.g. cruise report) submitted by the science party. Scientists are linked to personal identifiers such as ORCIDs where available. Using standard identifiers such as DOIs and ORCIDs facilitates linking with journal publications and generation of citation metrics. R2R collaborates in the Ocean Data Interoperability Platform (ODIP) to strengthen links among regional and national data systems, populates U.S. cruises in the POGO global catalog, and is working toward membership in the DataONE alliance. 
It is a lead partner in the EarthCube GeoLink project, developing Semantic Web technologies to share data and documentation between repositories, and in the newly-launched EarthCube SeaView project, delivering data from R2R and other ocean data facilities to scientists using the Ocean Data View (ODV) software tool.

  14. GeoSpatial Workforce Development: enhancing the traditional learning environment in geospatial information technology

    NASA Astrophysics Data System (ADS)

    Lawhead, Pamela B.; Aten, Michelle L.

    2003-04-01

    The Center for GeoSpatial Workforce Development is embarking on a new era in education by developing a repository of dynamic online courseware authored by the foremost industry experts within the remote sensing and GIS industries. Virtual classrooms equipped with advanced instructional, computational, communication, course-evaluation, and management facilities amplify these courses to enhance the learning environment and provide rapid feedback between instructors and students. The launch of this program included the objective development of the Model Curriculum by an independent consortium of remote sensing industry leaders. The Center's research and development focuses on recruiting additional industry experts to develop the technical content of the courseware and then utilizing state-of-the-art technology to enhance their material with visually stimulating animations, compelling audio clips, and entertaining, interactive exercises intended to reach the broadest audience possible by targeting various learning styles. The courseware will be delivered via various media: Internet, CD-ROM, DVD, and compressed video, which translates into anywhere, anytime delivery of GeoSpatial Information Technology education.

  15. Using neural networks in software repositories

    NASA Technical Reports Server (NTRS)

    Eichmann, David (Editor); Srinivas, Kankanahalli; Boetticher, G.

    1992-01-01

    The first topic is an exploration of the use of neural network techniques to improve the effectiveness of retrieval in software repositories. The second topic relates to a series of experiments conducted to evaluate the feasibility of using adaptive neural networks as a means of deriving (or more specifically, learning) measures on software. Taken together, these two efforts illuminate a very promising mechanism supporting software infrastructures - one based upon a flexible and responsive technology.

  16. Standards-based metadata procedures for retrieving data for display or mining utilizing persistent (data-DOI) identifiers.

    PubMed

    Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Rzepa, Henry S

    2015-01-01

    We describe three different procedures based on metadata standards for enabling automated retrieval of scientific data from digital repositories utilising the persistent identifier of the dataset with optional specification of the attributes of the data document such as filename or media type. The procedures are demonstrated using the JSmol molecular visualizer as a component of a web page and Avogadro as a stand-alone modelling program. We compare our methods for automated retrieval of data from a standards-compliant data repository with those currently in operation for a selection of existing molecular databases and repositories. Our methods illustrate the importance of adopting a standards-based approach of using metadata declarations to increase access to and discoverability of repository-based data.
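    Retrieval by persistent identifier typically works by resolving the dataset DOI and using an Accept header to request a machine-readable representation (HTTP content negotiation). A minimal sketch follows; the DOI and media type are placeholders, not identifiers from the paper, and the request is constructed but not sent.

    ```python
    import urllib.request

    def doi_request(doi, media_type="application/json"):
        # Resolve via the doi.org proxy; the Accept header negotiates the
        # representation (e.g. a chemical media type for molecular data).
        return urllib.request.Request(f"https://doi.org/{doi}",
                                      headers={"Accept": media_type})

    # Placeholder DOI for illustration only.
    req = doi_request("10.0000/example-dataset", "chemical/x-mdl-molfile")
    print(req.full_url)
    print(req.get_header("Accept"))  # chemical/x-mdl-molfile
    ```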

  17. Repository-based software engineering program: Concept document

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This document provides the context for Repository-Based Software Engineering's (RBSE's) evolving functional and operational product requirements, and it is the parent document for development of detailed technical and management plans. When furnished, requirements documents will serve as the governing RBSE product specification. The RBSE Program Management Plan will define resources, schedules, and technical and organizational approaches to fulfilling the goals and objectives of this concept. The purpose of this document is to provide a concise overview of RBSE, describe the rationale for the RBSE Program, and define a clear, common vision for RBSE team members and customers. The document also provides the foundation for developing RBSE user and system requirements and a corresponding Program Management Plan. The concept is used to express the program mission to RBSE users and managers and to provide an exhibit for community review.

  18. Trustworthy Digital Repositories: Building Trust the Old Fashion Way, EARNING IT.

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Chandler, C. L.; Shepherd, A.; Rauch, S.; Groman, R. C.; Wiebe, P. H.; Glover, D. M.; Allison, M. D.; Copley, N. J.; Ake, H.; York, A.

    2016-12-01

    There are several drivers increasing the importance of high quality data management and curation in today's research process (e.g., OSTP PARR memo, journal publishers, funders, academic and private institutions), and proper management is necessary throughout the data lifecycle to enable reuse and reproducibility of results. Many digital data repositories are capable of satisfying the basic management needs of an investigator looking to share their data (i.e., publish data in the public domain), but repository services vary greatly and not all provide mature services that facilitate discovery, access, and reuse of research data. Domain-specific repositories play a vital role in the data curation process by working closely with investigators to create robust metadata, perform first order QC, and assemble and publish research data. In addition, they may employ technologies and services that enable increased discovery, access, and long-term archive. However, smaller domain facilities operate in varying states of capacity and curation ability. Within this repository environment, individual investigators (driven by publishers, funders, or institutions) need to find trustworthy repositories for their data; and funders need to direct investigators to quality repositories to ensure return on their investment. So, how can one determine the best home for valuable research data? Metrics can be applied to varying aspects of data curation, and many credentialing organizations offer services that assess and certify the trustworthiness of a given data management facility. Unfortunately, many of these certifications can be inaccessible to a small repository in cost, time, or scope. Are there alternatives? This presentation will discuss methods and approaches used by the Biological and Chemical Oceanography Data Management Office (BCO-DMO; a domain-specific, intermediate digital data repository) to demonstrate trustworthiness in the face of a daunting accreditation landscape.

  19. NA-42 TI Shared Software Component Library FY2011 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.

    The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts with the long term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy to use web based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed via two different geographic locations and continues to be used. The knowledge gained from the collaboration and hosting of this repository in conjunction with PNNL software development and systems engineering capabilities were used in the selection of a package to be used in the implementation of the software component library on behalf of NA-42 TI.
The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI in documenting their development activities. When sufficiently completed, the questionnaire illustrates that the software development activities recorded incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams and revised versions distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID AMS specific questionnaire are being used as the initial content to be established in the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and the approach taken for installation of the software needed to host the component library. Additionally, it defines the process by which users request access for the contribution and retrieval of library content.

  20. Ocean Drilling Program: Completed Legs

    Science.gov Websites

    [Flattened table of completed Ocean Drilling Program legs, listing leg numbers, dates, ports, drill sites, and co-chief scientists; recoverable fragments include Leg 102 (14-Mar-85 to 25-Apr-85, Miami, Florida; Site 418, Bermuda Rise) and the New Jersey Sea-Level Transect (Sites 902-906 from Lisbon, Portugal, and Sites 1071-1073 from Nova Scotia).]

  1. Design and Implementation of an International Training Program on Repository Development and Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vugrin, K.W.; Twitchell, Ch.A.

    2008-07-01

    Korea Hydro and Nuclear Power Co., Ltd. (KHNP) is an electric company in the Republic of Korea with twenty operational nuclear power plants and eight additional units that are either planned or currently under construction. Regulations require that KHNP manage the radioactive waste generated by their nuclear power plants. In the course of planning low, intermediate, and high level waste storage facilities, KHNP sought interaction with an acknowledged expert in the field of radioactive waste management and, consequently, contacted Sandia National Laboratories (SNL). KHNP has contracted with SNL to provide a year-long training program on repository science. This paper discusses the design of the curriculum, specific plans for execution of the training program, and recommendations for smooth implementation of international training programs. (authors)

  2. Monte Carlo simulations for generic granite repository studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, Shaoping; Lee, Joon H; Wang, Yifeng

    In a collaborative study between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL) for the DOE-NE Office of Fuel Cycle Technologies Used Fuel Disposition (UFD) Campaign project, we have conducted preliminary system-level analyses to support the development of a long-term strategy for geologic disposal of high-level radioactive waste. A general modeling framework consisting of a near- and a far-field submodel for a granite GDSE was developed. A representative far-field transport model for a generic granite repository was merged with an integrated systems (GoldSim) near-field model. Integrated Monte Carlo model runs with the combined near- and far-field transport models were performed, and the parameter sensitivities were evaluated for the combined system. In addition, a sub-set of radionuclides that are potentially important to repository performance were identified and evaluated for a series of model runs. The analyses were conducted with different waste inventory scenarios. Analyses were also conducted for different repository radionuclide release scenarios. While the results to date are for a generic granite repository, the work establishes the method to be used in the future to provide guidance on the development of strategy for long-term disposal of high-level radioactive waste in a granite repository.
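    The Monte Carlo workflow described above (sample uncertain parameters, propagate them through coupled near-/far-field models, evaluate sensitivities) can be sketched with a toy release proxy. This is not the GoldSim model: the parameter names, ranges, and proxy formula are illustrative assumptions only.

    ```python
    import random
    import statistics

    random.seed(1)  # reproducible realizations

    def one_realization():
        # Sample uncertain far-field parameters on log-uniform ranges (illustrative).
        perm = 10 ** random.uniform(-18, -15)   # fracture permeability, m^2
        kd = 10 ** random.uniform(-2, 1)        # sorption coefficient, m^3/kg
        # Toy proxy: release scales with permeability, is retarded by sorption.
        return perm / (1.0 + 1000.0 * kd)

    results = [one_realization() for _ in range(5000)]

    # A simple sensitivity indicator: spread of the output distribution.
    print(len(results), statistics.median(results) > 0)
    ```

    A real system-level analysis would replace the proxy with the coupled transport models and compute rank correlations between each sampled parameter and the output to rank sensitivities.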

  3. Clone and genomic repositories at the American Type Culture Collection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maglott, D.R.; Nierman, W.C.

    1990-01-01

    The American Type Culture Collection (ATCC) has a long history of characterizing, preserving, and distributing biological resource materials for the scientific community. Starting in 1925 as a repository for standard bacterial and fungal strains, its collections have diversified with technologic advances and in response to the requirements of its users. To serve the needs of the human genetics community, the National Institute of Child Health and Human Development (NICHD), National Institutes of Health (NIH), established an international Repository of Human DNA Probes and Libraries at the ATCC in 1985. This repository expanded the existing collections of recombinant clones and libraries at the ATCC, with the specific purposes of (1) obtaining, amplifying, and distributing probes detecting restriction fragment length polymorphisms (RFLPs); (2) obtaining, amplifying, and distributing genomic and cDNA clones from known genes independent of RFLP detection; (3) distributing the chromosome-specific libraries generated by the National Laboratory Gene Library Project at the Lawrence Livermore and Los Alamos National Laboratories; and (4) maintaining a public, online database describing the repository materials. Because it was recognized that animal models and comparative mapping can be crucial to genomic characterization, the scope of the repository was broadened in February 1989 to include probes from the mouse genome.

  4. Java-based browsing, visualization and processing of heterogeneous medical data from remote repositories.

    PubMed

    Masseroli, M; Bonacina, S; Pinciroli, F

    2004-01-01

    The ongoing development of distributed information technologies and Java programming makes it possible to employ them in the medical arena as well, to support the retrieval, integration and evaluation of heterogeneous data and multimodal images in a web browser environment. With this aim, we used them to implement a client-server architecture based on software agents. The client side is a Java applet running in a web browser that provides a friendly medical user interface to browse and visualize different patient and medical test data, integrating them properly. The server side manages secure connections and queries to heterogeneous remote databases and file systems containing patient personal and clinical data. Based on the Java Advanced Imaging API, processing and analysis tools were developed to support the evaluation of remotely retrieved bioimages through the quantification of their features in different regions of interest. Java platform independence allows the centralized management of the implemented prototype and its deployment to each site where an intranet or internet connection is available. By giving healthcare providers effective support for comprehensively browsing, visualizing and evaluating medical images and records located in different remote repositories, the developed prototype can represent an important aid in providing more efficient diagnoses and medical treatments.

  5. The Telemetric and Holter ECG Warehouse Initiative (THEW): a Data Repository for the Design, Implementation and Validation of ECG-related Technologies

    PubMed Central

    Couderc, Jean-Philippe

    2011-01-01

    We present an initiative supported by the National Heart, Lung, and Blood Institute and the Food and Drug Administration for the development of a repository containing continuous electrocardiographic information to be shared with the worldwide scientific community. We believe that sharing data reinforces open scientific inquiry. It encourages diversity of analysis and opinion while promoting new research and facilitating the education of new researchers. In this paper, we present the resources available in this initiative for the scientific community. We describe the set of ECG signals currently hosted, and we briefly discuss the associated clinical information (medical history, disease, and study-specific endpoints) and the software tools we propose. Currently, the repository contains more than 250 GB of data from eight clinical studies including healthy individuals and cardiac patients. These data are available for the development, implementation and validation of technologies related to body-surface ECGs. To conclude, the Telemetric and Holter ECG Warehouse (THEW) is an initiative developed to benefit the scientific community and to advance the field of quantitative electrocardiography and cardiac safety. PMID:21097349

  6. Coupling Effects of Heat and Moisture on the Saturation Processes of Buffer Material in a Deep Geological Repository

    NASA Astrophysics Data System (ADS)

    Huang, Wei-Hsing

    2017-04-01

    The clay barrier plays a major role in the isolation of radioactive wastes in an underground repository. This paper investigates the resaturation behavior of the clay barrier, with emphasis on the coupled effects of heat and moisture in the buffer material in the near field of a repository during groundwater intrusion. A locally available clay named "Zhisin clay" and a standard bentonite material were adopted in the laboratory program. Water uptake tests were conducted on clay specimens compacted at various densities to simulate the intrusion of groundwater into the buffer material. Soil suction of the clay specimens was measured by psychrometers embedded in the specimens and by the vapor equilibrium technique conducted at varying temperatures. Using the soil-water characteristic curve, an integration scheme was introduced to estimate the hydraulic conductivity of the unsaturated clay. The finite element program ABAQUS was then employed to carry out a numerical simulation of the saturation process in the near field of a repository. Results of the numerical simulation were validated using the degree-of-saturation profiles obtained from the water uptake tests on Zhisin clay. The numerical scheme was then extended to establish a model simulating the resaturation process after the closure of a repository. It was found that, due to the variation of suction and thermal conductivity of the clay barrier material with temperature, the calculated temperature field shows a reduction when the hydro-properties are incorporated in the calculations.
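
    The abstract does not give the integration scheme used to estimate unsaturated hydraulic conductivity from the soil-water characteristic curve; a common closed-form stand-in is the van Genuchten-Mualem model, sketched here with illustrative (not the study's) fitting parameters:

```python
import math

def vg_saturation(suction, alpha=0.01, n=1.6):
    """Effective saturation Se from the van Genuchten soil-water characteristic curve.

    suction in kPa; alpha (1/kPa) and n are illustrative fitting parameters.
    """
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * suction) ** n) ** (-m)

def mualem_k_rel(se, n=1.6):
    """Relative hydraulic conductivity from effective saturation (Mualem closed form)."""
    m = 1.0 - 1.0 / n
    return math.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

def k_unsat(suction, k_sat=1e-13, alpha=0.01, n=1.6):
    """Unsaturated hydraulic conductivity (m/s) at a given matric suction (kPa)."""
    se = vg_saturation(suction, alpha, n)
    return k_sat * mualem_k_rel(se, n)
```

    Conductivity drops sharply as suction rises, which is what makes buffer resaturation after closure a slow, strongly coupled process.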

  7. Donor human milk bank data collection in north america: an assessment of current status and future needs.

    PubMed

    Brownell, Elizabeth A; Lussier, Mary M; Herson, Victor C; Hagadorn, James I; Marinelli, Kathleen A

    2014-02-01

    The Human Milk Banking Association of North America (HMBANA) is a nonprofit association that standardizes and facilitates the establishment and operation of donor human milk (DHM) banks in North America. Each HMBANA milk bank in the network collects data on the DHM it receives and distributes, but a centralized data repository does not yet exist. In 2010, the Food and Drug Administration recognized the need to collect and disseminate systematic, standardized DHM bank data and suggested that HMBANA develop a DHM data repository. This study aimed to describe data currently collected by HMBANA DHM banks and evaluate feasibility and interest in participating in a centralized data repository. We conducted phone interviews with individuals in different HMBANA milk banks and summarized descriptive statistics. Eight of 13 (61.5%) sites consented to participate. All respondents collected donor demographics, and half (50%; n = 4) rescreened donors after 6 months of continued donation. The definition of preterm milk varied between DHM banks (≤ 32 to ≤ 40 weeks). The specific computer program used to house the data also differed. Half (50%; n = 4) indicated that they would consider participation in a centralized repository. Without standardized data across all HMBANA sites, the creation of a centralized data repository is not yet feasible. Lack of standardization and transparency may deter implementation of donor milk programs in the neonatal intensive care unit setting and hinder benchmarking, research, and quality improvement initiatives.

  8. NASA Life Sciences Data Repositories: Tools for Retrospective Analysis and Future Planning

    NASA Technical Reports Server (NTRS)

    Thomas, D.; Wear, M.; VanBaalen, M.; Lee, L.; Fitts, M.

    2011-01-01

    As NASA transitions from the Space Shuttle era into the next phase of space exploration, the need to ensure the capture, analysis, and application of its research and medical data is of greater urgency than at any previous time. In this era of limited resources and challenging schedules, the Human Research Program (HRP), based at NASA's Johnson Space Center (JSC), recognizes the need to extract the greatest possible amount of information from the data already captured, as well as to focus current and future research funding on the HRP goal to provide human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. To this end, the Science Management Office and the Medical Informatics and Health Care Systems Branch within the HRP and the Space Medicine Division have been working to make both research data and clinical data more accessible to the user community. The Life Sciences Data Archive (LSDA), the research repository housing data and information regarding the physiologic effects of microgravity, and the Lifetime Surveillance of Astronaut Health (LSAH-R), the clinical repository housing astronaut data, have joined forces to achieve this goal. The task of both repositories is to acquire, preserve, and distribute data and information both within the NASA community and to the science community at large. This is accomplished via the LSDA's public website (http://lsda.jsc.nasa.gov), which allows access to experiment descriptions including hardware, datasets, key personnel, and mission descriptions, and provides a mechanism for researchers to request additional data, research and clinical, that is not accessible from the public website. This will make the work of NASA and its partners available to the wider science community, both domestic and international. The desired outcome is the use of these data for knowledge discovery, retrospective analysis, and planning of future research studies.

  9. Worldwide nanotechnology development: a comparative study of USPTO, EPO, and JPO patents (1976-2004)

    NASA Astrophysics Data System (ADS)

    Li, Xin; Lin, Yiling; Chen, Hsinchun; Roco, Mihail C.

    2007-12-01

    To assess worldwide development of nanotechnology, this paper compares the numbers and contents of nanotechnology patents in the United States Patent and Trademark Office (USPTO), European Patent Office (EPO), and Japan Patent Office (JPO). It uses the patent databases as indicators of nanotechnology trends via bibliographic analysis, content map analysis, and citation network analysis on nanotechnology patents per country, institution, and technology field. The numbers of nanotechnology patents published in the USPTO and EPO have continued to increase quasi-exponentially since 1980, while those published in the JPO stabilized after 1993. Institutions and individuals located in the same region as a repository's patent office contribute disproportionately to the nanotechnology patents published in that repository (a "home advantage" effect). The USPTO and EPO databases had similar high-productivity contributing countries and technology fields with large numbers of patents, but quite different high-impact countries and technology fields as measured by the average number of received cites. Bibliographic analysis of USPTO and EPO patents shows that researchers in the United States and Japan published larger numbers of patents than other countries, and that their patents were more frequently cited by other patents. Nanotechnology patents covered physics research topics in all three repositories. In addition, the USPTO showed the broadest coverage in the biomedical and electronics areas. The analysis of citations by technology field indicates that the USPTO had a clear pattern of knowledge diffusion from highly cited fields to less cited fields, while in the EPO knowledge exchange occurred mainly among highly cited fields.

  10. Yucca Mountain Biological Resources Monitoring Program; Annual report, FY91

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1992-01-01

    The US Department of Energy (DOE) is required by the Nuclear Waste Policy Act of 1982 (as amended in 1987) to study and characterize Yucca Mountain as a possible site for a geologic repository for high-level nuclear waste. During site characterization, the DOE will conduct a variety of geotechnical, geochemical, geological, and hydrological studies to determine the suitability of Yucca Mountain as a repository. To ensure that site characterization activities (SCA) do not adversely affect the Yucca Mountain area, an environmental program has been implemented to monitor and mitigate potential impacts and to ensure that activities comply with applicable environmental regulations. This report describes the activities and accomplishments during fiscal year 1991 (FY91) for six program areas within the Terrestrial Ecosystem component of the Yucca Mountain Project (YMP) environmental program. The six program areas are Site Characterization Activities Effects, Desert Tortoises, Habitat Reclamation, Monitoring and Mitigation, Radiological Monitoring, and Biological Support.

  11. Coupling Legacy and Contemporary Deterministic Codes to Goldsim for Probabilistic Assessments of Potential Low-Level Waste Repository Sites

    NASA Astrophysics Data System (ADS)

    Mattie, P. D.; Knowlton, R. G.; Arnold, B. W.; Tien, N.; Kuo, M.

    2006-12-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years of experience in radioactive waste disposal and is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. International technology transfer efforts are often hampered by small budgets, time schedule constraints, and a lack of experienced personnel in countries with small radioactive waste disposal programs. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available codes and existing legacy codes for probabilistic safety assessment modeling, which facilitates technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission, as well as codes developed and maintained by the United States Department of Energy, are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software that meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. External model linkage capabilities in GoldSim, and the techniques applied to facilitate this process, are presented using example applications, including Breach, Leach, and Transport-Multiple Species (BLT-MS), a U.S. NRC-sponsored code simulating release and transport of contaminants from a subsurface low-level waste disposal facility, used in a cooperative technology transfer project between Sandia National Laboratories and Taiwan's Institute of Nuclear Energy Research (INER) for the preliminary assessment of several candidate low-level waste repository sites. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
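
    The external-linkage pattern described, a probabilistic driver wrapping a deterministic legacy code, can be illustrated generically. The parameter names and deck format below are hypothetical; in practice the `runner` callable would write the input deck, invoke the legacy executable (for example via `subprocess.run`), and parse its output file:

```python
import random

def write_deck(params):
    """Render a sampled parameter set as a keyword input deck (format is illustrative)."""
    return "".join(f"{key} = {value}\n" for key, value in sorted(params.items()))

def probabilistic_driver(runner, n, seed=0):
    """Monte Carlo loop around a deterministic code.

    `runner` is any callable taking a parameter dict and returning a scalar
    result; in production it would write the deck, call the legacy executable,
    and parse the output, so the legacy code itself never needs modification.
    """
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        params = {"kd_sorption": rng.lognormvariate(0.0, 1.0),  # illustrative names
                  "infiltration": rng.uniform(0.001, 0.1)}
        results.append(runner(params))
    return results
```

    Keeping the sampling loop separate from the deterministic run is what lets an unmodified legacy code serve as the engine of a probabilistic assessment.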

  12. Multi-institutional tumor banking: lessons learned from a pancreatic cancer biospecimen repository.

    PubMed

    Demeure, Michael J; Sielaff, Timothy; Koep, Larry; Prinz, Richard; Moser, A James; Zeh, Herb; Hostetter, Galen; Black, Jodi; Decker, Ardis; Rosewell, Sandra; Bussey, Kimberly J; Von Hoff, Daniel

    2010-10-01

    Clinically annotated pancreatic cancer samples are needed for progress to be made toward developing more effective treatments for this deadly cancer. As part of a National Cancer Institute-funded program project, we established a biospecimen core to support the research efforts. This article summarizes the key hurdles encountered and solutions we found in the process of developing a successful multi-institution biospecimen repository.

  13. Positioning a Paediatric Compounded Non-Sterile Product Electronic Repository (pCNPeRx) within the Health Information Technology Infrastructure

    PubMed Central

    Parrish, Richard H.

    2015-01-01

    Numerous gaps in the current medication use system impede complete transmission of electronically identifiable and standardized extemporaneous formulations as well as a uniform approach to medication therapy management (MTM) for paediatric patients. The Pharmacy Health Information Technology Collaborative (Pharmacy HIT) identified six components that may have direct importance for pharmacy related to medication use in children. This paper will discuss key positions within the health information technology (HIT) infrastructure where an electronic repository for the medication management of paediatric patients’ compounded non-sterile products (pCNP) and care provision could be housed optimally to facilitate and maintain transmission of e-prescriptions (eRx) from initiation to fulfillment. Further, the paper will propose key placement requirements to provide for maximal interoperability of electronic medication management systems to minimize disruptions across the continuum of care. PMID:28970375

  14. Scanning and georeferencing historical USGS quadrangles

    USGS Publications Warehouse

    Fishburn, Kristin A.; Davis, Larry R.; Allord, Gregory J.

    2017-06-23

    The U.S. Geological Survey (USGS) National Geospatial Program is scanning published USGS 1:250,000-scale and larger topographic maps printed between 1884, the inception of the topographic mapping program, and 2006. The goal of this project, which began publishing the Historical Topographic Map Collection in 2011, is to provide access to a digital repository of USGS topographic maps that is available to the public at no cost. For more than 125 years, USGS topographic maps have accurately portrayed the complex geography of the Nation. The USGS is the Nation’s largest producer of traditional topographic maps, and, prior to 2006, USGS topographic maps were created using traditional cartographic methods and printed using a lithographic process. The next generation of topographic maps, US Topo, is being released by the USGS in digital form, and newer technologies make it possible to also deliver historical maps in the same electronic format that is more publicly accessible.

  15. Electronic theses and dissertations: a review of this valuable resource for nurse scholars worldwide.

    PubMed

    Goodfellow, L M

    2009-06-01

    A worldwide repository of electronic theses and dissertations (ETDs) could provide worldwide access to the most up-to-date research generated by master's and doctoral students. Until that international repository is established, it is possible to access some of these valuable knowledge resources. ETDs provide a technologically advanced medium with endless multimedia capabilities that far exceed the print and bound copies of theses and dissertations housed traditionally in individual university libraries. CURRENT USE: A growing trend exists for universities worldwide to require graduate students to submit theses or dissertations as electronic documents. However, nurse scholars underutilize ETDs, as evidenced by perusing bibliographic citation lists in many of the research journals. ETDs can be searched for and retrieved through several digital resources such as the Networked Digital Library of Theses and Dissertations (http://www.ndltd.org), ProQuest Dissertations and Theses (http://www.umi.com), the Australasian Digital Theses Program (http://adt.caul.edu.au/), and individual university web sites and online catalogues. An international repository of ETDs benefits the community of nurse scholars in many ways. The ability to access recent graduate students' research electronically from anywhere in the world is advantageous. For scholars residing in developing countries, access to these ETDs may prove to be even more valuable. In some cases, ETDs are not available for worldwide access and can only be accessed through the university library from which the student graduated. Public access to university library ETD collections is not always permitted. Nurse scholars from both developing and developed countries could benefit from ETDs.

  16. Development of the performance confirmation program at YUCCA mountain, nevada

    USGS Publications Warehouse

    LeCain, G.D.; Barr, D.; Weaver, D.; Snell, R.; Goodin, S.W.; Hansen, F.D.

    2006-01-01

    The Yucca Mountain Performance Confirmation program consists of tests, monitoring activities, experiments, and analyses to evaluate the adequacy of assumptions, data, and analyses that form the basis of the conceptual and numerical models of flow and transport associated with a proposed radioactive waste repository at Yucca Mountain, Nevada. The Performance Confirmation program uses an eight-stage risk-informed, performance-based approach. Selection of the Performance Confirmation activities for inclusion in the Performance Confirmation program was done using a risk-informed, performance-based decision analysis. The result of this analysis was a Performance Confirmation base portfolio that consists of 20 activities. The 20 Performance Confirmation activities include geologic, hydrologic, and construction/engineering testing. Some of the activities began during site characterization, and others will begin during construction or post-emplacement and continue until repository closure.

  17. 10 CFR 63.144 - Quality assurance program change.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... assurance program information that duplicates language in quality assurance regulatory guides and quality... 10 Energy 2 2013-01-01 2013-01-01 false Quality assurance program change. 63.144 Section 63.144... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Quality Assurance § 63.144 Quality assurance program change. Changes to...

  18. 10 CFR 63.144 - Quality assurance program change.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... assurance program information that duplicates language in quality assurance regulatory guides and quality... 10 Energy 2 2014-01-01 2014-01-01 false Quality assurance program change. 63.144 Section 63.144... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Quality Assurance § 63.144 Quality assurance program change. Changes to...

  19. 10 CFR 63.144 - Quality assurance program change.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... assurance program information that duplicates language in quality assurance regulatory guides and quality... 10 Energy 2 2012-01-01 2012-01-01 false Quality assurance program change. 63.144 Section 63.144... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Quality Assurance § 63.144 Quality assurance program change. Changes to...

  20. Thermal Analysis of a Nuclear Waste Repository in Argillite Host Rock

    NASA Astrophysics Data System (ADS)

    Hadgu, T.; Gomez, S. P.; Matteo, E. N.

    2017-12-01

    Disposal of high-level nuclear waste in a geological repository requires analysis of heat distribution as a result of decay heat. Such an analysis supports the design of the repository layout, defining the repository footprint, and provides information of importance to the overall design. The analysis is also used in the study of potential migration of radionuclides to the accessible environment. In this study, thermal analysis for high-level waste and spent nuclear fuel in a generic repository in argillite host rock is presented. The thermal analysis utilized both semi-analytical and numerical modeling in the near field of a repository. The semi-analytical method treats heat transport by conduction in the repository and surroundings; its results are temperature histories at selected radial distances from the waste package. A 3-D thermal-hydrologic numerical model was also developed to study fluid and heat distribution in the near field. The thermal analysis assumed a generic geological repository at 500 m depth. For the semi-analytical method, a backfilled closed repository was assumed with basic design and material properties. For the thermal-hydrologic numerical method, a repository layout with disposal in horizontal boreholes was assumed. The 3-D modeling domain covers a limited portion of the repository footprint to enable a detailed thermal analysis. A highly refined unstructured mesh was used with increased discretization near heat sources and at intersections of different materials. All simulations considered different parameter values for properties of components of the engineered barrier system (i.e., buffer, disturbed rock zone, and host rock), and different surface storage times. Results of the different modeling cases are presented and include temperature and fluid flow profiles in the near field at different simulation times.
Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND2017-8295 A.
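
    A minimal version of the semi-analytical approach is the Green's-function solution for a point heat source with exponentially decaying power in an infinite conducting medium; the property values below are generic placeholders, not the study's, and a real repository calculation would superpose many such sources over the emplacement layout:

```python
import math

def point_source_temp_rise(r, t, q0=1000.0, decay=7.3e-10, alpha=1e-6,
                           rho_c=2.2e6, nsteps=2000):
    """Temperature rise (K) at radius r (m) and time t (s) from a decaying point source.

    Superposes the instantaneous point-source Green's function over the power
    history Q(t') = q0 * exp(-decay * t') in an infinite conducting medium.
    q0 in W, alpha (thermal diffusivity) in m^2/s, rho_c (volumetric heat
    capacity) in J/(m^3 K); all values are illustrative.
    """
    if t <= 0.0:
        return 0.0
    dt = t / nsteps
    total = 0.0
    for i in range(nsteps):
        t_src = (i + 0.5) * dt        # midpoint rule avoids the tau = 0 endpoint
        tau = t - t_src
        green = (math.exp(-r * r / (4.0 * alpha * tau))
                 / (rho_c * (4.0 * math.pi * alpha * tau) ** 1.5))
        total += q0 * math.exp(-decay * t_src) * green * dt
    return total
```

    Evaluating this at a set of radial distances over time yields the temperature histories the abstract describes for the semi-analytical case.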

  1. LTPP InfoPave

    DOT National Transportation Integrated Search

    2015-12-29

    The LTPP program was initiated in 1987 to satisfy a wide range of pavement information needs. Over the years, the program has accumulated a vast repository of research quality data, extensive documentation, and related tools, which compose LTPPs c...

  2. Getting to Know You: Discovering User Behaviors and Their Implications for Service Design

    ERIC Educational Resources Information Center

    Daigle, Ben

    2013-01-01

    Public services librarians are often in the position of training patrons how to use technology. They adopt new technologies such as discovery layers, link resolvers, subject guides, virtual reference services, OPACs, content management systems, and institutional repositories to provide access to materials and facilitate collaboration, but…

  3. ADVANCED NUCLEAR FUEL CYCLE EFFECTS ON THE TREATMENT OF UNCERTAINTY IN THE LONG-TERM ASSESSMENT OF GEOLOGIC DISPOSAL SYSTEMS - EBS INPUT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutton, M; Blink, J A; Greenberg, H R

    2012-04-25

    The Used Fuel Disposition (UFD) Campaign within the Department of Energy's Office of Nuclear Energy (DOE-NE) Fuel Cycle Technology (FCT) program has been tasked with investigating the disposal of the nation's spent nuclear fuel (SNF) and high-level nuclear waste (HLW) for a range of potential waste forms and geologic environments. The planning, construction, and operation of a nuclear disposal facility is a long-term process that involves engineered barriers tailored to both the geologic environment and the waste forms being emplaced. The UFD Campaign is considering a range of fuel cycles that in turn produce a range of waste forms. The UFD Campaign is also considering a range of geologic media. These ranges could be thought of as adding uncertainty to what the disposal facility design will ultimately be; however, it may be preferable to think of the ranges as adding flexibility to the design of a disposal facility. For example, as the overall DOE-NE program and industrial actions result in the fuel cycles that will produce the waste to be disposed, and the characteristics of those wastes become clear, the disposal program retains flexibility in both the choice of geologic environment and the specific repository design. Of course, other factors also play a major role, including local and state-level acceptance of the specific site that provides the geologic environment. In contrast, the Yucca Mountain Project (YMP) repository license application (LA) is based on waste forms from an open fuel cycle (PWR and BWR assemblies). These waste forms were about 90% of the total waste, and they were the determining waste form in developing the engineered barrier system (EBS) design for the Yucca Mountain repository.
About 10% of the repository capacity was reserved for waste from a full recycle fuel cycle in which some actinides were extracted for weapons use, and the remaining fission products and some minor actinides were encapsulated in borosilicate glass. Because the heat load of the glass was much less than that of the PWR and BWR assemblies, the glass waste form could be co-disposed with the open-cycle waste by interspersing glass waste packages among the spent fuel assembly waste packages. In addition, the Yucca Mountain repository was designed to include some research reactor spent fuel and naval reactor spent fuel, within the envelope that was set using the commercial reactor assemblies as the design-basis waste form. This milestone report supports Sandia National Laboratory milestone M2FT-12SN0814052, and is intended to be a chapter in that milestone report. The independent technical review of this LLNL milestone was performed at LLNL and is documented in the electronic Information Management (IM) system at LLNL. The objective of this work is to investigate what aspects of quantifying, characterizing, and representing the uncertainty associated with the engineered barrier are affected by implementing different advanced nuclear fuel cycles (e.g., partitioning and transmutation scenarios) together with corresponding designs and thermal constraints.

  4. Defining the Relationship Between Human Error Classes and Technology Intervention Strategies

    NASA Technical Reports Server (NTRS)

    Wiegmann, Douglas A.; Rantanen, Esa; Crisp, Vicki K. (Technical Monitor)

    2002-01-01

    One of the main factors in all aviation accidents is human error. The NASA Aviation Safety Program (AvSP), therefore, has identified several human-factors safety technologies to address this issue. Some technologies directly address human error either by attempting to reduce the occurrence of errors or by mitigating the negative consequences of errors. However, new technologies and system changes may also introduce new error opportunities or even induce different types of errors. Consequently, a thorough understanding of the relationship between error classes and technology "fixes" is crucial for the evaluation of intervention strategies outlined in the AvSP, so that resources can be effectively directed to maximize the benefit to flight safety. The purpose of the present project, therefore, was to examine the repositories of human factors data to identify the possible relationships between different error classes and technology intervention strategies. The first phase of the project, which is summarized here, involved the development of prototype data structures or matrices that map errors onto "fixes" (and vice versa), with the hope of facilitating the development of standards for evaluating safety products. Possible follow-on phases of this project are also discussed. These additional efforts include a thorough and detailed review of the literature to fill in the data matrix and the construction of a complete database and standards checklists.
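
    The error-to-fix matrix the abstract describes can be represented as a simple two-way mapping; the class and technology names below are invented placeholders, since the actual AvSP matrix entries are not given in the abstract:

```python
from collections import defaultdict

# Placeholder entries only; the real error classes and intervention technologies
# would come from the human factors data repositories the project examined.
ERROR_TO_FIX = {
    "decision_error": {"improved_weather_display", "crew_training_aids"},
    "skill_based_error": {"flight_deck_automation", "crew_training_aids"},
    "perceptual_error": {"synthetic_vision", "improved_weather_display"},
}

def invert(matrix):
    """Build the fix -> error-classes view from the error -> fixes view."""
    inverse = defaultdict(set)
    for error, fixes in matrix.items():
        for fix in fixes:
            inverse[fix].add(error)
    return dict(inverse)
```

    Maintaining both views supports the "errors onto fixes (and vice versa)" lookups the prototype data structures were meant to provide.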

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perry, Frank Vinton; Kelley, Richard E.

    The DOE Spent Fuel and Waste Technology (SWFT) R&D Campaign is supporting research on crystalline rock, shale (argillite), and salt as potential host rocks for disposal of HLW and SNF in a mined geologic repository. The distribution of these three potential repository host rocks is limited to specific regions of the US and to different geologic and hydrologic environments (Perry et al., 2014), many of which may be technically suitable as a site for mined geologic disposal. This report documents a regional geologic evaluation of the Pierre Shale as an example of evaluating a potentially suitable shale for siting a geologic HLW repository. This report follows a similar report completed in 2016 on a regional evaluation of crystalline rock that focused on the Superior Province of the north-central US (Perry et al., 2016).

  6. Data mining in newt-omics, the repository for omics data from the newt.

    PubMed

    Looso, Mario; Braun, Thomas

    2015-01-01

    Salamanders are an excellent model organism for studying regenerative processes due to their unique ability to regenerate lost appendages or organs. Straightforward bioinformatics tools to analyze and take advantage of the growing number of "omics" studies performed in salamanders were lacking so far. To overcome this limitation, we have generated a comprehensive data repository for the red-spotted newt Notophthalmus viridescens, named newt-omics, merging omics-style datasets at the transcriptome and proteome level, including expression values and annotations. The resource is freely available via a user-friendly Web-based graphical user interface (http://newt-omics.mpi-bn.mpg.de) that allows access and queries to the database without prior bioinformatical expertise. The repository is updated regularly, incorporating newly published datasets from omics technologies.

  7. A Repository of Codes of Ethics and Technical Standards in Health Informatics

    PubMed Central

    Zaïane, Osmar R.

    2014-01-01

    We present a searchable repository of codes of ethics and standards in health informatics, built using state-of-the-art search algorithms and technologies. The repository will be potentially beneficial for public health practitioners, researchers, and software developers in finding and comparing ethics topics of interest. Public health clinics, clinicians, and researchers can use the repository platform as a one-stop reference for various ethics codes and standards. In addition, the repository interface is built for easy navigation, fast search, and side-by-side comparative reading of documents. Our selection criteria for codes and standards are twofold: first, to respect intellectual property rights, we index only codes and standards freely available on the internet; second, major international, regional, and national health informatics bodies across the globe are surveyed with the aim of understanding the landscape in this domain. We also look at prevalent technical standards in health informatics from major bodies such as the International Organization for Standardization (ISO) and the U.S. Food and Drug Administration (FDA). Our repository contains codes of ethics from the International Medical Informatics Association (IMIA), the iHealth Coalition (iHC), the American Health Information Management Association (AHIMA), the Australasian College of Health Informatics (ACHI), the British Computer Society (BCS), and the UK Council for Health Informatics Professions (UKCHIP), with room for adding more in the future. Our major contribution is enhancing the findability of codes and standards related to health informatics ethics through their compilation and unified access in the health informatics ethics repository. PMID:25422725

  8. A perspective on the proliferation risks of plutonium mines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyman, E.S.

    1996-05-01

    The program of geologic disposal of spent fuel and other plutonium-containing materials is increasingly becoming the target of criticism by individuals who argue that, in the future, repositories may become low-cost sources of fissile material for nuclear weapons. This paper attempts to outline a consistent framework for analyzing the proliferation risks of these so-called "plutonium mines" and putting them into perspective. First, it is emphasized that the attractiveness of plutonium in a repository as a source of weapons material depends on its accessibility relative to other sources of fissile material. Then, the notion of a "material production standard" (MPS) is proposed: namely, that the proliferation risks posed by geologic disposal will be acceptable if one can demonstrate, under a number of reasonable scenarios, that the recovery of plutonium from a repository is likely to be as difficult as new production of fissile material. A preliminary analysis suggests that the range of circumstances under which current mined repository concepts would fail to meet this standard is fairly narrow. Nevertheless, a broad application of the MPS may impose severe restrictions on repository design. In this context, the relationship of repository design parameters to ease of recovery is discussed.

  9. 10 CFR 60.161 - Training and certification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    10 CFR 60.161, Energy: NUCLEAR REGULATORY COMMISSION (CONTINUED), DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES, Training and Certification of Personnel. § 60.161 Training and certification program. DOE shall...

  10. SeaView: bringing EarthCube to the Oceanographer

    NASA Astrophysics Data System (ADS)

    Stocks, K. I.; Diggs, S. C.; Arko, R. A.; Kinkade, D.; Shepherd, A.

    2016-12-01

    As new instrument types are developed and new observational programs begin, supporting a growing community of "dry" oceanographers, the ability to find, access, and visualize existing data of interest becomes increasingly critical. Yet ocean data, when available, are held in multiple data facilities, in different formats, and accessible through different pathways. This creates practical problems for integrating and working across different data sets. The SeaView project is building connections between the rich data resources in five major oceanographic data facilities (BCO-DMO, CCHDO, OBIS, OOI, and R2R*), creating a federated set of thematic data collections that are organized around common characteristics (geographic location, time, expedition, program, data type, etc.) and published online in Web Accessible Folders using standard file formats such as ODV and NetCDF. The work includes not simply reformatting data, but identifying and, where possible, addressing interoperability challenges: common identifiers for core concepts that can connect data across repositories; terms a scientist may want to search that, if added to the data repositories, will increase discoverability; the presence of duplicate data across repositories; and so on. We will present the data collections available to date, including data from the OOI Pioneer Array region, and seek scientists' input on the data types and formats they prefer, the tools they use to analyze and visualize data, and their specific recommendations for future data collections to support oceanographic science. * Biological and Chemical Oceanography Data Management Office (BCO-DMO), CLIVAR and Carbon Hydrographic Data Office (CCHDO), International Ocean Biogeographic Information System (iOBIS), Ocean Observatories Initiative (OOI), and Rolling Deck to Repository (R2R) Program.
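    The federation idea described in this record (connecting records across repositories through a common identifier, and flagging duplicates along the way) can be sketched in a few lines. This is an illustrative sketch only, not SeaView code; the repository names, field names, and `expedition_id` key below are invented for illustration.

    ```python
    # Hypothetical sketch: merge records from two repositories that share a
    # common expedition identifier, combining fields when the same expedition
    # appears in both (the "duplicate data across repositories" case).

    def federate(repo_a, repo_b, key="expedition_id"):
        """Return a dict mapping each shared identifier to its merged record."""
        merged = {}
        for record in repo_a + repo_b:
            k = record[key]
            if k in merged:
                merged[k].update(record)  # same expedition seen twice: combine fields
            else:
                merged[k] = dict(record)
        return merged

    # Invented example records from two hypothetical facilities:
    cchdo = [{"expedition_id": "EXP-01", "ctd_profiles": 42}]
    r2r   = [{"expedition_id": "EXP-01", "cruise_track": "pioneer_array.nc"}]

    collections = federate(cchdo, r2r)
    # Both repositories' fields now appear under the shared identifier.
    print(collections["EXP-01"])
    ```

    The key design point the abstract raises is that such a merge only works if the repositories agree on the identifier; that agreement is the interoperability work, not the merge itself.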

  11. Industrial Program of Waste Management - Cigeo Project - 13033

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butez, Marc; Bartagnon, Olivier; Gagner, Laurent

    2013-07-01

    The French Planning Act of 28 June 2006 prescribed that a reversible repository in a deep geological formation be chosen as the reference solution for the long-term management of high-level and intermediate-level long-lived radioactive waste. It also entrusted the responsibility for further studies and design of the repository (named Cigeo) to the French Radioactive Waste Management Agency (Andra), in order for the review of the creation-license application to start in 2015 and, subject to its approval, for the commissioning of the repository to take place in 2025. Andra is responsible for siting, designing, implementing, and operating the future geological repository, including operational and long-term safety and waste acceptance. The nuclear operators (Electricite de France (EDF), AREVA NC, and the French Commission in charge of Atomic Energy and Alternative Energies (CEA)) are technically and financially responsible for the waste they generate, with no limit in time. They provide Andra, on the one hand, with waste-package input data and, on the other, with their long-term industrial experience of high- and intermediate-level long-lived radwaste management and nuclear operation. Andra, EDF, AREVA, and CEA established a cooperation agreement to strengthen their collaboration in these fields. Within this agreement, Andra and the nuclear operators have defined an industrial program for waste management. This program includes the waste inventory to be taken into account for the design of the Cigeo project and the structural hypotheses underlying its phased development. It schedules the delivery of the different categories of waste and defines the associated flows. (authors)

  12. A Software Architecture for Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Norvig, Peter (Technical Monitor)

    2001-01-01

    NASA's Intelligent Synthesis Environment (ISE) program is a grand attempt to develop a system that transforms the way complex artifacts are engineered. This paper discusses a "middleware" architecture for enabling the development of ISE. Desirable elements of such an Intelligent Synthesis Architecture (ISA) include remote invocation; plug-and-play applications; scripting of applications; management of design artifacts, tools, and artifact and tool attributes; common system services; system management; and systematic enforcement of policies. This paper argues that the ISA should extend conventional distributed object technology (DOT), such as CORBA and Product Data Managers, with flexible repositories of product and tool annotations and "plug-and-play" mechanisms for inserting "ility" or orthogonal concerns into the system. I describe the Object Infrastructure Framework, an Aspect-Oriented Programming (AOP) environment for developing distributed systems that provides "ility" insertion and enables consistent annotation maintenance. This technology can be used to enforce policies such as maintaining the annotations of artifacts, particularly their provenance and access-control rules; performing automatic datatype transformations between representations; supplying alternative servers of the same service; reporting on the status of jobs and the system; conveying privileges throughout an application; supporting long-lived transactions; maintaining version consistency; and providing software redundancy and mobility.
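    The core AOP idea this abstract describes (inserting an orthogonal concern such as provenance tracking around service calls, without touching the services themselves) can be illustrated with a plain decorator. This is a minimal sketch of the general technique, not the Object Infrastructure Framework itself; the function and variable names are invented.

    ```python
    # Minimal aspect-oriented sketch: a wrapper injects a cross-cutting
    # "ility" (here, provenance recording) around any service function,
    # while the business logic stays unaware of the concern.

    import functools

    PROVENANCE = []  # stand-in for an annotation repository

    def with_provenance(func):
        """Aspect: record each service invocation, orthogonally to the service."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = func(*args, **kwargs)
            PROVENANCE.append({"service": func.__name__, "args": args})
            return result
        return wrapper

    @with_provenance
    def transform_datatype(value):
        # ordinary business logic, unaware of the provenance concern
        return float(value)

    transform_datatype("3.5")
    print(PROVENANCE[0]["service"])  # the aspect recorded the call
    ```

    The same wrapper could just as easily inject access-control checks or datatype transformations, which is the point of treating these concerns as pluggable rather than hand-coded into every service.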

  13. Technology and Health Information Technology in Colorectal Surgery: Electronic Literature Support

    PubMed Central

    Magruder, J. Trent; Efron, Jonathan E.

    2013-01-01

    The advent of the Internet has revolutionized the management of reporting and accessing research and data. The authors review the current resources available to surgeons through websites, accumulated published data repositories, and libraries. The change in how we publish and present peer-reviewed data over the last 20 years is also discussed, as is the future of health information technology. PMID:24436645

  14. Preliminary Concept of Operations for the Spent Fuel Management System--WM2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cumberland, Riley M; Adeniyi, Abiodun Idowu; Howard, Rob L

    The Nuclear Fuels Storage and Transportation Planning Project (NFST) within the U.S. Department of Energy's Office of Nuclear Energy is tasked with identifying, planning, and conducting activities to lay the groundwork for developing interim storage and transportation capabilities in support of an integrated waste management system. The system will provide interim storage for commercial spent nuclear fuel (SNF) from reactor sites and deliver it to a repository. It will also include multiple subsystems, potentially including one or more interim storage facilities (ISFs), one or more repositories, facilities to package and/or repackage SNF, and transportation systems. The project team is analyzing options for an integrated waste management system and, to support that analysis, has developed a Concept of Operations document that describes both the potential integrated system and the inter-dependencies between system components. The goal of this work is to aid systems analysts in developing consistent models across the project, which involves multiple investigators. At a high level, SNF is expected to travel from reactors to a repository. SNF is first unloaded from reactors and placed in spent fuel pools for wet storage at utility sites. After the SNF has cooled enough to satisfy loading limits, it is placed in a container at the reactor site for storage and/or transportation. Once transportation requirements are met, the SNF is transported to an ISF, where it is stored until a repository is developed, or directly to a repository if one is available. While the high-level operation of the system is straightforward, analysts must evaluate numerous alternative options, including the number of ISFs (if any), ISF design, the stage at which SNF repackaging occurs (if any), repackaging technology, the types of containers used, repository design, component sizing, and the timing of events. These alternatives arise from technological, economic, and policy considerations. As new developments regularly emerge, the Concept of Operations document will be updated periodically. This paper gives a conceptual overview of the different potential alternatives identified in the document.
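    The high-level SNF pathway described in this record (reactor, wet storage, containerization, then either an interim storage facility or a repository directly) can be written down as a simple ordered sequence. This is an illustrative sketch only, not project code; the stage names are invented labels for the steps the abstract lists.

    ```python
    # Hypothetical sketch: the SNF pathway as an ordered list of stages,
    # with the interim storage facility (ISF) stage optional, reflecting
    # the "number of ISFs (if any)" alternative the abstract describes.

    def snf_pathway(via_isf=True):
        """Return the ordered stages a fuel assembly passes through."""
        stages = ["reactor", "spent_fuel_pool", "storage_container"]
        if via_isf:
            stages.append("interim_storage_facility")
        stages.append("repository")
        return stages

    print(snf_pathway(via_isf=True))   # pathway through an ISF
    print(snf_pathway(via_isf=False))  # direct-to-repository pathway
    ```

    Enumerating the pathway this way makes the alternatives explicit: each system-analysis option (repackaging stage, container type, ISF count) is a variation on which stages appear and in what order.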

  15. A Proposal to Enhance the Use of Learning Platforms in Higher Education

    ERIC Educational Resources Information Center

    Marques, Bertil P.; Villate, Jaime E.; Vaz de Carvalho, Carlos

    2015-01-01

    The results of several studies conducted to analyze the quantitative and qualitative use of learning technologies in Higher Education in Portugal showed that, in general, these technologies are not used systematically and effectively and e-learning platforms tend to be relegated to repositories of contents rather than as full-fledged tools…

  16. Supporting Student Research with Semantic Technologies and Digital Archives

    ERIC Educational Resources Information Center

    Martinez-Garcia, Agustina; Corti, Louise

    2012-01-01

    This article discusses how the idea of higher education students as producers of knowledge rather than consumers can be operationalised by means of student research projects, in which processes of research archiving and analysis are enabled through the use of semantic technologies. It discusses how existing digital repository frameworks can be…

  17. LTPP InfoPave Release 2017: What's New

    DOT National Transportation Integrated Search

    2017-01-01

    The LTPP program was initiated in 1987 to satisfy a wide range of pavement information needs. Over the years, the program has accumulated a vast repository of research quality data, extensive documentation, and related tools, which compose LTPPs c...

  18. International Approaches for Nuclear Waste Disposal in Geological Formations: Report on Fifth Worldwide Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faybishenko, Boris; Birkholzer, Jens; Persoff, Peter

    2016-08-01

    An important issue for present and future generations is the final disposal of spent nuclear fuel. Over the past forty years, the development of technologies to isolate spent nuclear fuel (SNF), other high-level nuclear waste (HLW) generated at nuclear power plants and from production of defense materials, and low- and intermediate-level nuclear waste (LILW) in underground rock and sediments has proven a challenging undertaking. Finding an appropriate solution for the disposal of nuclear waste is important for the protection of the environment and public health, and it is a prerequisite for the future of nuclear power. The purpose of a deep geological repository for nuclear waste is to protect future generations against any harmful release of radioactive material, even after the memory of the repository may have been lost, and regardless of the technical knowledge of future generations. The results of a wide variety of investigations on the development of technology for radioactive waste isolation from 19 countries were published in the First Worldwide Review in 1991 (Witherspoon, 1991). Results from 26 countries were published in the Second Worldwide Review in 1996 (Witherspoon, 1996), and results from 32 countries were summarized in the Third Worldwide Review in 2001 (Witherspoon and Bodvarsson, 2001). The last compilation assembled results from 24 countries in the Fourth Worldwide Review (WWR) on radioactive waste isolation (Witherspoon and Bodvarsson, 2006). Since publication of that report in 2006, radioactive waste disposal approaches have continued to evolve, and there have been major developments in a number of national geological disposal programs. Significant experience has been gained both in preparing and in reviewing cases for the operational and long-term safety of proposed and operating repositories. Disposal of radioactive waste is a complex issue, not only because of the nature of the waste, but also because of the detailed regulatory structure for dealing with radioactive waste, the variety of stakeholders involved, and (in some cases) the number of regulatory entities involved.

  19. Yucca Mountain biological resources monitoring program; Annual report FY92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1993-02-01

    The US Department of Energy (DOE) is required by the Nuclear Waste Policy Act of 1982 (as amended in 1987) to study and characterize Yucca Mountain as a potential site for a geologic repository for high-level nuclear waste. During site characterization, the DOE will conduct a variety of geotechnical, geochemical, geological, and hydrological studies to determine the suitability of Yucca Mountain as a potential repository. To ensure that site characterization activities (SCA) do not adversely affect the environment at Yucca Mountain, an environmental program has been implemented to monitor and mitigate potential impacts and to ensure that activities comply with applicable environmental regulations. This report describes the activities and accomplishments of EG&G Energy Measurements, Inc. (EG&G/EM) during fiscal year 1992 (FY92) for six program areas within the Terrestrial Ecosystem component of the YMP environmental program: Site Characterization Effects, Desert Tortoises, Habitat Reclamation, Monitoring and Mitigation, Radiological Monitoring, and Biological Support.

  20. In situ clay formation : evaluation of a proposed new technology for stable containment barriers.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagy, Kathryn L.; DiGiovanni, Anthony Albert; Fredrich, Joanne T.

    2004-03-01

    Containment of chemical wastes in near-surface and repository environments is accomplished by designing engineered barriers to fluid flow. Containment barrier technologies such as clay liners, soil/bentonite slurry walls, soil/plastic walls, artificially grouted sediments and soils, and colloidal gelling materials are intended to stop fluid transport and prevent plume migration. However, despite their short-term effectiveness, all of these barriers exhibit geochemical or geomechanical instability over the long term, resulting in degradation of the barrier and of its ability to contain waste. No practical or economically affordable technologies or methods exist at present for accomplishing total in situ remediation, contaminant removal, or destruction/degradation. A new type of containment barrier with a potentially broad range of environmental stability and longevity could therefore result in significant cost savings. This report documents a research program designed to establish the viability of a proposed new type of containment barrier derived from in situ precipitation of clays in the pore space of contaminated soils or sediments. The concept builds upon existing technologies for colloidal or gel stabilization. Clays have the advantages of being geologically compatible with the near-surface environment and naturally sorptive for a range of contaminants; further, the precipitation of clays could reduce permeability and hydraulic conductivity and increase mechanical stability through cementation of soil particles. While limited success was achieved under certain controlled laboratory conditions, the results did not warrant continuation to the field stage for multiple reasons, and the research program was concluded with Phase 2.

  1. Status of Progress Made Toward Safety Analysis and Technical Site Evaluations for DOE Managed HLW and SNF.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevougian, S. David; Stein, Emily; Gross, Michael B

    The Spent Fuel and Waste Science and Technology (SFWST) Campaign of the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) is conducting research and development (R&D) on generic deep geologic disposal systems (i.e., repositories). This report describes specific activities in FY 2016 associated with the development of a Defense Waste Repository (DWR) for the permanent disposal of a portion of the high-level waste (HLW) and spent nuclear fuel (SNF) derived from national defense and R&D activities of the DOE.

  2. Testing of candidate waste-package backfill and canister materials for basalt

    NASA Astrophysics Data System (ADS)

    Wood, M. I.; Anderson, W. J.; Aden, G. D.

    1982-09-01

    The Basalt Waste Isolation Project (BWIP) is developing a multiple-barrier waste package to contain high-level nuclear waste as part of an overall system (e.g., waste package, repository sealing system, and host rock) designed to isolate the waste in a repository located in basalt beneath the Hanford Site, Richland, Washington. The three basic components of the waste package are the waste form, the canister, and the backfill. An extensive testing program is under way to determine the chemical, physical, and mechanical properties of potential canister and backfill materials. The data derived from this testing program will be used to recommend those materials that most adequately perform the functions assigned to the canister and backfill.

  3. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged: different science communities uniquely apply a selection of tools to the data to produce scientific results, and infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of a particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic-center process; this poster will summarize the results and indicate a direction for future infusion attempts.

  4. NATIONAL GEOSCIENCE DATA REPOSITORY SYSTEM PHASE III: IMPLEMENTATION AND OPERATION OF THE REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcus Milling

    2003-04-01

    The NGDRS has facilitated the transfer of 85% of the cores, cuttings, and other data identified as available for transfer to the public sector. Over 12 million linear feet of cores and cuttings, in addition to large numbers of paleontological samples, are now available for public use. To date, with industry contributions for program operations and data transfers, the NGDRS project has realized a 6.5-to-1 return on investment of Department of Energy funds. Large-scale transfers of seismic data have been evaluated, but on the recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of the seismic data problem relative to the available funding. Rapidly changing industry conditions have required that the primary core and cuttings preservation strategy evolve as well. Additionally, the NGDRS clearinghouse is evaluating the viability of transferring seismic data covering the western shelf of the Florida Gulf Coast. AGI remains actively involved in working to realize the vision of the National Research Council's report on geoscience data preservation. GeoTrek has been ported to Linux and MySQL, ensuring a purely open-source version of the software. This effort is key to ensuring the long-term viability of the software so that it can continue basic operation regardless of specific funding levels. Work has commenced on a major revision of GeoTrek using the open-source MapServer project and its related MapScript language. This effort will address a number of key technology issues that appear to be arising for 2002, including the discontinuation of the use of Java in future Microsoft operating systems. Discussions have been held regarding establishing potential new public data repositories, with hope for a final determination in 2002.

  5. NATIONAL GEOSCIENCE DATA REPOSITORY SYSTEM PHASE III: IMPLEMENTATION AND OPERATION OF THE REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcus Milling

    2002-10-01

    The NGDRS has facilitated the transfer of 85% of the cores, cuttings, and other data identified as available for transfer to the public sector. Over 12 million linear feet of cores and cuttings, in addition to large numbers of paleontological samples, are now available for public use. To date, with industry contributions for program operations and data transfers, the NGDRS project has realized a 6.5-to-1 return on investment of Department of Energy funds. Large-scale transfers of seismic data have been evaluated, but on the recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of the seismic data problem relative to the available funding. Rapidly changing industry conditions have required that the primary core and cuttings preservation strategy evolve as well. Additionally, the NGDRS clearinghouse is evaluating the viability of transferring seismic data covering the western shelf of the Florida Gulf Coast. AGI remains actively involved in working to realize the vision of the National Research Council's report on geoscience data preservation. GeoTrek has been ported to Linux and MySQL, ensuring a purely open-source version of the software. This effort is key to ensuring the long-term viability of the software so that it can continue basic operation regardless of specific funding levels. Work has commenced on a major revision of GeoTrek using the open-source MapServer project and its related MapScript language. This effort will address a number of key technology issues that appear to be arising for 2002, including the discontinuation of the use of Java in future Microsoft operating systems. Discussions have been held regarding establishing potential new public data repositories, with hope for a final determination in 2002.

  6. Radioactive waste isolation in salt: special advisory report on the status of the Office of Nuclear Waste Isolation's plans for repository performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ditmars, J.D.; Walbridge, E.W.; Rote, D.M.

    1983-10-01

    Repository performance assessment is analysis that identifies events and processes that might affect a repository system for isolation of radioactive waste, examines their effects on barriers to waste migration, and estimates the probabilities of their occurrence and their consequences. In 1983, Battelle Memorial Institute's Office of Nuclear Waste Isolation (ONWI) prepared two plans: one for performance assessment of a waste repository in salt and one for verification and validation of performance assessment technology. At the request of the US Department of Energy's Salt Repository Project Office (SRPO), Argonne National Laboratory reviewed those plans and prepared this report to advise SRPO of specific areas where ONWI's plans for performance assessment might be improved. This report presents a framework for repository performance assessment that clearly identifies the relationships among the disposal problems, the processes underlying the problems, the tools for assessment (computer codes), and the data. In particular, the relationships among important processes and 26 model codes available to ONWI are indicated. A common suggestion for computer code verification and validation is the need for specific and unambiguous documentation of the results of performance assessment activities. A major portion of this report consists of status summaries of 27 model codes indicated as potentially useful by ONWI. The code summaries focus on three main areas: (1) each code's purpose, capabilities, and limitations; (2) the status of the elements of documentation and review essential for code verification and validation; and (3) the proposed application of the code to performance assessment of salt repository systems. 15 references, 6 figures, 4 tables.

  7. OntoCR: A CEN/ISO-13606 clinical repository based on ontologies.

    PubMed

    Lozano-Rubí, Raimundo; Muñoz Carrero, Adolfo; Serrano Balazote, Pablo; Pastor, Xavier

    2016-04-01

    The objective was to design a new semantically interoperable clinical repository, based on ontologies and conforming to the CEN/ISO 13606 standard. The approach followed is to extend OntoCRF, a framework for the development of clinical repositories based on ontologies. The meta-model of OntoCRF has been extended by incorporating an OWL model integrating the CEN/ISO 13606, ISO 21090, and SNOMED CT structures. This approach has demonstrated a complete evaluation cycle involving the creation of the meta-model in OWL format, the creation of a simple test application, and the communication of standardized extracts to another organization. Using a CEN/ISO 13606-based system, an indefinite number of archetypes can be merged (and reused) to build new applications. Our approach, based on the use of ontologies, keeps data storage independent of content specification. With this approach, relational technology can be used for storage while maintaining extensibility. The present work demonstrates that it is possible to build a native CEN/ISO 13606 repository for the storage of clinical data. We have demonstrated semantic interoperability of clinical information using CEN/ISO 13606 extracts.
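    The storage principle this abstract highlights (data storage independent of content specification, so new archetypes extend the system without schema changes) can be sketched with a generic attribute-value store. This is an illustrative sketch only, not OntoCRF code; the archetype names, attributes, and values below are invented.

    ```python
    # Hypothetical sketch: clinical data kept as generic
    # (entry, archetype, attribute, value) rows, while the archetype --
    # the content specification -- is itself just data, not a table schema.

    archetypes = {"BloodPressure": ["systolic", "diastolic"]}  # content spec
    rows = []                                                  # generic storage

    def store(entry_id, archetype_name, values):
        """Validate values against the archetype, then store generic rows."""
        for attr, val in values.items():
            if attr not in archetypes[archetype_name]:
                raise ValueError(f"{attr} not in archetype {archetype_name}")
            rows.append((entry_id, archetype_name, attr, val))

    store("e1", "BloodPressure", {"systolic": 120, "diastolic": 80})

    # Adding a new archetype extends the system without altering the
    # storage layout -- the extensibility property the abstract describes.
    archetypes["HeartRate"] = ["bpm"]
    store("e2", "HeartRate", {"bpm": 64})

    print(len(rows))  # prints 3
    ```

    In a relational implementation, `rows` would be a single table, which is how such a design keeps storage stable while the set of content specifications grows.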

  8. ACToR: Aggregated Computational Toxicology Resource (T) ...

    EPA Pesticide Factsheets

    The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has three main goals: (1) to serve as a repository of public toxicology information on chemicals of interest to the EPA, and in particular to be a central source for testing data on all chemicals regulated by all EPA programs; (2) to be a source of in vivo training data sets for building in vitro to in vivo computational models; and (3) to serve as a central source of chemical structure and identity information for the ToxCast and Tox21 programs. There are four main databases, all linked through a common set of chemical information and a common structure linking chemicals to assay data: the public ACToR system (available at http://actor.epa.gov); the ToxMiner database holding ToxCast and Tox21 data, along with results from statistical analyses of these data; the Tox21 chemical repository, which manages the ordering and sample-tracking process for the larger Tox21 project; and the public version of ToxRefDB. The public ACToR system contains information on ~500K compounds, with toxicology, exposure, and chemical property information from >400 public sources. The web site is visited by ~1,000 unique users per month and generates ~1,000 page requests per day on average. The databases are built on open source technology, which has allowed us to export them to a number of col...

  9. Chemical Technology Division, Annual technical report, 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-03-01

    Highlights of the Chemical Technology (CMT) Division's activities during 1991 are presented. In this period, CMT conducted research and development in the following areas: (1) electrochemical technology, including advanced batteries and fuel cells; (2) technology for fluidized-bed combustion and coal-fired magnetohydrodynamics; (3) methods for treatment of hazardous and mixed hazardous/radioactive waste; (4) the reaction of nuclear waste glass and spent fuel under conditions expected for an unsaturated repository; (5) processes for separating and recovering transuranic elements from nuclear waste streams; (6) recovery processes for discharged fuel and the uranium blanket in the Integral Fast Reactor (IFR); (7) processes for removal of actinides in spent fuel from commercial water-cooled nuclear reactors and burnup in IFRs; and (8) physical chemistry of selected materials in environments simulating those of fission and fusion energy systems. The Division also conducts basic research in catalytic chemistry associated with molecular energy resources; chemistry of superconducting oxides and other materials of interest with technological application; interfacial processes of importance to corrosion science, catalysis, and high-temperature superconductivity; and the geochemical processes involved in water-rock interactions occurring in active hydrothermal systems. In addition, the Analytical Chemistry Laboratory in CMT provides a broad range of analytical chemistry support services to the technical programs at Argonne National Laboratory (ANL).

  11. National Aeronautics and Space Administration Biological Specimen Repository

    NASA Technical Reports Server (NTRS)

    McMonigal, Kathleen A.; Pietrzyk, Robert A.; Johnson, Mary Anne

    2008-01-01

    The National Aeronautics and Space Administration Biological Specimen Repository (Repository) is a storage bank used to maintain biological specimens over extended periods of time and under well-controlled conditions. Samples from the International Space Station (ISS), including blood and urine, will be collected, processed and archived during the preflight, inflight and postflight phases of ISS missions. This investigation has been developed to archive biosamples for use as a resource for future space flight related research. The ISS provides a platform to investigate the effects of microgravity on human physiology prior to lunar and exploration class missions. The storage of crewmember samples from many different ISS flights in a single repository will be a valuable resource with which researchers can study space flight related changes and investigate physiological markers. The development of the Repository will allow for the collection, processing, storage, maintenance, and ethical distribution of biosamples to meet goals of scientific and programmatic relevance to the space program. Archiving of the biosamples will provide future research opportunities, including investigating patterns of physiological changes, analysis of components unknown at this time, or analyses performed by new methodologies.

  12. Interactive Visualization of Large-Scale Hydrological Data using Emerging Technologies in Web Systems and Parallel Programming

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2013-12-01

    As geoscientists are confronted with increasingly massive datasets, from environmental observations to simulations, one of the biggest challenges is having the right tools to gain scientific insight from the data and communicate that understanding to stakeholders. Recent developments in web technologies make it easy to manage, visualize and share large data sets with the general public. Novel visualization techniques and dynamic user interfaces allow users to interact with data and modify parameters to create custom views of the data, gaining insight from simulations and environmental observations. This requires developing new data models and intelligent knowledge discovery techniques to explore and extract information from complex computational simulations or large data repositories. Scientific visualization will be an increasingly important component of comprehensive environmental information platforms. This presentation provides an overview of the trends and challenges in the field of scientific visualization, and demonstrates information visualization and communication tools developed in light of these challenges.

  13. New developments in measurements technology relevant to the studies of deep geological repositories in bedded salt

    NASA Astrophysics Data System (ADS)

    Mao, N. H.; Ramirez, A. L.

    1980-10-01

    Developments in measurement technology are presented which are relevant to the studies of deep geological repositories for nuclear waste disposal during all phases of development, i.e., site selection, site characterization, construction, operation, and decommissioning. Emphasis was placed on geophysics and geotechnics, with special attention to those techniques applicable to bedded salt. The techniques are grouped into sections as follows: tectonic environment, state of stress, subsurface structures, fractures, stress changes, deformation, thermal properties, fluid transport properties, and other approaches. Several areas that merit further research and development are identified: in situ thermal measurement techniques, fracture detection and characterization, in situ stress measurements, and creep behavior. Available instrumentation should generally be improved to provide better resolution and accuracy, and enhanced survivability and reliability over extended time periods in a hostile environment.

  14. Statistical sensitivity analysis of a simple nuclear waste repository model

    NASA Astrophysics Data System (ADS)

    Ronen, Y.; Lucius, J. L.; Blow, E. M.

    1980-06-01

    This report describes a preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria, and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening design methodology is developed for later use in the overall program. The theory was applied to the test case of determining the relative importance of the sensitivity of the near-field temperature distribution in a single-level salt repository to modeling parameters. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct methods of recalculation. The sensitivity coefficients found to be important for the sample problem were the thermal loading, the distance between the spent fuel canisters, and the canister radius. Other important parameters were those related to salt properties at a point of interest in the repository.
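    The "direct methods of recalculation" mentioned above amount to perturbing one parameter at a time and recomputing the model. As a minimal sketch only: the temperature model below and its parameter values are invented for illustration, not taken from the original study.

```python
import math

def peak_temperature(q, spacing, radius):
    """Toy near-field temperature model (illustrative only): temperature
    rise grows with thermal loading q and canister radius, and falls off
    with canister spacing."""
    return q * radius / (2.0 * math.pi * spacing)

def sensitivity(f, params, name, rel_step=0.01):
    """Normalized sensitivity coefficient (dT/T)/(dp/p), obtained by
    direct recalculation with a small relative perturbation of one
    parameter while holding the others fixed."""
    base = f(**params)
    bumped = dict(params)
    bumped[name] *= (1.0 + rel_step)
    return ((f(**bumped) - base) / base) / rel_step

# Hypothetical baseline values: thermal loading, canister spacing, radius.
params = {"q": 25.0, "spacing": 3.0, "radius": 0.3}
for p in params:
    print(p, round(sensitivity(peak_temperature, params, p), 3))
```

    For this toy model the coefficients are near +1 for thermal loading and radius and near -1 for spacing, i.e., a 1% change in loading produces roughly a 1% change in peak temperature, which is the kind of ranking the screening design is meant to produce cheaply before exact recalculation.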

  15. Partnerships against Violence: Promising Programs. Volume 1: Resource Guide.

    ERIC Educational Resources Information Center

    Department of Housing and Urban Development, Washington, DC.

    This volume represents the first step in an effort to build a central repository of promising anti-violence programs. Part of a cooperative venture in the federal government, this resource guide draws on information stored in more than 30 Federal clearinghouses and resource centers. Included here are programs developed by government agencies,…

  16. Towards an Integrated Framework for Designing Effective ICT-Supported Learning Environments: The Challenge to Better Link Technology and Pedagogy

    ERIC Educational Resources Information Center

    Richards, Cameron

    2006-01-01

    For various reasons many teachers struggle to harness the powerful informational, communicative and interactive learning possibilities of information and communication technologies (ICTs) in general. This is perhaps typified by how e-learning platforms and web portals are often used mainly as repositories for content and related online discussion…

  17. Criteria for the evaluation and certification of long-term digital archives in the earth sciences

    NASA Astrophysics Data System (ADS)

    Klump, Jens

    2010-05-01

    Digital information has become an indispensable part of our cultural and scientific heritage. Scientific findings, historical documents and cultural achievements are to a rapidly increasing extent being presented in electronic form - in many cases exclusively so. However, besides the invaluable advantages offered by this form, it also carries a serious disadvantage: users need to invest a great deal of technical effort in accessing the information. Also, the underlying technology is still undergoing further development at an exceptionally fast pace. The rapid obsolescence of the technology required to read the information combined with the frequently imperceptible physical decay of the media themselves represents a serious threat to preservation of the information content. Many data sets in earth science research are from observations that cannot be repeated. This makes these digital assets particularly valuable. Therefore, these data should be kept and made available for re-use long after the end of the project from which they originated. Since research projects only run for a relatively short period of time, it is advisable to shift the burden of responsibility for long-term data curation from the individual researcher to a trusted data repository or archive. But what makes a trusted data repository? Each trusted digital repository has its own targets and specifications. The trustworthiness of digital repositories can be tested and assessed on the basis of a criteria catalogue. This is the main focus of the work of the nestor working group "Trusted repositories - Certification". It identifies criteria which permit the trustworthiness of a digital repository to be evaluated, both at the organisational and technical levels. The criteria are defined in close collaboration with a wide range of different memory organisations, producers of information, experts and other interested parties. 
This open approach ensures a high degree of universal validity, suitability for daily practical use and also broad-based acceptance of the results. The criteria catalogue is also intended to present the option of documenting trustworthiness by means of certification in a standardised national or international process. The criteria catalogue is based on the Reference Model for an Open Archival Information System (OAIS, ISO 14721:2003). With its broad approach, the nestor criteria catalogue for trusted digital repositories has to remain on a high level of abstraction. For application in the earth sciences, the evaluation criteria need to be transferred into the context of earth science data and their designated user community. This presentation offers a brief introduction to the problems surrounding the long-term preservation of digital objects. This introduction is followed by a proposed application of the criteria catalogue for trusted digital repositories to the context of earth science data and their long-term preservation.

  18. Basic repository source term and data sheet report: Lavender Canyon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Lavender Canyon, Utah. 3 refs; 6 tabs.

  19. Configuration management plan. System definition and project development. Repository Based Software Engineering (RBSE) program

    NASA Technical Reports Server (NTRS)

    Mckay, Charles

    1991-01-01

    This is the configuration management plan for the AdaNet Repository Based Software Engineering (RBSE) contract. This document establishes the requirements and activities needed to ensure that the products developed for the AdaNet RBSE contract are accurately identified, that proposed changes to the product are systematically evaluated and controlled, that the status of all change activity is known at all times, and that the product achieves its functional performance requirements and is accurately documented.

  20. 2016 Annual Technology Baseline (ATB)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen

    Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information from the Department of Energy laboratory's renewable energy analysts and Energy Information Administration information for conventional technologies. The ATB will be updated annually in order to provide an up-to-date repository of current and future cost and performance data. Going forward, we plan to revise and refine the values using the best available information. The ATB includes both a presentation with notes (PDF) and an associated Excel Workbook. The ATB includes the following electricity generation technologies: land-based wind; offshore wind; utility-scale solar PV; concentrating solar power; geothermal power; hydropower plants (upgrades to existing facilities, powering non-powered dams, and new stream-reach development); conventional coal; coal with carbon capture and sequestration; integrated gasification combined cycle coal; natural gas combustion turbines; natural gas combined cycle; conventional biopower; and nuclear.

  1. Development of Pflotran Code for Waste Isolation Pilot Plant Performance Assessment

    NASA Astrophysics Data System (ADS)

    Zeitler, T.; Day, B. A.; Frederick, J.; Hammond, G. E.; Kim, S.; Sarathi, R.; Stein, E.

    2017-12-01

    The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. There is a current effort to enhance WIPP PA capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Benchmark testing of the individual WIPP-specific process models implemented in PFLOTRAN (e.g., gas generation, chemistry, creep closure, actinide transport, and waste form) has been performed, including results comparisons for PFLOTRAN and existing WIPP PA codes. Additionally, enhancements to the subsurface hydrologic flow model have been made. Repository-scale testing has also been performed for the modified PFLOTRAN code and detailed results will be presented. Ultimately, improvements to the current computational environment will result in greater detail and flexibility in the repository model due to a move from a two-dimensional calculation grid to a three-dimensional representation. The result of the effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future for use in compliance recertification applications (CRAs) submitted to the EPA. 
Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy. SAND2017-8198A.
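    The abstract above describes PA as estimating the probability and consequence of releases over sampled uncertain inputs. As a schematic illustration only (the parameter ranges and the toy consequence model below are invented, and bear no relation to the actual WIPP codes), such a calculation reduces to a Monte Carlo loop:

```python
import random

random.seed(42)  # reproducible sampling for the sketch

def release_fraction(permeability, solubility):
    """Invented toy consequence model: cumulative release over the
    regulatory period grows with brine permeability and actinide
    solubility, capped at the full inventory (fraction 1.0)."""
    return min(1.0, permeability * solubility * 1e4)

# Sample uncertain inputs from hypothetical log-uniform ranges.
n = 10_000
releases = []
for _ in range(n):
    perm = 10 ** random.uniform(-7, -5)
    sol = 10 ** random.uniform(-4, -2)
    releases.append(release_fraction(perm, sol))

# Summarize the release distribution: mean and 95th percentile.
releases.sort()
mean = sum(releases) / n
p95 = releases[int(0.95 * n)]
print(f"mean release fraction: {mean:.3g}, 95th percentile: {p95:.3g}")
```

    Real PA codes replace the one-line consequence model with the coupled flow, chemistry, and transport simulations the abstract describes, which is why massively parallel codes such as PFLOTRAN matter for this workload.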

  2. The United States Antarctic Program Data Center (USAP-DC): Recent Developments

    NASA Astrophysics Data System (ADS)

    Nitsche, F. O.; Bauer, R.; Arko, R. A.; Shane, N.; Carbotte, S. M.; Scambos, T.

    2017-12-01

    Antarctic earth and environmental science data are highly valuable, often unique research assets. They are acquired with substantial and expensive logistical effort, frequently in areas that will not be re-visited for many years. The data acquired in support of Antarctic research span a wide range of disciplines. Historically, data management for the US Antarctic Program (USAP) has made use of existing disciplinary data centers, and the international Antarctic Master Directory (AMD) has served as a central metadata catalog linking to data files hosted in these external repositories. However, disciplinary repositories do not exist for all USAP-generated data types, and often it is unclear which repositories are appropriate, leaving many datasets served locally from scientists' websites or not available at all. The USAP Data Center (USAP-DC; www.usap-dc.org), operated as part of the Interdisciplinary Earth Data Alliance (IEDA), contributes to the broader preservation of research data acquired with funding from NSF's Office of Polar Programs by providing a repository for diverse data from the Antarctic region. USAP-DC hosts data that span the range of Antarctic research, from snow radar to volcano observatory imagery to penguin counts to meteorological model outputs. Data services include data documentation, long-term preservation, and web publication, as well as scientist support for registration of data descriptions into the AMD in fulfillment of US obligations under the International Antarctic Treaty. In Spring 2016, USAP-DC and the NSIDC began a new collaboration to consolidate data services for Antarctic investigators and to integrate the NSF-funded glaciology collection at NSIDC with the collection hosted by USAP-DC. Investigator submissions for NSF's Glaciology program now make use of USAP-DC's web submission tools, providing a uniform interface for Antarctic investigators. The tools have been redesigned to collect a broader range of metadata. 
Each data submission is reviewed and verified by a specialist from the USAP-DC/NSIDC team depending on disciplinary focus of the submission. A recently updated web search interface is available to search data by title, NSF program, award, dataset contributor, large scale project (e.g. WAIS Divide Ice Core) or by specifying an area in map view.

  3. Yucca Mountain Biological Resources Monitoring Program. Progress report, January 1994--December 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-07-01

    The US Department of Energy (DOE) is required by the Nuclear Waste Policy Act of 1982 (as amended in 1987) to study and characterize the suitability of Yucca Mountain as a potential geological repository for high-level nuclear waste. During site characterization, the DOE will conduct a variety of geotechnical, geochemical, geological, and hydrological studies to determine the suitability of Yucca Mountain as a potential repository. To ensure that site characterization activities do not adversely affect the environment at Yucca Mountain, a program has been implemented to monitor and mitigate potential impacts and ensure activities comply with applicable environmental regulations. This report describes the activities and accomplishments of EG and G Energy Measurements, Inc. (EG and G/EM) from January 1994 through December 1994 for six program areas within the Terrestrial Ecosystem component of the environmental program for the Yucca Mountain Site Characterization Project (YMP): Site Characterization Effects, Desert Tortoises (Gopherus agassizii), Habitat Reclamation, Monitoring and Mitigation, Radiological Monitoring, and Biological Support.

  4. Open Access to Physics and Astronomy Theses: A Case Study of the Raman Research Institute Digital Repository

    NASA Astrophysics Data System (ADS)

    Nagaraj, M. N.; Manjunath, M.; Savanur, K. P.; Sheshadri, G.

    2010-10-01

    With the introduction of information technology (IT) and its applications, libraries have started looking for ways to promote their institutes' research output. At the Raman Research Institute (RRI), we have showcased research output such as research papers, newspaper clippings, annual reports, technical reports, and the entire collection of C.V. Raman through the RRI digital repository, using DSpace. Recently, we have added doctoral dissertations to the repository and have made them accessible with the author's permission. In this paper, we describe the challenges and problems encountered in this project. The various stages including policy decisions, the scanning process, getting permissions, metadata standards and other related issues are described. We conclude by making a plea to other institutions also to make their theses available open-access so that this valuable information resource is accessible to all.

  5. An Infrastructure for Indexing and Organizing Best Practices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Liming; Staples, Mark; Gorton, Ian

    Industry best practices are widely held but not necessarily empirically verified software engineering beliefs. Best practices can be documented in distributed web-based public repositories as pattern catalogues or practice libraries. There is a need to systematically index and organize these practices to enable their better practical use and scientific evaluation. In this paper, we propose a semi-automatic approach to index and organize best practices. A central repository acts as an information overlay on top of other pre-existing resources to facilitate organization, navigation, annotation and meta-analysis while maintaining synchronization with those resources. An initial population of the central repository is automated using Yahoo! contextual search services. The collected data is organized using semantic web technologies so that the data can be more easily shared and used for innovative analyses. A prototype has demonstrated the capability of the approach.

  6. Secure remote access to a clinical data repository using a wireless personal digital assistant (PDA).

    PubMed

    Duncan, R G; Shabot, M M

    2000-01-01

    TCP/IP and World-Wide-Web (WWW) technology have become the universal standards for networking and delivery of information. Personal digital assistants (PDAs), cellular telephones, and alphanumeric pagers are rapidly converging on a single pocket device that will leverage wireless TCP/IP networks and WWW protocols and can be used to deliver clinical information and alerts anytime, anywhere. We describe a wireless interface to clinical information for physicians based on Palm Corp.'s Palm VII pocket computer, a wireless digital network, encrypted data transmission, secure web servers, and a clinical data repository (CDR).

  7. Secure remote access to a clinical data repository using a wireless personal digital assistant (PDA).

    PubMed Central

    Duncan, R. G.; Shabot, M. M.

    2000-01-01

    TCP/IP and World-Wide-Web (WWW) technology have become the universal standards for networking and delivery of information. Personal digital assistants (PDAs), cellular telephones, and alphanumeric pagers are rapidly converging on a single pocket device that will leverage wireless TCP/IP networks and WWW protocols and can be used to deliver clinical information and alerts anytime, anywhere. We describe a wireless interface to clinical information for physicians based on Palm Corp.'s Palm VII pocket computer, a wireless digital network, encrypted data transmission, secure web servers, and a clinical data repository (CDR). PMID:11079875

  8. Introduction to geospatial semantics and technology workshop handbook

    USGS Publications Warehouse

    Varanka, Dalia E.

    2012-01-01

    The workshop is a tutorial on introductory geospatial semantics with hands-on exercises using standard Web browsers. The workshop is divided into two sections, general semantics on the Web and specific examples of geospatial semantics using data from The National Map of the U.S. Geological Survey and the Open Ontology Repository. The general semantics section includes information and access to publicly available semantic archives. The specific session includes information on geospatial semantics with access to semantically enhanced data for hydrography, transportation, boundaries, and names. The Open Ontology Repository offers open-source ontologies for public use.

  9. Building a semantic web-based metadata repository for facilitating detailed clinical modeling in cancer genome studies.

    PubMed

    Sharma, Deepak K; Solbrig, Harold R; Tao, Cui; Weng, Chunhua; Chute, Christopher G; Jiang, Guoqian

    2017-06-05

    Detailed Clinical Models (DCMs) have been regarded as the basis for retaining computable meaning when data are exchanged between heterogeneous computer systems. To better support clinical cancer data capture and reporting, there is an emerging need to develop informatics solutions for standards-based clinical models in cancer study domains. The objective of the study is to develop and evaluate a cancer genome study metadata management system that serves as a key infrastructure in supporting clinical information modeling in cancer genome study domains. We leveraged a Semantic Web-based metadata repository enhanced with both the ISO 11179 metadata standard and the Clinical Information Modeling Initiative (CIMI) Reference Model. We used the common data elements (CDEs) defined in The Cancer Genome Atlas (TCGA) data dictionary, and extracted the metadata of the CDEs using the NCI Cancer Data Standards Repository (caDSR) CDE dataset rendered in the Resource Description Framework (RDF). The ITEM/ITEM_GROUP pattern defined in the latest CIMI Reference Model is used to represent reusable model elements (mini-Archetypes). We produced a metadata repository with 38 clinical cancer genome study domains, comprising a rich collection of mini-Archetype pattern instances. We performed a case study of the domain "clinical pharmaceutical" in the TCGA data dictionary and demonstrated that the enriched data elements in the metadata repository are very useful in support of building detailed clinical models. Our informatics approach leveraging Semantic Web technologies provides an effective way to build a CIMI-compliant metadata repository that would facilitate detailed clinical modeling to support use cases beyond TCGA in clinical cancer study domains.
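    The ITEM/ITEM_GROUP pattern described in this record can be illustrated with plain subject-predicate-object triples, a minimal stand-in for an RDF store. The group and element names below are invented for illustration, not taken from the caDSR or TCGA dictionaries.

```python
# Minimal triple set illustrating an ITEM_GROUP that contains ITEMs,
# in the spirit of the CIMI pattern described above (names invented).
triples = {
    ("ex:DrugTherapyGroup", "rdf:type", "cimi:ITEM_GROUP"),
    ("ex:DrugTherapyGroup", "cimi:item", "ex:DrugName"),
    ("ex:DrugTherapyGroup", "cimi:item", "ex:DrugDose"),
    ("ex:DrugName", "rdf:type", "cimi:ITEM"),
    ("ex:DrugDose", "rdf:type", "cimi:ITEM"),
}

def objects(subject, predicate):
    """Return all objects for a (subject, predicate) pair, sorted for
    deterministic output -- a toy analogue of a SPARQL triple query."""
    return sorted(o for s, p, o in triples if s == subject and p == predicate)

print(objects("ex:DrugTherapyGroup", "cimi:item"))
```

    The point of the pattern is that a group such as this hypothetical `ex:DrugTherapyGroup` is itself reusable: another study domain can reference the same mini-Archetype rather than redefining its elements.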

  10. The Royal Society of Chemistry and the delivery of chemistry data repositories for the community.

    PubMed

    Williams, Antony; Tkachenko, Valery

    2014-10-01

    Since 2009 the Royal Society of Chemistry (RSC) has been delivering access to chemistry data and cheminformatics tools via the ChemSpider database and has garnered a significant community following in terms of usage and contribution to the platform. ChemSpider has focused only on those chemical entities that can be represented as molecular connection tables or, more specifically, for which an InChI can be generated from the input structure. As a structure-centric hub, ChemSpider is built around the molecular structure, with other data and links being associated with that structure. As a result the platform has been limited in the types of data that can be managed and the flexibility of its searches, and it is constrained by the data model. New technologies and approaches, specifically a shift from relational to NoSQL databases and the growing importance of the semantic web, have motivated RSC to rearchitect and create a more generic data repository utilizing these new technologies. This article provides an overview of our activities in delivering data-sharing platforms for the chemistry community, including the development of the new data repository, expanding into more extensive domains of chemistry data.
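    The structure-centric design described here, where every record hangs off a structure identifier such as an InChI, can be sketched as a keyed store. The InChI below should be ethanol's standard InChI, but treat it and the attached records as illustrative examples rather than real ChemSpider data.

```python
# Structure-centric store keyed by InChI, in the spirit of the design
# described above. The deposited records are invented examples.
ethanol = "InChI=1S/C2H6O/c1-2-3/h3H,1H3,2H2"

store = {}

def deposit(inchi, source, record):
    """Attach a data record to the structure identified by its InChI;
    all data for one structure accumulates under one key."""
    store.setdefault(inchi, []).append({"source": source, **record})

deposit(ethanol, "supplier-A", {"bp_celsius": 78.4})
deposit(ethanol, "spectra-db", {"technique": "1H NMR"})

print(len(store[ethanol]))  # two records linked to one structure
```

    The limitation the authors describe follows directly from this shape: anything that cannot be reduced to an InChI key (reactions, mixtures, materials) has no natural home in the store, which motivated the more generic repository.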

  11. The Royal Society of Chemistry and the delivery of chemistry data repositories for the community

    NASA Astrophysics Data System (ADS)

    Williams, Antony; Tkachenko, Valery

    2014-10-01

    Since 2009 the Royal Society of Chemistry (RSC) has been delivering access to chemistry data and cheminformatics tools via the ChemSpider database and has garnered a significant community following in terms of usage and contribution to the platform. ChemSpider has focused only on those chemical entities that can be represented as molecular connection tables or, more specifically, for which an InChI can be generated from the input structure. As a structure-centric hub, ChemSpider is built around the molecular structure, with other data and links being associated with that structure. As a result the platform has been limited in the types of data that can be managed and the flexibility of its searches, and it is constrained by the data model. New technologies and approaches, specifically a shift from relational to NoSQL databases and the growing importance of the semantic web, have motivated RSC to rearchitect and create a more generic data repository utilizing these new technologies. This article provides an overview of our activities in delivering data-sharing platforms for the chemistry community, including the development of the new data repository, expanding into more extensive domains of chemistry data.

  12. Improved Access to NSF Funded Ocean Research Data

    NASA Astrophysics Data System (ADS)

    Chandler, C. L.; Groman, R. C.; Kinkade, D.; Shepherd, A.; Rauch, S.; Allison, M. D.; Gegg, S. R.; Wiebe, P. H.; Glover, D. M.

    2015-12-01

    Data from NSF-funded, hypothesis-driven research comprise an essential part of the research results upon which we base our knowledge and improved understanding of the impacts of climate change. Initially funded in 2006, the Biological and Chemical Oceanography Data Management Office (BCO-DMO) works with marine scientists to ensure that data from NSF-funded ocean research programs are fully documented and freely available for future use. BCO-DMO works in partnership with information technology professionals, other marine data repositories and national data archive centers to ensure long-term preservation of these valuable environmental research data. Data contributed to BCO-DMO by the original investigators are enhanced with sufficient discipline-specific documentation and published in a variety of standards-compliant forms designed to enable discovery and support accurate re-use.

  13. Geoengineering properties of potential repository units at Yucca Mountain, southern Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tillerson, J.R.; Nimick, F.B.

    1984-12-01

The Nevada Nuclear Waste Storage Investigations (NNWSI) Project is currently evaluating volcanic tuffs at the Yucca Mountain site, located on and adjacent to the Nevada Test Site, for possible use as a host rock for a radioactive waste repository. The behavior of tuff as an engineering material must be understood to design, license, construct, and operate a repository. Geoengineering evaluations and measurements are being made to develop confidence in both the analysis techniques for thermal, mechanical, and hydrothermal effects and the supporting data base of rock properties. The analysis techniques and the data base are currently used for repository design, waste package design, and performance assessment analyses. This report documents the data base of geoengineering properties used in the analyses that aided the selection of the waste emplacement horizon and in analyses synopsized in the Environmental Assessment Report prepared for the Yucca Mountain site. The strategy used for the development of the data base relies primarily on data obtained in laboratory tests that are then confirmed in field tests. Average thermal and mechanical properties (and their anticipated variations) are presented. Based upon these data, analyses completed to date, and previous excavation experience in tuff, it is anticipated that existing mining technology can be used to develop stable underground openings and that repository operations can be carried out safely.

  14. Architecture Knowledge for Evaluating Scalable Databases

    DTIC Science & Technology

    2015-01-16

problems arising from the proliferation of new data models and distributed technologies for building scalable, available data stores. Architects must... No longer are relational databases the de facto standard for building data repositories. Highly distributed, scalable "NoSQL" databases [11] have emerged... This is especially challenging at the data storage layer. The multitude of competing NoSQL database technologies creates a complex and rapidly...

  15. The MMI Semantic Framework: Rosetta Stones for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Rueda, C.; Bermudez, L. E.; Graybeal, J.; Alexander, P.

    2009-12-01

Semantic interoperability—the exchange of meaning among computer systems—is needed to successfully share data in Ocean Science and across all Earth sciences. The best approach toward semantic interoperability requires a designed framework, and operationally tested tools and infrastructure within that framework. Currently available technologies make a scientific semantic framework feasible, but its development requires sustainable architectural vision and development processes. This presentation outlines the MMI Semantic Framework, including recent progress on it and its client applications. The MMI Semantic Framework consists of tools, infrastructure, and operational and community procedures and best practices, to meet short-term and long-term semantic interoperability goals. The design and prioritization of the semantic framework capabilities are based on real-world scenarios in Earth observation systems. We describe some key use cases, as well as the associated requirements for building the overall infrastructure, which is realized through the MMI Ontology Registry and Repository. This system includes support for community creation and sharing of semantic content, ontology registration, version management, and seamless integration of user-friendly tools and application programming interfaces. The presentation describes the architectural components for semantic mediation, registry and repository for vocabularies, ontology, and term mappings. We show how the technologies and approaches in the framework can address community needs for managing and exchanging semantic information. We will demonstrate how different types of users and client applications exploit the tools and services for data aggregation, visualization, archiving, and integration. Specific examples from OOSTethys (http://www.oostethys.org) and the Ocean Observatories Initiative Cyberinfrastructure (http://www.oceanobservatories.org) will be cited.
Finally, we show how semantic augmentation of web services standards could be performed using framework tools.
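The "Rosetta stone" idea of the term-mapping registry can be sketched as a symmetric lookup over mapping groups. The vocabulary names and terms below are invented examples, not actual MMI Ontology Registry content.

```python
# A toy "Rosetta stone": equivalence groups mapping terms across
# community vocabularies, loosely mimicking the term-mapping service
# of the MMI Ontology Registry and Repository. All names are invented.
mappings = {
    ("cf", "sea_water_temperature"): {("ioos", "seaWaterTemperature"),
                                      ("local", "SST")},
}

def equivalents(vocab, term):
    """Return every (vocabulary, term) pair mapped as equivalent to the
    given one, treating each mapping group symmetrically."""
    out = set()
    for key, vals in mappings.items():
        group = vals | {key}
        if (vocab, term) in group:
            out |= group - {(vocab, term)}
    return out

print(sorted(equivalents("cf", "sea_water_temperature")))
# [('ioos', 'seaWaterTemperature'), ('local', 'SST')]
```

The symmetric lookup matters: a client holding any one vocabulary's term can discover the others, which is what lets federated search reduce the domain expertise needed for discovery.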

  16. Using Object Storage Technology vs Vendor Neutral Archives for an Image Data Repository Infrastructure.

    PubMed

    Bialecki, Brian; Park, James; Tilkin, Mike

    2016-08-01

    The intent of this project was to use object storage and its database, which has the ability to add custom extensible metadata to an imaging object being stored within the system, to harness the power of its search capabilities, and to close the technology gap that healthcare faces. This creates a non-disruptive tool that can be used natively by both legacy systems and the healthcare systems of today which leverage more advanced storage technologies. The base infrastructure can be populated alongside current workflows without any interruption to the delivery of services. In certain use cases, this technology can be seen as a true alternative to the VNA (Vendor Neutral Archive) systems implemented by healthcare today. The scalability, security, and ability to process complex objects makes this more than just storage for image data and a commodity to be consumed by PACS (Picture Archiving and Communication System) and workstations. Object storage is a smart technology that can be leveraged to create vendor independence, standards compliance, and a data repository that can be mined for truly relevant content by adding additional context to search capabilities. This functionality can lead to efficiencies in workflow and a wealth of minable data to improve outcomes into the future.
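The core mechanism the abstract leans on—extensible metadata attached to each stored object, searchable later—can be sketched with an in-memory stand-in. A real deployment would use an S3/Swift-style object store; the class, method names, and DICOM-flavored fields here are illustrative assumptions, not a vendor API.

```python
# Sketch of an object store that attaches extensible metadata to each
# stored image object and supports search over that metadata.
# In-memory stand-in for illustration only; not a vendor product API.
class ObjectStore:
    def __init__(self):
        self._objects = {}

    def put(self, key, data, **metadata):
        """Store an object together with arbitrary custom metadata."""
        self._objects[key] = {"data": data, "meta": metadata}

    def search(self, **criteria):
        """Return keys whose metadata matches all given field=value pairs."""
        return [k for k, o in self._objects.items()
                if all(o["meta"].get(f) == v for f, v in criteria.items())]

store = ObjectStore()
store.put("study1/ct001.dcm", b"...", modality="CT", body_part="CHEST")
store.put("study2/mr001.dcm", b"...", modality="MR", body_part="BRAIN")
print(store.search(modality="CT"))  # ['study1/ct001.dcm']
```

Because the metadata schema is open-ended, new context (outcomes, protocol, source system) can be added later without migrating the stored objects—the property that makes the approach a plausible VNA alternative.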

  17. Current Status of The Romanian National Deep Geological Repository Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radu, M.; Nicolae, R.; Nicolae, D.

    2008-07-01

Construction of a deep geological repository is a very demanding and costly task. To date, countries operating Candu reactors have not reprocessed their spent fuel, instead placing it in interim storage as a preliminary step toward final disposal in the back end of the nuclear fuel cycle. Compared to other nations, Romania has a rather small territory with a high population density, in which the geological formations with radioactive waste disposal potential are limited and restricted not only by the selection criteria arising from the rocks' natural characteristics, but also by their involvement in social and economic activities. In the framework of the national R and D programs, a series of 'map investigations' has been carried out regarding the selection and preliminary characterization of the host geological formation for the nation's spent fuel deep geological repository. The fact that Romania has many deposits of natural gas, oil, ore and geothermal water, intensively utilizes its soil, and is heavily forested causes some of the apparently acceptable sites to be rejected in subsequent analysis. Currently, according to the Law on spent fuel and radioactive waste management, including disposal, the National Agency of Radioactive Waste is responsible for and coordinates the national strategy in the field, and further actions will be decided subsequently. The Romanian National Strategy, approved in 2004, projects the operation of a deep geological repository to begin in 2055. (authors)

  18. Co-production of Health enabled by next generation personal health systems.

    PubMed

    Boye, Niels

    2012-01-01

This paper describes the theoretical principles for the establishment of a parallel and complementary modality of healthcare delivery, named Co-production of Health (CpH). This service model activates digital data, information, and knowledge about health, healthy choices, and the individual's health state, and computes context-aware communication and advice through personalized models. "Lightweight technologies" (smartphones, tablets, application stores) would serve as the technology close to the end-users (citizens, patients, clients, customers), connecting them with "big data" in conventionally and non-conventionally organized data repositories. The CpH modality aims at providing synergies between professional healthcare, self-care, and informal care; it fuses data from several sources, such as the health characteristics of consumer goods, sensors, actuators, and health-related data repositories, and turns this into "health added value" for the individual. A theoretical business model respecting healthcare values, ethics, and legal foundations is also sketched out.

  19. JavaScript Access to DICOM Network and Objects in Web Browser.

    PubMed

    Drnasin, Ivan; Grgić, Mislav; Gogić, Goran

    2017-10-01

The Digital Imaging and Communications in Medicine (DICOM) 3.0 standard provides the baseline for picture archiving and communication systems (PACS). The development of the Internet and various communication media initiated demand for non-DICOM access to PACS systems. Ever-increasing utilization of web browsers, laptops, and handheld devices, as opposed to desktop applications and static organizational computers, led to the development of different web technologies. DICOM standard officials subsequently accepted these as tools of alternative access. This paper provides an overview of the current state of development of web access technology to DICOM repositories. It presents a different approach of using HTML5 features of web browsers through the JavaScript language and the WebSocket protocol, enabling real-time communication with DICOM repositories. A JavaScript DICOM network library, a DICOM-to-WebSocket proxy, and a proof-of-concept web application that qualifies as a DICOM 3.0 device were developed.
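The DICOM objects such tools exchange follow the Part 10 file layout: a 128-byte preamble followed by the magic bytes "DICM". That layout check is real DICOM; the sample file below is synthesized purely for demonstration (the document's own work is in JavaScript, so this is a Python sketch of the format rule, not their library).

```python
# Check that a file follows the DICOM Part 10 layout: a 128-byte
# preamble followed by the four magic bytes b"DICM".
def is_dicom_part10(path):
    with open(path, "rb") as f:
        header = f.read(132)
    return len(header) == 132 and header[128:132] == b"DICM"

# Synthesize a minimal file with a valid preamble for demonstration.
with open("demo.dcm", "wb") as f:
    f.write(b"\x00" * 128 + b"DICM")
print(is_dicom_part10("demo.dcm"))  # True
```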

  20. Chemical Technology Division annual technical report, 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-05-01

Highlights of the Chemical Technology (CMT) Division's activities during 1990 are presented. In this period, CMT conducted research and development in the following areas: (1) electrochemical technology, including advanced batteries and fuel cells; (2) technology for coal-fired magnetohydrodynamics and fluidized-bed combustion; (3) methods for recovery of energy from municipal waste and techniques for treatment of hazardous organic waste; (4) the reaction of nuclear waste glass and spent fuel under conditions expected for a high-level waste repository; (5) processes for separating and recovering transuranic elements from nuclear waste streams, concentrating plutonium solids in pyrochemical residues by aqueous biphase extraction, and treating natural and process waters contaminated by volatile organic compounds; (6) recovery processes for discharged fuel and the uranium blanket in the Integral Fast Reactor (IFR); (7) processes for removal of actinides in spent fuel from commercial water-cooled nuclear reactors and their burnup in IFRs; and (8) physical chemistry of selected materials in environments simulating those of fission and fusion energy systems. The Division also has a program in basic chemistry research in the areas of fluid catalysis for converting small molecules to desired products; materials chemistry for superconducting oxides and associated and ordered solutions at high temperatures; interfacial processes of importance to corrosion science, high-temperature superconductivity, and catalysis; and the geochemical processes responsible for trace-element migration within the earth's crust. The Analytical Chemistry Laboratory in CMT provides a broad range of analytical chemistry support services to the scientific and engineering programs at Argonne National Laboratory (ANL). 66 refs., 69 figs., 6 tabs.

  1. Implementation of the Brazilian National Repository - RBMN Project - 13008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cassia Oliveira de Tello, Cledola

    2013-07-01

Ionizing radiation in Brazil is used in electricity generation, medicine, industry, agriculture and for research and development purposes. All these activities can generate radioactive waste. At this point, in Brazil, the use of nuclear energy and radioisotopes justifies the construction of a national repository for low- and intermediate-level radioactive wastes. According to Federal Law No. 10308, the Brazilian National Commission for Nuclear Energy (CNEN) is responsible for designing and constructing the interim and final storage facilities for radioactive wastes. Additionally, a condition on the construction of Angra 3 is that the repository be under construction before the plant starts operation, meeting requirements of the Brazilian Environmental Regulator (IBAMA). Beyond this NPP, the National Energy Program foresees the installation of four more plants by 2030. In November 2008, CNEN launched the Project RBMN (Repository for Low and Intermediate-Level Radioactive Wastes), which aims at the implementation of a national repository for the disposal of low- and intermediate-level radioactive wastes. This Project has some aspects that are unique in the Brazilian context, especially regarding the time between its construction and the end of its institutional period. This time is about 360 years, after which the area will be released for unrestricted use. It means that the Repository must be safe and secure for more than three hundred years, longer than half of Brazilian history to date. This aspect is very new for the Brazilian people, bringing a new dimension to public acceptance. Another point is that this will be the first repository in South America, a real challenge for the continent. The current status of the Project is summarized. (authors)

  2. The future of microarray technology: networking the genome search.

    PubMed

    D'Ambrosio, C; Gatta, L; Bonini, S

    2005-10-01

In recent years microarray technology has been increasingly used in both basic and clinical research, providing substantial information for a better understanding of the genome-environment interactions responsible for diseases, as well as for their diagnosis and treatment. However, genomic research using microarray technology still faces several unresolved scientific, ethical and legal issues. Networks of excellence like GA(2)LEN may represent the best approach for teaching, cost reduction, data repositories, and functional studies implementation.

  3. Site characterization progress report: Yucca Mountain, Nevada. Number 15, April 1--September 30, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-04-01

During the second half of fiscal year 1996, activities at the Yucca Mountain Site Characterization Project (Project) supported the objectives of the revised Program Plan released this period by the Office of Civilian Radioactive Waste Management of the US Department of Energy (Department). Outlined in the revised plan is a focused, integrated program of site characterization, design, engineering, environmental, and performance assessment activities that will achieve key Program and statutory objectives. The plan will result in the development of a license application for repository construction at Yucca Mountain, if the site is found suitable. Activities this period focused on two of the three near-term objectives of the revised plan: updating in 1997 the regulatory framework for determining the suitability of the site for the proposed repository concept and providing information for a 1998 viability assessment of continuing toward the licensing of a repository. The Project has also developed a new design approach that uses the advanced conceptual design published during the last reporting period as a base for developing a design that will support the viability assessment. The initial construction phase of the Thermal Testing Facility was completed and the first phase of the in situ heater tests began on schedule. In addition, phase-one construction was completed for the first of two alcoves that will provide access to the Ghost Dance fault.

  4. Information and image integration: project spectrum

    NASA Astrophysics Data System (ADS)

    Blaine, G. James; Jost, R. Gilbert; Martin, Lori; Weiss, David A.; Lehmann, Ron; Fritz, Kevin

    1998-07-01

The BJC Health System (BJC) and the Washington University School of Medicine (WUSM) formed a technology alliance with industry collaborators to develop and implement an integrated, advanced clinical information system. The industry collaborators include IBM, Kodak, SBC and Motorola. The activity, called Project Spectrum, provides an integrated clinical repository for the multiple hospital facilities of the BJC. The BJC System consists of 12 acute care hospitals serving over one million patients in Missouri and Illinois. An interface engine manages transactions from each of the hospital information systems, lab systems and radiology information systems. Data is normalized to provide a consistent view for the primary care physician. Access to the clinical repository is supported by web-based server/browser technology which delivers patient data to the physician's desktop. An HL7 based messaging system coordinates the acquisition and management of radiological image data and sends image keys to the clinical data repository. Access to the clinical chart browser currently provides radiology reports, laboratory data, vital signs and transcribed medical reports. A chart metaphor provides tabs for the selection of the clinical record for review. Activation of the radiology tab facilitates a standardized view of radiology reports and provides an icon used to initiate retrieval of available radiology images. The selection of the image icon spawns an image browser plug-in and utilizes the image key from the clinical repository to access the image server for the requested image data. The Spectrum system is collecting clinical data from five hospital systems and imaging data from two hospitals. Domain specific radiology imaging systems support the acquisition and primary interpretation of radiology exams. The Spectrum clinical workstations are deployed to over 200 sites utilizing local area networks and ISDN connectivity.
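The HL7-based messaging the interface engine routes is pipe-delimited text, and the basic split into segments and fields is easy to sketch. The sample message below (system names, IDs, the `IMAGE_KEY` field) is fabricated for illustration; it follows HL7 v2 delimiter rules but is not an actual Spectrum message.

```python
# Sketch of parsing a pipe-delimited HL7 v2 message into segments,
# the kind of message an interface engine routes between hospital
# systems. The sample message content is invented for illustration.
def parse_hl7(message):
    """Split an HL7 v2 message into {segment_id: [field lists]}.
    Segments are separated by carriage returns, fields by '|'."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

msg = ("MSH|^~\\&|RIS|BJC|REPO|BJC|199807010830||ORU^R01|123|P|2.3\r"
       "PID|1||MRN12345\r"
       "OBX|1|TX|IMAGE_KEY||IMG-0042")
parsed = parse_hl7(msg)
print(parsed["OBX"][0][4])  # IMG-0042
```

In the architecture described above, a field like the hypothetical `IMAGE_KEY` is what links the clinical repository entry back to the image server.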

  5. Active and passive computed tomography mixed waste focus area final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberson, G P

    1998-08-19

The Mixed Waste Focus Area (MWFA) Characterization Development Strategy delineates an approach to resolve technology deficiencies associated with the characterization of mixed wastes. The intent of this strategy is to ensure the availability of technologies to support the Department of Energy's (DOE) mixed waste low-level or transuranic (TRU) contaminated waste characterization management needs. To this end the MWFA has defined and coordinated characterization development programs to ensure that data and test results necessary to evaluate the utility of non-destructive assay technologies are available to meet site contact handled waste management schedules. Requirements used as technology development project benchmarks are based in the National TRU Program Quality Assurance Program Plan. These requirements include the ability to determine total bias and total measurement uncertainty. These parameters must be completely evaluated for waste types to be processed through a given nondestructive waste assay system, constituting the foundation of activities undertaken in technology development projects. Once development and testing activities have been completed, Innovative Technology Summary Reports are generated to provide results and conclusions to support EM-30, -40, or -60 end user/customer technology selection. The Active and Passive Computed Tomography non-destructive assay system is one of the technologies selected for development by the MWFA. Lawrence Livermore National Laboratory (LLNL) is developing the Active and Passive Computed Tomography (A&PCT) nondestructive assay (NDA) technology to identify and accurately quantify all detectable radioisotopes in closed containers of waste. This technology will be applicable to all types of waste regardless of their classification: low-level, transuranic, or mixed waste containing radioactivity and hazardous organic species. The scope of the technology is to develop a non-invasive waste-drum scanner that employs the principles of computed tomography and gamma-ray spectral analysis to identify and quantify all of the detectable radioisotopes. Once this and other applicable technologies are developed, waste drums can be non-destructively and accurately characterized to satisfy repository and regulatory guidelines prior to disposal.

  6. Characterization of Heat-treated Clay Minerals in the Context of Nuclear Waste Disposal

    NASA Astrophysics Data System (ADS)

    Matteo, E. N.; Wang, Y.; Kruichak, J. N.; Mills, M. M.

    2015-12-01

Clay minerals are likely candidates to aid in nuclear waste isolation due to their low permeability, favorable swelling properties, and high cation sorption capacities. Establishing the thermal limit for clay minerals in a nuclear waste repository is a potentially important component of repository design, as flexibility of the heat load within the repository can have a major impact on the selection of repository design. For example, the thermal limit plays a critical role in the time that waste packages would need to cool before being transferred to the repository. Understanding the chemical and physical changes, if any, that occur in clay minerals at various temperatures above the current thermal limit (of 100 °C) can provide decision-makers with information critical to evaluating the potential trade-offs of increasing the thermal limit within the repository. Most critical is gaining understanding of how varying thermal conditions in the repository will impact radionuclide sorption and transport in clay materials either as engineered barriers or as disposal media. A variety of repository-relevant clay minerals (illite, mixed layer illite/smectite, and montmorillonite) were heated at temperatures ranging from 100 to 1000 °C. These samples were characterized to determine surface area, mineralogical alteration, and cation exchange capacity (CEC). Our results show that for conditions up to 500 °C, no significant change occurs, so long as the clay mineral remains mineralogically intact. At temperatures above 500 °C, transformation of the layered silicates into silica phases leads to alteration that impacts important clay characteristics. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND Number: SAND2015-6524 A

  7. DEVELOPMENT OF THE U.S. EPA HEALTH EFFECTS RESEARCH LABORATORY FROZEN BLOOD CELL REPOSITORY PROGRAM

    EPA Science Inventory

In previous efforts, we suggested that proper blood cell freezing and storage is necessary in longitudinal studies with reduced between-test error, for specimen sharing between laboratories and for convenient scheduling of assays. We continue to develop and upgrade programs for o...

  8. A Framework for Integrating Oceanographic Data Repositories

    NASA Astrophysics Data System (ADS)

    Rozell, E.; Maffei, A. R.; Beaulieu, S. E.; Fox, P. A.

    2010-12-01

    Oceanographic research covers a broad range of science domains and requires a tremendous amount of cross-disciplinary collaboration. Advances in cyberinfrastructure are making it easier to share data across disciplines through the use of web services and community vocabularies. Best practices in the design of web services and vocabularies to support interoperability amongst science data repositories are only starting to emerge. Strategic design decisions in these areas are crucial to the creation of end-user data and application integration tools. We present S2S, a novel framework for deploying customizable user interfaces to support the search and analysis of data from multiple repositories. Our research methods follow the Semantic Web methodology and technology development process developed by Fox et al. This methodology stresses the importance of close scientist-technologist interactions when developing scientific use cases, keeping the project well scoped and ensuring the result meets a real scientific need. The S2S framework motivates the development of standardized web services with well-described parameters, as well as the integration of existing web services and applications in the search and analysis of data. S2S also encourages the use and development of community vocabularies and ontologies to support federated search and reduce the amount of domain expertise required in the data discovery process. S2S utilizes the Web Ontology Language (OWL) to describe the components of the framework, including web service parameters, and OpenSearch as a standard description for web services, particularly search services for oceanographic data repositories. We have created search services for an oceanographic metadata database, a large set of quality-controlled ocean profile measurements, and a biogeographic search service. 
S2S provides an application programming interface (API) that can be used to generate custom user interfaces, supporting data and application integration across these repositories and other web resources. Although initially targeted towards a general oceanographic audience, the S2S framework shows promise in many science domains, inspired in part by the broad disciplinary coverage of oceanography. This presentation will cover the challenges addressed by the S2S framework, the research methods used in its development, and the resulting architecture for the system. It will demonstrate how S2S is remarkably extensible, and can be generalized to many science domains. Given these characteristics, the framework can simplify the process of data discovery and analysis for the end user, and can help to shift the responsibility of search interface development away from data managers.
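S2S describes its search services with OpenSearch, whose core mechanism is a URL template with named placeholders. The sketch below fills such a template; the endpoint URL and parameter names are illustrative assumptions, not actual S2S or BCO-DMO services, though `searchTerms` and `startIndex` are standard OpenSearch parameter names.

```python
# Sketch of filling an OpenSearch-style URL template, the mechanism
# S2S uses to describe repository search services uniformly.
# The example endpoint is invented for illustration.
import re
from urllib.parse import quote

def fill_template(template, **params):
    """Substitute {name} and optional {name?} placeholders with
    URL-encoded values; unfilled optional placeholders are dropped."""
    for name, value in params.items():
        encoded = quote(str(value))
        template = template.replace("{" + name + "}", encoded)
        template = template.replace("{" + name + "?}", encoded)
    # Drop any optional placeholders that were not supplied.
    return re.sub(r"\{\w+\?\}", "", template)

url = fill_template(
    "http://example.org/search?q={searchTerms}&start={startIndex?}",
    searchTerms="ocean profile temperature")
print(url)  # http://example.org/search?q=ocean%20profile%20temperature&start=
```

Because every repository advertises the same template grammar, a generic client can query heterogeneous oceanographic repositories without per-service glue code, which is the interoperability point the abstract makes.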

  9. Communicating credibly in an incredible environment-yucca mountain public outreach tools and techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheldon, S.R.; Muller, E.

Open disclosure and public understanding of major issues surrounding the Yucca Mountain Project is a consistent goal for Clark County, Nevada, which represents nearly 80 percent of Nevada's total population. Recent enhancements to the County's communication methods employ emerging technology as well as traditional public relations tactics. The County's communication methods engage the public through highly visual displays, exhibits, informative and entertaining video programs, school presentations, creative print inserts, public interaction and news media. The program provides information based on the county's research studies and findings on property values, the environment, tourism, public health and safety, increased costs for emergency services and the potential disproportionate effects to Native American tribes and other minority populations in the area. Multi-cultural Dialogue: Nevada, particularly southern Nevada and the Las Vegas area, has experienced explosive growth in the last decade. The fastest growing demographic groups in Nevada are Hispanics (nearly 23% in Las Vegas) and Asians (approx. 8%). Clark County's Nuclear Waste Multi-cultural Program is designed to reach residents from these emerging segments of our population. Educational video programs: While officially opposed to the project, Clark County is committed to providing Nevada residents with accurate, timely and objective information about Yucca Mountain and its potential impacts to our state. Since the actual operation of the repository, if approved by the Nuclear Regulatory Commission, is about a decade away, the program includes presentations for middle and high school students on age-appropriate topics. Work with indigenous tribes: American Indian tribes in Southern Nevada participated in an unprecedented video program presenting the unique views and perspectives of the American Indian tribes directly impacted by the proposed repository.
Monitoring program: To track economic, fiscal and social changes over time, the monitoring program comprises indicators in several core areas, including environmental, economic, community well-being, fiscal, developmental, and public health and safety indicators. Its purpose is to highlight and monitor the most meaningful indicators of performance and perception in key service areas. The monitoring program is promoted within the public outreach program to make Nevada residents aware of this important resource of information. Internet Activities: Interactive quizzes, informational postings, electronic newsletters and pod-casts draw a demographic that prefers getting information from computer sources. Lively, interesting and ethnically diverse pod-cast episodes provide access to audio shows, which can be downloaded to MP3 players or to a standard computer. (authors)

  10. An Optimal Centralized Carbon Dioxide Repository for Florida, USA

    PubMed Central

    Poiencot, Brandon; Brown, Christopher

    2011-01-01

    For over a decade, the United States Department of Energy, and engineers, geologists, and scientists from all over the world have investigated the potential for reducing atmospheric carbon emissions through carbon sequestration. Numerous reports exist analyzing the potential for sequestering carbon dioxide at various sites around the globe, but none have identified the potential for a statewide system in Florida, USA. In 2005, 83% of Florida’s electrical energy was produced by natural gas, coal, or oil (e.g., fossil fuels), from power plants spread across the state. In addition, only limited research has been completed on evaluating optimal pipeline transportation networks to centralized carbon dioxide repositories. This paper describes the feasibility and preliminary locations for an optimal centralized Florida-wide carbon sequestration repository. Linear programming optimization modeling is used to plan and route an idealized pipeline network to existing Florida power plants. Further analysis of the subsurface geology in these general locations will provide insight into the suitability of the subsurface conditions and the available capacity for carbon sequestration at selected possible repository sites. The identification of the most favorable site(s) is also presented. PMID:21695024

  11. An optimal centralized carbon dioxide repository for Florida, USA.

    PubMed

    Poiencot, Brandon; Brown, Christopher

    2011-04-01

For over a decade, the United States Department of Energy and engineers, geologists, and scientists from all over the world have investigated the potential for reducing atmospheric carbon emissions through carbon sequestration. Numerous reports exist analyzing the potential for sequestering carbon dioxide at various sites around the globe, but none have identified the potential for a statewide system in Florida, USA. In 2005, 83% of Florida's electrical energy was produced by natural gas, coal, or oil (i.e., fossil fuels) from power plants spread across the state. In addition, only limited research has been completed on evaluating optimal pipeline transportation networks to centralized carbon dioxide repositories. This paper describes the feasibility and preliminary locations for an optimal centralized Florida-wide carbon sequestration repository. Linear programming optimization modeling is used to plan and route an idealized pipeline network to existing Florida power plants. Further analysis of the subsurface geology in these general locations will provide insight into the suitability of the subsurface conditions and the available capacity for carbon sequestration at selected possible repository sites. The identification of the most favorable site(s) is also presented.
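The record does not reproduce the authors' actual formulation, so the sketch below is a generic transportation-style linear program of the kind the abstract describes: route captured CO2 from power plants to candidate repository sites at minimum distance-weighted pipeline cost. All plant supplies, site capacities, and distances are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 3 power plants, 2 candidate repository sites.
supply = np.array([4.0, 2.0, 3.0])      # Mt CO2/yr captured at each plant
capacity = np.array([6.0, 8.0])         # Mt CO2/yr injectable at each site
dist = np.array([[120.0, 300.0],
                 [ 80.0, 150.0],
                 [200.0,  60.0]])       # pipeline km, plant x site

n_p, n_s = dist.shape
c = dist.flatten()                      # cost proportional to length x flow

# Equality constraints: each plant must ship all of its captured CO2.
A_eq = np.zeros((n_p, n_p * n_s))
for i in range(n_p):
    A_eq[i, i * n_s:(i + 1) * n_s] = 1.0
b_eq = supply

# Inequality constraints: each site cannot exceed its storage capacity.
A_ub = np.zeros((n_s, n_p * n_s))
for j in range(n_s):
    A_ub[j, j::n_s] = 1.0
b_ub = capacity

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
flows = res.x.reshape(n_p, n_s)         # optimal plant-to-site shipments
```

With these made-up numbers each plant ships to its nearer site, which just fills the first site's capacity; a real study would add pipeline construction costs and many more candidate sites.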

  12. Nuclear waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-09-01

Radioactive waste is mounting at U.S. nuclear power plants at a rate of more than 2,000 metric tons a year. Pursuant to statute and anticipating that a geologic repository would be available in 1998, the Department of Energy (DOE) entered into disposal contracts with nuclear utilities. Now, however, DOE does not expect the repository to be ready before 2010. For this reason, DOE does not want to develop a facility for monitored retrievable storage (MRS) by 1998. Concerned about how best to store the waste until a repository is available, congressional requesters asked GAO to review the alternatives of continued storage at utilities' reactor sites or transferring waste to an MRS facility. GAO assessed the likelihood of an MRS facility operating by 1998; legal implications if DOE is not able to take delivery of wastes in 1998; the propriety of using the Nuclear Waste Fund, from which DOE's waste program costs are paid, to pay utilities for on-site storage capacity added after 1998; the ability of utilities to store their waste on-site until a repository is operating; and the relative costs and safety of the two storage alternatives.

  13. Review of DOE Waste Package Program. Semiannual report, October 1984-March 1985. Volume 8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, M.S.

    1985-12-01

A large number of technical reports on waste package component performance were reviewed over the last year in support of the NRC's review of the Department of Energy's (DOE's) Environmental Assessment reports. The intent was to assess in some detail the quantity and quality of the DOE data and their relevance to the high-level waste repository site selection process. A representative selection of the reviews is presented for the salt, basalt, and tuff repository projects. Areas for future research have been outlined. 141 refs.

  14. Learning the Language of Healthcare Enabling Semantic Web Technology in CHCS

    DTIC Science & Technology

    2013-09-01

tuples”, (subject, predicate, object), to relate data and achieve semantic interoperability. Other similar technologies exist, but their... Semantic Healthcare repository [5]. Ultimately, both of our data approaches were successful. However, our current test system is based on the CPRS demo...to extract system dependencies and workflows; to extract semantically related patient data; and to browse patient-centric views into the system. We
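The (subject, predicate, object) idea in this snippet can be illustrated with a toy in-memory triple store. The clinical identifiers below are invented for illustration; a real deployment would use an RDF store queried with SPARQL.

```python
# Toy triple store: a set of (subject, predicate, object) tuples.
# All identifiers here are hypothetical, not real CHCS/CPRS vocabulary.
triples = {
    ("patient:42", "hasDiagnosis", "icd9:250.00"),
    ("patient:42", "hasMedication", "rxnorm:860975"),
    ("icd9:250.00", "rdfs:label", "Diabetes mellitus type 2"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]
```

For example, `match(s="patient:42")` returns everything known about that patient, which is the sort of patient-centric view the snippet mentions.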

  15. re3data.org - a global registry of research data repositories

    NASA Astrophysics Data System (ADS)

    Pampel, Heinz; Vierkant, Paul; Elger, Kirsten; Bertelmann, Roland; Witt, Michael; Schirmbacher, Peter; Rücknagel, Jessika; Kindling, Maxi; Scholze, Frank; Ulrich, Robert

    2016-04-01

re3data.org - the registry of research data repositories lists over 1,400 research data repositories from all over the world, making it the largest and most comprehensive online catalog of research data repositories on the web. The registry is a valuable tool for researchers, funding organizations, publishers and libraries. re3data.org provides detailed information about research data repositories, and its distinctive icons help researchers to easily identify relevant repositories for accessing and depositing data sets [1]. Funding agencies like the European Commission [2] and research institutions like the University of Bielefeld [3] already recommend the use of re3data.org in their guidelines and policies. Several publishers and journals like Copernicus Publications, PeerJ, and Nature's Scientific Data recommend re3data.org in their editorial policies as a tool for the easy identification of appropriate data repositories to store research data. Project partners in re3data.org are the Library and Information Services department (LIS) of the GFZ German Research Centre for Geosciences, the Computer and Media Service at the Humboldt-Universität zu Berlin, the Purdue University Libraries and the KIT Library at the Karlsruhe Institute of Technology (KIT). After its merger with the U.S. service DataBib in 2014, re3data.org continues as a service of DataCite from 2016 on. DataCite is the international organization for the registration of Digital Object Identifiers (DOI) for research data and aims to improve their citation. The poster describes the current status and the future plans of re3data.org. [1] Pampel H, et al. (2013) Making Research Data Repositories Visible: The re3data.org Registry. PLoS ONE 8(11): e78080. doi:10.1371/journal.pone.0078080. [2] European Commission (2015): Guidelines on Open Access to Scientific Publications and Research Data in Horizon 2020. 
Available: http://ec.europa.eu/research/participants/data/ref/h2020/grants_manual/hi/oa_pilot/h2020-hi-oa-pilot-guide_en.pdf Accessed 11 January 2016. [3] Bielefeld University (2013): Resolution on Research Data Management. Available: http://data.uni-bielefeld.de/en/resolution Accessed 11 January 2016.

  16. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  17. DoSSiER: Database of scientific simulation and experimental results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenzel, Hans; Yarba, Julia; Genser, Krzystof

The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
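The record says DoSSiER offers programmatic access returning JSON or XML but does not give its endpoint paths, so the base URL and record fields below are placeholders. A minimal client sketch with the standard library:

```python
import json
from urllib.request import urlopen

# Placeholder base URL -- the real DoSSiER endpoints are not given in this
# record, so treat this as an assumed, illustrative API shape.
DOSSIER_API = "https://example.org/dossier/api"

def fetch_records(path, query=""):
    """Fetch validation records from the (assumed) web service as parsed JSON."""
    with urlopen(f"{DOSSIER_API}/{path}?{query}") as resp:
        return json.loads(resp.read().decode("utf-8"))

def summarize_by_tool(records):
    """Count returned validation records per simulation toolkit."""
    counts = {}
    for rec in records:
        tool = rec.get("tool", "unknown")   # "tool" is a hypothetical field
        counts[tool] = counts.get(tool, 0) + 1
    return counts
```

The summarizer works on any list of dicts with a `tool` key, so it can be exercised without network access.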

  18. Integrated web-based viewing and secure remote access to a clinical data repository and diverse clinical systems.

    PubMed

    Duncan, R G; Saperia, D; Dulbandzhyan, R; Shabot, M M; Polaschek, J X; Jones, D T

    2001-01-01

    The advent of the World-Wide-Web protocols and client-server technology has made it easy to build low-cost, user-friendly, platform-independent graphical user interfaces to health information systems and to integrate the presentation of data from multiple systems. The authors describe a Web interface for a clinical data repository (CDR) that was moved from concept to production status in less than six months using a rapid prototyping approach, multi-disciplinary development team, and off-the-shelf hardware and software. The system has since been expanded to provide an integrated display of clinical data from nearly 20 disparate information systems.

  19. E-facts: business process management in clinical data repositories.

    PubMed

    Wattanasin, Nich; Peng, Zhaoping; Raine, Christine; Mitchell, Mariah; Wang, Charles; Murphy, Shawn N

    2008-11-06

The Partners Healthcare Research Patient Data Registry (RPDR) is a centralized data repository that gathers clinical data from various hospital systems. The RPDR allows clinical investigators to obtain aggregate numbers of patients with user-defined characteristics such as diagnoses, procedures, medications, and laboratory values. They may then obtain patient identifiers and electronic medical records with prior IRB approval. Moreover, accurately identifying and efficiently populating worthwhile, quantifiable facts from doctors' reports into the RPDR is a significant process. As part of our ongoing e-Fact project, this work describes a new business process management technology that helps coordinate and simplify this procedure.
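The aggregate cohort queries described above can be sketched as a simple filter-and-count over patient records. The patient data, diagnosis codes, and field names below are invented for illustration and are not the RPDR schema.

```python
# Hypothetical patient records; real RPDR queries run against an i2b2-style
# clinical data warehouse, not an in-memory list.
patients = [
    {"id": 1, "diagnoses": {"E11"}, "medications": {"metformin"}, "a1c": 8.1},
    {"id": 2, "diagnoses": {"I10"}, "medications": {"lisinopril"}, "a1c": 5.4},
    {"id": 3, "diagnoses": {"E11", "I10"}, "medications": {"insulin"}, "a1c": 9.0},
]

def cohort_count(patients, diagnosis=None, min_a1c=None):
    """Return the aggregate number of patients matching the given criteria,
    without exposing patient identifiers."""
    n = 0
    for p in patients:
        if diagnosis is not None and diagnosis not in p["diagnoses"]:
            continue
        if min_a1c is not None and p["a1c"] < min_a1c:
            continue
        n += 1
    return n
```

Returning only counts mirrors the RPDR workflow, where identifiers are released separately and only with prior IRB approval.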

  20. DoSSiER: Database of scientific simulation and experimental results

    DOE PAGES

    Wenzel, Hans; Yarba, Julia; Genser, Krzystof; ...

    2016-08-01

The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

1. Share Repository Framework: Component Specification and Ontology

    DTIC Science & Technology

    2008-04-23

    Palantir Technologies has created one such software application to support the DoD intelligence community by providing robust capabilities for...managing data from various sources. The Palantir tool is based on user-defined ontologies and supports multiple representation and analysis tools

  2. 10 CFR 960.4-2-3 - Rock characteristics.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... thermal, chemical, mechanical, and radiation stresses expected to be induced by repository construction... engineering measures beyond reasonably available technology for the construction, operation, and closure of..., brine migration, or other physical, chemical, or radiation-related phenomena that could be expected to...

  3. 10 CFR 960.4-2-3 - Rock characteristics.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... thermal, chemical, mechanical, and radiation stresses expected to be induced by repository construction... engineering measures beyond reasonably available technology for the construction, operation, and closure of..., brine migration, or other physical, chemical, or radiation-related phenomena that could be expected to...

  4. Connecting the pieces: Using ORCIDs to improve research impact and repositories.

    PubMed

    Baessa, Mohamed; Lery, Thibaut; Grenz, Daryl; Vijayakumar, J K

    2015-01-01

Quantitative data are crucial in the assessment of research impact in the academic world. However, as a young university created in 2009, King Abdullah University of Science and Technology (KAUST) needs to aggregate bibliometrics from researchers coming from diverse origins, not necessarily with the proper affiliations. In this context, the University launched an institutional repository in September 2012 with the objective of creating a home for the intellectual outputs of KAUST researchers. Later, the university adopted the first mandated institutional open access policy in the Arab region, effective June 2014. Several projects were then initiated in order to accurately identify the research being done by KAUST authors and bring it into the repository in accordance with the open access policy. Integration with ORCID has been a key element in this process and the best way to ensure data quality for researchers' scientific contributions. It included the systematic inclusion and creation, if necessary, of ORCID identifiers in the existing repository system, an institutional membership in ORCID, and the creation of dedicated integration tools. In addition and in cooperation with the Office of Research Evaluation, the Library worked to implement a Current Research Information System (CRIS) as a standardized common resource to monitor KAUST research outputs. We will present our findings about the CRIS implementation, the ORCID API, the repository statistics as well as our approach in conducting the assessment of research impact in terms of usage by the global research community.
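One concrete, verifiable piece of the ORCID integration plumbing described above is the identifier's check digit (ISO 7064 mod 11-2), which repository tools can use to validate an ORCID iD before storing it. A minimal sketch:

```python
def orcid_checksum(base_digits: str) -> str:
    """ISO 7064 mod 11-2 check character for the first 15 ORCID digits."""
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate a hyphenated 16-character ORCID iD against its check digit."""
    digits = orcid.replace("-", "")
    return len(digits) == 16 and orcid_checksum(digits[:15]) == digits[-1]
```

For example, `is_valid_orcid("0000-0002-1825-0097")` is `True`, while an iD with a single mistyped digit fails the check.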

  5. Establishment and operation of a biorepository for molecular epidemiologic studies in Costa Rica.

    PubMed

    Cortés, Bernal; Schiffman, Mark; Herrero, Rolando; Hildesheim, Allan; Jiménez, Silvia; Shea, Katheryn; González, Paula; Porras, Carolina; Fallas, Greivin; Rodríguez, Ana Cecilia

    2010-04-01

    The Proyecto Epidemiológico Guanacaste (PEG) has conducted several large studies related to human papillomavirus (HPV) and cervical cancer in Guanacaste, Costa Rica in a long-standing collaboration with the U.S. National Cancer Institute. To improve molecular epidemiology efforts and save costs, we have gradually transferred technology to Costa Rica, culminating in state-of-the-art laboratories and a biorepository to support a phase III clinical trial investigating the efficacy of HPV 16/18 vaccine. Here, we describe the rationale and lessons learned in transferring molecular epidemiologic and biorepository technology to a developing country. At the outset of the PEG in the early 1990s, we shipped all specimens to repositories and laboratories in the United States, which created multiple problems. Since then, by intensive personal interactions between experts from the United States and Costa Rica, we have successfully transferred liquid-based cytology, HPV DNA testing and serology, chlamydia and gonorrhea testing, PCR-safe tissue processing, and viable cryopreservation. To accommodate the vaccine trial, a state-of-the-art repository opened in mid-2004. Approximately 15,000 to 50,000 samples are housed in the repository on any given day, and >500,000 specimens have been shipped, many using a custom-made dry shipper that permits exporting >20,000 specimens at a time. Quality control of shipments received by the NCI biorepository has revealed an error rate of <0.2%. Recently, the PEG repository has incorporated other activities; for example, large-scale aliquotting and long-term, cost-efficient storage of frozen specimens returned from the United States. Using Internet-based specimen tracking software has proven to be efficient even across borders. For long-standing collaborations, it makes sense to transfer the molecular epidemiology expertise toward the source of specimens. 
The successes of the PEG molecular epidemiology laboratories and biorepository prove that the physical and informatics infrastructures of a modern biorepository can be transferred to a resource-limited and weather-challenged region. Technology transfer is an important and feasible goal of international collaborations.

  6. Geological repository for nuclear high level waste in France from feasibility to design within a legal framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voizard, Patrice; Mayer, Stefan; Ouzounian, Gerald

Over the past 15 years, the French program on deep geologic disposal of high level and long-lived radioactive waste has benefited from a clear legal framework as the result of the December 30, 1991 French Waste Act. To fulfil its obligations stipulated in this law, ANDRA has submitted the 'Dossier 2005 Argile' (clay) and 'Dossier 2005 Granite' to the French Government. The first of those reports presents a concept for the underground disposal of nuclear waste at a specific clay site and focuses on a feasibility study. Knowledge of the host rock characteristics is based on the investigations carried out at the Meuse/Haute Marne Underground Research Laboratory. The repository concept addresses various issues, the most important of which relates to the large amount of waste, the clay host rock and the reversibility requirement. This phase has ended upon review and evaluation of the 'Dossier 2005' made by different organisations including the National Review Board, the National Safety Authority and the NEA International Review Team. By passing the new June 28, 2006 Planning Act on the sustainable management of radioactive materials and waste, the French parliament has further defined a clear legal framework for future work. This June 28 Planning Act thus sets a schedule and defines the objectives for the next phase of repository design in requesting the submission of a construction authorization application by 2015. The law calls for the repository program to be in a position to commission disposal installations by 2025. (authors)

  7. Unified Database Development Program. Final Report.

    ERIC Educational Resources Information Center

    Thomas, Everett L., Jr.; Deem, Robert N.

    The objective of the unified database (UDB) program was to develop an automated information system that would be useful in the design, development, testing, and support of new Air Force aircraft weapon systems. Primary emphasis was on the development of: (1) a historical logistics data repository system to provide convenient and timely access to…

  8. Simulator sickness research program at NASA-Ames Research Center

    NASA Technical Reports Server (NTRS)

    Mccauley, Michael E.; Cook, Anthony M.

    1987-01-01

    The simulator sickness syndrome is receiving increased attention in the simulation community. NASA-Ames Research Center has initiated a program to facilitate the exchange of information on this topic among the tri-services and other interested government organizations. The program objectives are to identify priority research issues, promote efficient research strategies, serve as a repository of information, and disseminate information to simulator users.

  9. An On-line Technology Information System (OTIS) for Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Levri, Julie A.; Boulanger, Richard; Hogan, John A.; Rodriquez, Luis

    2003-01-01

OTIS is an on-line communication platform designed for smooth flow of technology information between advanced life support (ALS) technology developers, researchers, system analysts, and managers. With pathways for efficient transfer of information, several improvements in the ALS Program will result. With OTIS, it will be possible to provide programmatic information for technology developers and researchers, technical information for analysts, and managerial decision support. OTIS is a platform that enables the effective research, development, and delivery of complex systems for life support. An electronic data collection form has been developed for the solid waste element, drafted by the Solid Waste Working Group. Forms for other elements (air revitalization, water recovery, food processing, biomass production and thermal control) will also be developed, based on lessons learned from the development of the solid waste form. All forms will be developed in consultation with other working groups, composed of experts in the area of interest. Forms will be converted to an on-line data collection interface that technology developers will use to transfer information into OTIS. Funded technology developers will log in to OTIS annually to complete the element-specific forms for their technology. The type and amount of information requested expand as the technology readiness level (TRL) increases. The completed forms will feed into a regularly updated and maintained database that will store technology information and allow for database searching. To ensure confidentiality of proprietary information, security permissions will be customized for each user. Principal investigators of a project will be able to designate certain data as proprietary, and only technical monitors of a task, ALS Management, and the principal investigator will have the ability to view this information. 
The typical OTIS user will be able to read all non-proprietary information about all projects. Interaction with the database will occur over encrypted connections, and data will be stored on the server in an encrypted form. Implementation of OTIS will initiate a community-accessible repository of technology development information. With OTIS, ALS element leads and managers will be able to carry out informed technology selection for programmatic decisions. OTIS will also allow analysts to make accurate evaluations of technology options. Additionally, the range and specificity of information solicited will help educate technology developers about program needs. With augmentation, OTIS reporting is capable of replacing the current fiscal year-end reporting process. Overall, the system will enable more informed R&TD decisions and more rapid attainment of ALS Program goals.
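The proprietary-data rules described in this record can be sketched as a simple per-role visibility filter. The role names, record fields, and values below are assumptions for illustration, not the actual OTIS schema.

```python
# Hypothetical sketch of OTIS-style visibility rules: proprietary fields are
# readable only by the PI, the task's technical monitor, and ALS management.
PROPRIETARY_READERS = {"principal_investigator", "technical_monitor", "als_management"}

def visible_fields(record, role):
    """Return the fields of a technology record that a given role may read."""
    if role in PROPRIETARY_READERS:
        return dict(record["fields"])
    return {k: v for k, v in record["fields"].items()
            if k not in record["proprietary"]}

# Illustrative record for a solid-waste technology entry (invented values);
# the PI has marked the cost figure as proprietary.
record = {"fields": {"mass_kg": 12.5, "cost_usd": 40000},
          "proprietary": {"cost_usd"}}
```

A typical user would then see only the non-proprietary fields, while the PI and technical monitor see everything, matching the access model the abstract describes.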

  10. Development and implementation of an integrated EHR for Homecare Service: a South American experience.

    PubMed

    Aguilera Díaz, Jerónimo; Arias, Antonio Eduardo; Budalich, Cintia Mabel; Benítez, Sonia Elizabeth; López, Gastón; Borbolla, Damián; Plazzotta, Fernando; Luna, Daniel; de Quirós, Fernán González Bernaldo

    2010-01-01

This paper describes the development and implementation of a web-based electronic health record for the Homecare Service program in the Hospital Italiano de Buenos Aires. It reviews the process of integrating the new electronic health record into the hospital information system, allowing physicians to access the clinical data repository from their PCs at home, with the capability of consulting the patient's past and present health care history, orders, tests, and referrals with other professionals through the new Electronic Health Record. We also discuss how workflow processes were changed and improved for the physicians, nurses, and administrative personnel of the Homecare Services, and the educational methods used to improve acceptance and adoption of these new technologies. We also briefly describe the validation of physicians and their field work with electronic signatures.

  11. Rolling Deck to Repository (R2R): Standards and Semantics for Open Access to Research Data

    NASA Astrophysics Data System (ADS)

    Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen

    2015-04-01

    In recent years, a growing number of funding agencies and professional societies have issued policies calling for open access to research data. The Rolling Deck to Repository (R2R) program is working to ensure open access to the environmental sensor data routinely acquired by the U.S. academic research fleet. Currently 25 vessels deliver 7 terabytes of data to R2R each year, acquired from a suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. R2R is working to ensure these data are preserved in trusted repositories, discoverable via standard protocols, and adequately documented for reuse. R2R maintains a master catalog of cruises for the U.S. academic research fleet, currently holding essential documentation for over 3,800 expeditions including vessel and cruise identifiers, start/end dates and ports, project titles and funding awards, science parties, dataset inventories with instrument types and file formats, data quality assessments, and links to related content at other repositories. A Digital Object Identifier (DOI) is published for 1) each cruise, 2) each original field sensor dataset, 3) each post-field data product such as quality-controlled shiptrack navigation produced by the R2R program, and 4) each document such as a cruise report submitted by the science party. Scientists are linked to personal identifiers, such as the Open Researcher and Contributor ID (ORCID), where known. Using standard global identifiers such as DOIs and ORCIDs facilitates linking with journal publications and generation of citation metrics. Since its inception, the R2R program has worked in close collaboration with other data repositories in the development of shared semantics for oceanographic research. 
The R2R cruise catalog uses community-standard terms and definitions hosted by the NERC Vocabulary Server, and publishes ISO metadata records for each cruise that use community-standard profiles developed with the NOAA Data Centers and the EU SeaDataNet project. R2R is a partner in the Ocean Data Interoperability Platform (ODIP), working to strengthen links among regional and national data systems, as well as a lead partner in the EarthCube "GeoLink" project, developing a standard set of ontology design patterns for publishing research data using Semantic Web protocols.

  12. The FaceBase Consortium: A comprehensive program to facilitate craniofacial research

    PubMed Central

    Hochheiser, Harry; Aronow, Bruce J.; Artinger, Kristin; Beaty, Terri H.; Brinkley, James F.; Chai, Yang; Clouthier, David; Cunningham, Michael L.; Dixon, Michael; Donahue, Leah Rae; Fraser, Scott E.; Hallgrimsson, Benedikt; Iwata, Junichi; Klein, Ophir; Marazita, Mary L.; Murray, Jeffrey C.; Murray, Stephen; de Villena, Fernando Pardo-Manuel; Postlethwait, John; Potter, Steven; Shapiro, Linda; Spritz, Richard; Visel, Axel; Weinberg, Seth M.; Trainor, Paul A.

    2012-01-01

The FaceBase Consortium consists of ten interlinked research and technology projects whose goal is to generate craniofacial research data and technology for use by the research community through a central data management and integrated bioinformatics hub. Funded by the National Institute of Dental and Craniofacial Research (NIDCR) and currently focused on studying the development of the middle region of the face, the Consortium will produce comprehensive datasets of global gene expression patterns, regulatory elements and sequencing; will generate anatomical and molecular atlases; will provide human normative facial data and other phenotypes; will conduct follow-up studies of a completed genome-wide association study; will generate independent data on the genetics of craniofacial development; will build repositories of animal models and of human samples and data for community access and analysis; and will develop software tools and animal models for analyzing and functionally testing and integrating these data. The FaceBase website (http://www.facebase.org) will serve as a web home for these efforts, providing interactive tools for exploring these datasets, together with discussion forums and other services to support and foster collaboration within the craniofacial research community. PMID:21458441

  13. Role of geophysics in identifying and characterizing sites for high-level nuclear waste repositories.

    USGS Publications Warehouse

    Wynn, J.C.; Roseboom, E.H.

    1987-01-01

Evaluation of potential high-level nuclear waste repository sites is an area where geophysical capabilities and limitations may significantly impact a major governmental program. Since there is concern that extensive exploratory drilling might degrade most potential disposal sites, geophysical methods become crucial as the only nondestructive means to examine large volumes of rock in three dimensions. Characterization of potential sites requires geophysicists to alter their usual mode of thinking: no longer are anomalies being sought, as in mineral exploration, but rather their absence. Thus the size of features that might go undetected by a particular method takes on new significance. Legal and regulatory considerations that stem from this different outlook, most notably the requirements of quality assurance (necessary for any data used in support of a repository license application), are forcing changes in the manner in which geophysicists collect and document their data. -Authors

  14. Principles for Integrating Mars Analog Science, Operations, and Technology Research

    NASA Technical Reports Server (NTRS)

    Clancey, William J.

    2003-01-01

During the Apollo program, the scientific community and NASA used terrestrial analog sites for understanding planetary features and for training astronauts to be scientists. Human factors studies (Harrison, Clearwater, & McKay 1991; Stuster 1996) have focused on the effects of isolation in extreme environments. More recently, with the advent of wireless computing, we have prototyped advanced EVA technologies for navigation, scheduling, and science data logging (Clancey 2002b; Clancey et al., in press). Combining these interests in a single expedition enables tremendous synergy and authenticity, as pioneered by Pascal Lee's Haughton-Mars Project (Lee 2001; Clancey 2000a) and the Mars Society's research stations on a crater rim on Devon Island in the High Canadian Arctic (Clancey 2000b; 2001b) and the Morrison Formation of southeast Utah (Clancey 2002a). Based on this experience, the following principles are proposed for conducting an integrated science, operations, and technology research program at analog sites: 1) Authentic work; 2) PI-based projects; 3) Unencumbered baseline studies; 4) Closed simulations; and 5) Observation and documentation. Following these principles, we have been integrating field science, operations research, and technology development at analog sites on Devon Island and in Utah over the past five years. Analytic methods include work practice simulation (Clancey 2002c; Sierhuis et al., 2000a;b), by which the interaction of human behavior, facilities, geography, tools, and procedures is formalized in computer models. These models are then converted into the runtime EVA system we call mobile agents (Clancey 2002b; Clancey et al., in press). Furthermore, we have found that the Apollo Lunar Surface Journal (Jones, 1999) provides a vast repository for understanding astronaut and CapCom interactions, serving as a baseline for Mars operations and quickly highlighting opportunities for computer automation (Clancey, in press).

  15. ArrayWiki: an enabling technology for sharing public microarray data repositories and meta-analyses

    PubMed Central

    Stokes, Todd H; Torrance, JT; Li, Henry; Wang, May D

    2008-01-01

Background A survey of microarray databases reveals that most of the repository contents and data models are heterogeneous (i.e., data obtained from different chip manufacturers), and that the repositories provide only basic biological keywords linking to PubMed. As a result, it is difficult to find datasets using research context or analysis parameters information beyond a few keywords. For example, to reduce the "curse-of-dimension" problem in microarray analysis, the number of samples is often increased by merging array data from different datasets. Knowing chip data parameters such as pre-processing steps (e.g., normalization, artefact removal, etc.), and knowing any previous biological validation of the dataset are essential due to the heterogeneity of the data. However, most of the microarray repositories do not have meta-data information in the first place, and do not have a mechanism to add or insert this information. Thus, there is a critical need to create "intelligent" microarray repositories that (1) enable update of meta-data with the raw array data, and (2) provide standardized archiving protocols to minimize bias from the raw data sources. Results To address the problems discussed, we have developed a community maintained system called ArrayWiki that unites disparate meta-data of microarray meta-experiments from multiple primary sources with four key features. First, ArrayWiki provides a user-friendly knowledge management interface in addition to a programmable interface using standards developed by Wikipedia. Second, ArrayWiki includes automated quality control processes (caCORRECT) and novel visualization methods (BioPNG, Gel Plots), which provide extra information about data quality unavailable in other microarray repositories. Third, it provides a user-curation capability through the familiar Wiki interface. 
Fourth, ArrayWiki provides users with simple text-based searches across all experiment meta-data, and exposes data to search engine crawlers (Semantic Agents) such as Google to further enhance data discovery. Conclusions: Microarray data and meta information in ArrayWiki are distributed and visualized using a novel and compact data storage format, BioPNG. Also, they are open to the research community for curation, modification, and contribution. By making a small investment of time to learn the syntax and structure common to all sites running MediaWiki software, domain scientists and practitioners can all contribute to making better use of microarray technologies in research and medical practice. ArrayWiki is available at . PMID:18541053
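
    The simple text-based search over experiment meta-data described above can be illustrated with a minimal inverted index. This is a sketch of the general technique only; the record identifiers and fields below are invented for illustration and are not ArrayWiki's actual schema.

```python
# Minimal inverted-index sketch of text search over experiment metadata.
# Record ids and fields ("title", "normalization") are hypothetical examples.
from collections import defaultdict

def build_index(records):
    """Map each lowercase token to the set of record ids containing it."""
    index = defaultdict(set)
    for rec_id, rec in records.items():
        for field_value in rec.values():
            for token in str(field_value).lower().split():
                index[token].add(rec_id)
    return index

def search(index, query):
    """Return record ids matching every token in the query (AND semantics)."""
    tokens = query.lower().split()
    if not tokens:
        return set()
    result = index.get(tokens[0], set()).copy()
    for token in tokens[1:]:
        result &= index.get(token, set())
    return result

records = {
    "E-1": {"title": "breast cancer expression", "normalization": "quantile"},
    "E-2": {"title": "lung cancer expression", "normalization": "RMA"},
}
index = build_index(records)
print(search(index, "cancer quantile"))  # {'E-1'}
```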

  16. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  17. Effects of Heat Generation on Nuclear Waste Disposal in Salt

    NASA Astrophysics Data System (ADS)

    Clayton, D. J.

    2008-12-01

    Disposal of nuclear waste in salt is an established technology, as evidenced by the successful operations of the Waste Isolation Pilot Plant (WIPP) since 1999. The WIPP is located in bedded salt in southeastern New Mexico and is a deep underground facility for transuranic (TRU) nuclear waste disposal. There are many advantages for placing radioactive wastes in a geologic bedded-salt environment. One desirable mechanical characteristic of salt is that it flows plastically with time ("creeps"). The rate of salt creep is a strong function of temperature and stress differences. Higher temperatures and deviatoric stresses increase the creep rate. As the salt creeps, induced fractures may be closed and eventually healed, which then effectively seals the waste in place. With a backfill of crushed salt emplaced around the waste, the salt creep can cause the crushed salt to reconsolidate and heal to a state similar to intact salt, serving as an efficient seal. Experiments in the WIPP were conducted to investigate the effects of heat generation on the important phenomena and processes in and around the repository (Munson et al. 1987; 1990; 1992a; 1992b). Brine migration towards the heaters was induced by the thermal gradient, while salt creep rates showed an exponential dependence on temperature. The project "Backfill and Material Behavior in Underground Salt Repositories, Phase II" (BAMBUS II) studied the crushed salt backfill and material behavior with heat generation at the Asse mine located near Remlingen, Germany (Bechthold et al. 2004). Increased salt creep rates and significant reconsolidation of the crushed salt were observed at the termination of the experiment. Using the data provided from both projects, exploratory modeling of the thermal-mechanical response of salt has been conducted with varying thermal loading and waste spacing. 
Increased thermal loading and decreased waste spacing drive the system to higher temperatures, while both factors are desired to reduce costs, as well as decrease the overall footprint of the repository. Higher temperatures increase the rate of salt creep, which then seals the waste more quickly. Data on the thermal-mechanical response of salt at these higher temperatures are needed to further validate the exploratory modeling and provide meaningful constraints on the repository design. Sandia is a multi-program laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
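
    The exponential temperature dependence of salt creep noted in this record can be illustrated with a generic steady-state power-law creep relation of Arrhenius form. This is a textbook sketch with invented parameter values, not the calibrated WIPP or BAMBUS II constitutive model.

```python
# Generic steady-state creep law: strain_rate = A * sigma**n * exp(-Q/(R*T)).
# All parameter values are illustrative only, not calibrated salt constants.
import math

R = 8.314     # gas constant, J/(mol K)
A = 1.0e-6    # pre-exponential factor, 1/(s MPa^n) (assumed)
n = 5.0       # stress exponent (assumed)
Q = 50_000.0  # activation energy, J/mol (assumed)

def creep_rate(sigma_mpa, temp_k):
    """Steady-state creep rate (1/s) at deviatoric stress sigma_mpa (MPa), temp_k (K)."""
    return A * sigma_mpa**n * math.exp(-Q / (R * temp_k))

# Raising temperature from 300 K to 400 K at fixed stress sharply increases the rate.
r300 = creep_rate(10.0, 300.0)
r400 = creep_rate(10.0, 400.0)
print(f"rate ratio 400K/300K: {r400 / r300:.1f}")
```

The ratio equals exp(Q/R * (1/300 - 1/400)), which is why even modest heating accelerates reconsolidation by orders of magnitude.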

  18. Repository Drift Backfilling Demonstrator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Londe, I.; Dubois, J.Ph.; Bauer, C.

    2008-07-01

    The 'Backfilling Demonstrator' is one of the technological demonstrators developed by ANDRA in the framework of the feasibility studies for a geological repository for high-level long-lived (HL-LL) waste within a clay formation. The demonstrator concerns the standard and supporting backfills as defined in Andra's 2005 design. The standard backfill is intended to fill up almost all drifts of the underground repository in order to limit any deformation of the rock after the degradation of the drift lining. The supporting backfill only concerns a small portion of the volume to be backfilled, in order to counter the swelling pressure of the swelling clay contained in the sealing structures. The first objective of the demonstrator was to show the possibility of manufacturing a satisfactory backfill, in spite of the exiguity of the underground structures, and of reusing as much as possible the argillite muck. For the purpose of this experiment, the argillite muck was collected on Andra's work-site for the implementation of an underground research laboratory. Still ongoing, the second objective is to follow up the long-term evolution of the backfill. Approximately 200 m³ of compacted backfill material have been gathered in a large concrete tube simulating a repository drift. The standard backfill was manufactured exclusively with argillite. The supporting backfill was made by forming a mixture of argillite and sand. Operations were carried out mostly at Richwiller, close to Mulhouse, France. The objectives of the demonstrator were met: an application method was tested and proven satisfactory. The resulting dry densities are relatively high, although the moduli of deformation do not always reach the set goal. The selected objective for the demonstrator was a dry density corresponding to a relatively high compaction level (95% of the standard Proctor optimum [SPO]), for both pure argillite and the argillite-sand mixture. 
The plate-percussion compaction technique was used and proved satisfactory. The measured dry densities are higher than the 95%-SPO objective. The implementation rates remain very low due to the experimental conditions involved. The metal supply mode would need to be revised before any industrial application is contemplated. The Demonstrator Program started in August 2004 and is followed up today over the long term. With that objective in mind, sensors and a water-saturation system have been installed. (author)

  19. National Database for Autism Research (NDAR): Big Data Opportunities for Health Services Research and Health Technology Assessment.

    PubMed

    Payakachat, Nalin; Tilford, J Mick; Ungar, Wendy J

    2016-02-01

    The National Database for Autism Research (NDAR) is a US National Institutes of Health (NIH)-funded research data repository created by integrating heterogeneous datasets through data sharing agreements between autism researchers and the NIH. To date, NDAR is considered the largest neuroscience and genomic data repository for autism research. In addition to biomedical data, NDAR contains a large collection of clinical and behavioral assessments and health outcomes from novel interventions. Importantly, NDAR has a global unique patient identifier that can be linked to aggregated individual-level data for hypothesis generation and testing, and for replicating research findings. As such, NDAR promotes collaboration and maximizes public investment in the original data collection. As screening and diagnostic technologies as well as interventions for children with autism are expensive, health services research (HSR) and health technology assessment (HTA) are needed to generate more evidence to facilitate implementation when warranted. This article describes NDAR and explains its value to health services researchers and decision scientists interested in autism and other mental health conditions. We provide a description of the scope and structure of NDAR and illustrate how data are likely to grow over time and become available for HSR and HTA.
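
    The role of NDAR's global unique identifier can be sketched as a join of individual-level rows from separate collections on a shared GUID. The field names and values below are invented for illustration and are not NDAR's actual schema.

```python
# Sketch of GUID-based linkage across heterogeneous datasets.
# GUIDs, field names, and values are invented examples, not NDAR's schema.
from collections import defaultdict

clinical = [
    {"guid": "NDAR-001", "ados_score": 12},
    {"guid": "NDAR-002", "ados_score": 7},
]
imaging = [
    {"guid": "NDAR-001", "scan_type": "MRI"},
]

def link_by_guid(*datasets):
    """Aggregate rows from every dataset under their shared GUID."""
    linked = defaultdict(dict)
    for dataset in datasets:
        for row in dataset:
            linked[row["guid"]].update(row)
    return dict(linked)

merged = link_by_guid(clinical, imaging)
print(merged["NDAR-001"])  # {'guid': 'NDAR-001', 'ados_score': 12, 'scan_type': 'MRI'}
```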

  20. Construction of a technique plan repository and evaluation system based on AHP group decision-making for emergency treatment and disposal in chemical pollution accidents.

    PubMed

    Shi, Shenggang; Cao, Jingcan; Feng, Li; Liang, Wenyan; Zhang, Liqiu

    2014-07-15

    The environmental pollution resulting from chemical accidents has caused increasingly serious concern. Therefore, it is very important to be able to determine in advance the appropriate emergency treatment and disposal technology for different types of chemical accidents. However, the formulation of an emergency plan for chemical pollution accidents is considerably difficult due to the substantial uncertainty and complexity of such accidents. This paper explains how the event tree method was used to create 54 different scenarios for chemical pollution accidents, based on the polluted medium and the dangerous characteristics and properties of the chemicals involved. For each type of chemical accident, feasible emergency treatment and disposal technology schemes were established, considering the areas of pollution-source control, pollutant non-proliferation, contaminant elimination and waste disposal. Meanwhile, in order to obtain the optimum emergency disposal technology schemes from the plan repository as soon as a chemical pollution accident occurs, a technique evaluation index system was developed based on a group-decision-improved analytic hierarchy process (AHP), and it has been tested using a sudden aniline pollution accident that occurred in a river in December 2012. Copyright © 2014 Elsevier B.V. All rights reserved.
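
    The AHP step underlying such an evaluation index system derives priority weights from a pairwise-comparison matrix via its principal eigenvector, then checks a consistency ratio. Below is a minimal sketch with an invented 3x3 comparison matrix, not the paper's actual criteria or index values.

```python
# Minimal AHP sketch: priority weights from a pairwise-comparison matrix
# via the principal eigenvector, plus Saaty's consistency ratio (CR).
# The comparison values are invented for illustration.
import numpy as np

# A[i][j] = how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # index of the principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()               # normalized priority vector

n = A.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)        # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # random index for matrix size n
cr = ci / ri                           # CR < 0.1 is conventionally acceptable

print(weights, round(cr, 3))
```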

  1. Investigating the Thermal Limit of Clay Minerals for Applications in Nuclear Waste Repository Design

    NASA Astrophysics Data System (ADS)

    Matteo, E. N.; Miller, A. W.; Kruichak, J.; Mills, M.; Tellez, H.; Wang, Y.

    2013-12-01

    Clay minerals are likely candidates to aid in nuclear waste isolation due to their low permeability, favorable swelling properties, and high cation sorption capacities. Establishing the thermal limit for clay minerals in a nuclear waste repository is a potentially important component of repository design, as flexibility in the heat load can have a major impact on design selection. For example, the thermal limit plays a critical role in the time that waste packages would need to cool before being transferred to the repository. Understanding the chemical and physical changes that occur in clay minerals at various temperatures above the current thermal limit (of 100 °C) can provide decision-makers with information critical to evaluating the potential trade-offs of increasing the thermal limit within the repository. Most critical is gaining understanding of how varying thermal conditions in the repository will impact radionuclide sorption and transport in clay materials, either as engineered barriers or as disposal media. A variety of clays (illite, mixed-layer illite/smectite, montmorillonite, and palygorskite) were heated at temperatures between 100 and 500 °C. These samples were characterized by a variety of methods, including nitrogen adsorption, x-ray diffraction, thermogravimetric analysis, barium chloride exchange for cation exchange capacity (CEC), and iodide sorption. The nitrogen porosimetry shows that, for all the clays, thermally-induced changes in BET surface area are dominated by collapse/creation of the microporosity, i.e. pore diameters < 17 angstroms. Changes in microporosity (relative to no heat treatment) are most significant for heat treatments of 300 °C and above. Alterations are also seen in the chemical properties (CEC, XRD, iodide sorption) of the clays and, like the pore-size distribution changes, are most significant above 300 °C. 
Overall, the results imply that the changes seen in pore-size distribution correlate with cation exchange capacity and cation exchange processes. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND Number: 2013-6352A.
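
    The nitrogen-adsorption surface areas mentioned in this record come from fits of the linearized BET equation over relative pressures of roughly 0.05-0.30. A sketch with a synthetic isotherm follows; the monolayer capacity and BET constant used to generate the data are invented, not measured values from the study.

```python
# Sketch of BET analysis: fit the linearized BET equation and convert the
# monolayer capacity to a specific surface area. Synthetic, illustrative data.
import numpy as np

N_A = 6.022e23        # Avogadro's number, 1/mol
SIGMA_N2 = 0.162e-18  # cross-sectional area of an adsorbed N2 molecule, m^2
V_MOLAR = 22414.0     # molar volume of gas at STP, cm^3/mol

def bet_surface_area(p_rel, v_ads):
    """Fit the linear BET form over p/p0 and return surface area in m^2/g."""
    y = p_rel / (v_ads * (1.0 - p_rel))   # linearized BET ordinate
    slope, intercept = np.polyfit(p_rel, y, 1)
    v_m = 1.0 / (slope + intercept)       # monolayer capacity, cm^3(STP)/g
    return v_m * N_A * SIGMA_N2 / V_MOLAR

# Synthetic isotherm generated from the BET equation with
# v_m = 50 cm^3/g and c = 100 (invented values).
v_m_true, c = 50.0, 100.0
p_rel = np.linspace(0.05, 0.30, 10)
y = (c - 1.0) / (v_m_true * c) * p_rel + 1.0 / (v_m_true * c)
v_ads = p_rel / (y * (1.0 - p_rel))

print(f"{bet_surface_area(p_rel, v_ads):.0f} m^2/g")
```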

  2. HEPData: a repository for high energy physics data

    NASA Astrophysics Data System (ADS)

    Maguire, Eamonn; Heinrich, Lukas; Watt, Graeme

    2017-10-01

    The Durham High Energy Physics Database (HEPData) has been built up over the past four decades as a unique open-access repository for scattering data from experimental particle physics papers. It comprises data points underlying several thousand publications. Over the last two years, the HEPData software has been completely rewritten using modern computing technologies as an overlay on the Invenio v3 digital library framework. The software is open source with the new site available at https://hepdata.net now replacing the previous site at http://hepdata.cedar.ac.uk. In this write-up, we describe the development of the new site and explain some of the advantages it offers over the previous platform.

  3. Integrating diverse databases into a unified analysis framework: a Galaxy approach

    PubMed Central

    Blankenberg, Daniel; Coraor, Nathan; Von Kuster, Gregory; Taylor, James; Nekrutenko, Anton

    2011-01-01

    Recent technological advances have led to the ability to generate large amounts of data for model and non-model organisms. Whereas in the past a relatively small number of central repositories served genomic data, an increasing number of distinct specialized data repositories and resources have been established. Here, we describe a generic approach that provides for the integration of a diverse spectrum of data resources into a unified analysis framework, Galaxy (http://usegalaxy.org). This approach allows the simplified coupling of external data resources with the data analysis tools available to Galaxy users, while leveraging the native data mining facilities of the external data resources. Database URL: http://usegalaxy.org PMID:21531983

  4. Extreme ground motions and Yucca Mountain

    USGS Publications Warehouse

    Hanks, Thomas C.; Abrahamson, Norman A.; Baker, Jack W.; Boore, David M.; Board, Mark; Brune, James N.; Cornell, C. Allin; Whitney, John W.

    2013-01-01

    Yucca Mountain is the designated site of the underground repository for the United States' high-level radioactive waste (HLW), consisting of commercial and military spent nuclear fuel, HLW derived from reprocessing of uranium and plutonium, surplus plutonium, and other nuclear-weapons materials. Yucca Mountain straddles the western boundary of the Nevada Test Site, where the United States has tested nuclear devices since the 1950s, and is situated in an arid, remote, and thinly populated region of Nevada, ~100 miles northwest of Las Vegas. Yucca Mountain was originally considered as a potential underground repository of HLW because of its thick units of unsaturated rocks, with the repository horizon being not only ~300 m above the water table but also ~300 m below the Yucca Mountain crest. The fundamental rationale for a geologic (underground) repository for HLW is to securely isolate these materials from the environment and its inhabitants to the greatest extent possible and for very long periods of time. Given the present climate conditions and what is known about the current hydrologic system and conditions around and in the mountain itself, one would anticipate that the rates of infiltration, corrosion, and transport would be very low—except for the possibility that repository integrity might be compromised by low-probability disruptive events, which include earthquakes, strong ground motion, and (or) a repository-piercing volcanic intrusion/eruption. Extreme ground motions (ExGM), as we use the phrase in this report, refer to the extremely large amplitudes of earthquake ground motion that arise at extremely low probabilities of exceedance (hazard). They first came to our attention when the 1998 probabilistic seismic hazard analysis for Yucca Mountain was extended to a hazard level of 10^-8/yr (a 10^-4 probability for a 10^4-year repository "lifetime"). 
The primary purpose of this report is to summarize the principal results of the ExGM research program as they have developed over the past 5 years; what follows will be focused on Yucca Mountain, but not restricted to it.
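
    The relation between an annual hazard level of 10^-8 per year and a 10^-4 probability over a 10^4-year repository lifetime follows from the commonly used Poisson occurrence model, P = 1 - exp(-rate * t). A quick arithmetic check:

```python
# Poisson relation between an annual exceedance rate and the probability of
# at least one exceedance over a repository "lifetime": P = 1 - exp(-rate*t).
import math

rate = 1e-8  # annual exceedance frequency, 1/yr
t = 1e4      # repository lifetime considered, yr

p = 1.0 - math.exp(-rate * t)
print(f"{p:.2e}")  # ~1e-4: for rate*t << 1, P is well approximated by rate*t
```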

  5. The design and implementation of image query system based on color feature

    NASA Astrophysics Data System (ADS)

    Yao, Xu-Dong; Jia, Da-Chun; Li, Lin

    2013-07-01

    ASP.NET technology was used to construct the B/S-mode image query system. The theory and technology of database design, color-feature extraction from images, and indexing and retrieval were investigated in the construction of the image repository. The campus LAN and WAN environments were used to test the system. The test results show that the system architecture design met users' needs for querying the related resources.
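
    Color-feature retrieval of the kind described typically reduces each image to a color histogram and ranks candidates by a similarity measure. Below is a minimal sketch using histogram intersection; the coarse 4-bins-per-channel quantization and the toy "images" are arbitrary illustrative choices, not the paper's design.

```python
# Minimal color-feature retrieval sketch: quantize each RGB pixel to a coarse
# bin, build a normalized histogram, and rank images by histogram intersection.
def color_histogram(pixels, bins_per_channel=4):
    """Normalized histogram over coarse RGB bins; pixels are (r, g, b) in 0-255."""
    step = 256 // bins_per_channel
    hist = [0.0] * bins_per_channel**3
    for r, g, b in pixels:
        idx = (r // step) * bins_per_channel**2 + (g // step) * bins_per_channel + (b // step)
        hist[idx] += 1.0
    total = sum(hist)
    return [h / total for h in hist]

def intersection(h1, h2):
    """Histogram intersection similarity in [0, 1]; 1 means identical histograms."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# Toy "images" as flat pixel lists (invented for illustration).
red_img = [(250, 10, 10)] * 100
pink_img = [(250, 10, 10)] * 80 + [(250, 200, 200)] * 20
blue_img = [(10, 10, 250)] * 100

q = color_histogram(red_img)
print(intersection(q, color_histogram(pink_img)))  # 0.8
print(intersection(q, color_histogram(blue_img)))  # 0.0
```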

  6. The World Wide Web and Technology Transfer at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bianco, David J.

    1994-01-01

    NASA Langley Research Center (LaRC) began using the World Wide Web (WWW) in the summer of 1993, becoming the first NASA installation to provide a Center-wide home page. This coincided with a reorganization of LaRC to provide a more concentrated focus on technology transfer to both aerospace and non-aerospace industry. Use of the WWW and NCSA Mosaic not only provides automated information dissemination, but also allows for the implementation, evolution and integration of many technology transfer applications. This paper describes several of these innovative applications, including the on-line presentation of the entire Technology Opportunities Showcase (TOPS), an industrial partnering showcase that exists on the Web long after the actual 3-day event ended. During its first year on the Web, LaRC also developed several WWW-based information repositories. The Langley Technical Report Server (LTRS), a technical paper delivery system with integrated searching and retrieval, has proved to be quite popular. The NASA Technical Report Server (NTRS), an outgrowth of LTRS, provides uniform access to many logically similar, yet physically distributed NASA report servers. WWW is also the foundation of the Langley Software Server (LSS), an experimental software distribution system which will distribute LaRC-developed software with the possible phase-out of NASA's COSMIC program. In addition to the more formal technology distribution projects, WWW has been successful in connecting people with technologies and people with other people. With the completion of the LaRC reorganization, the Technology Applications Group, charged with interfacing with non-aerospace companies, opened for business with a popular home page.

  7. Shuttle Ground Operations Efficiencies/Technologies Study (SGOE/T). Volume 5: Technical Information Sheets (TIS)

    NASA Technical Reports Server (NTRS)

    Scholz, A. L.; Hart, M. T.; Lowry, D. J.

    1987-01-01

    The Technology Information Sheet was assembled in database format during Phase I. This document was designed to provide a repository for information pertaining to 144 Operations and Maintenance Instructions (OMI) controlled operations in the Orbiter Processing Facility (OPF), Vehicle Assembly Building (VAB), and PAD. It provides a way to accumulate information about required crew sizes, operations task time duration (serial and/or parallel), and special Ground Support Equipment (GSE) required, and to identify a potential application of existing technology or the need for the development of a new technology item.

  8. Clean and Secure Energy from Domestic Oil Shale and Oil Sands Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spinti, Jennifer; Birgenheier, Lauren; Deo, Milind

    This report summarizes the significant findings from the Clean and Secure Energy from Domestic Oil Shale and Oil Sands Resources program sponsored by the Department of Energy through the National Energy Technology Laboratory. There were four principal areas of research: environmental, legal, and policy issues related to development of oil shale and oil sands resources; economic and environmental assessment of a domestic unconventional fuels industry; basin-scale assessment of conventional and unconventional fuel development impacts; and liquid fuel production by in situ thermal processing of oil shale. Multiple research projects were conducted in each area and the results have been communicated via sponsored conferences, conference presentations, invited talks, interviews with the media, numerous topical reports, journal publications, and a book that summarizes much of the oil shale research relating to Utah’s Uinta Basin. In addition, a repository of materials related to oil shale and oil sands has been created within the University of Utah’s Institutional Repository, including the materials generated during this research program. Below is a listing of all topical and progress reports generated by this project and submitted to the Office of Scientific and Technical Information (OSTI). 
A listing of all peer-reviewed publications generated as a result of this project is included at the end of this report. The topical reports are: Geomechanical and Fluid Transport Properties 1 (December, 2015); Validation Results for Core-Scale Oil Shale Pyrolysis (February, 2015); Rates and Mechanisms of Oil Shale Pyrolysis: A Chemical Structure Approach (November, 2014); Policy Issues Associated With Using Simulation to Assess Environmental Impacts (November, 2014); Policy Analysis of the Canadian Oil Sands Experience (September, 2013); V-UQ of Generation 1 Simulator with AMSO Experimental Data (August, 2013); Lands with Wilderness Characteristics, Resource Management Plan Constraints, and Land Exchanges (March, 2012); Conjunctive Surface and Groundwater Management in Utah: Implications for Oil Shale and Oil Sands Development (May, 2012); Development of CFD-Based Simulation Tools for In Situ Thermal Processing of Oil Shale/Sands (February, 2012); Core-Based Integrated Sedimentologic, Stratigraphic, and Geochemical Analysis of the Oil Shale Bearing Green River Formation, Uinta Basin, Utah (April, 2011); Atomistic Modeling of Oil Shale Kerogens and Asphaltenes Along with their Interactions with the Inorganic Mineral Matrix (April, 2011); Pore Scale Analysis of Oil Shale/Sands Pyrolysis (March, 2011); Land and Resource Management Issues Relevant to Deploying In-Situ Thermal Technologies (January, 2011); Policy Analysis of Produced Water Issues Associated with In-Situ Thermal Technologies (January, 2011); and Policy Analysis of Water Availability and Use Issues for Domestic Oil Shale and Oil Sands Development (March, 2010)

  9. Development of Hydrologic Characterization Methodology of Faults: Outline of the Project in Berkeley, California

    NASA Astrophysics Data System (ADS)

    Goto, J.; Miwa, T.; Tsuchi, H.; Karasaki, K.

    2009-12-01

    The Nuclear Waste Management Organization of Japan (NUMO) will, once volunteer municipalities come forward, start a three-staged program for selecting an HLW and TRU waste repository site. Experience from various site characterization programs around the world shows that the hydrologic properties of faults are among the most important parameters in the early stage of such a program. It is expected that numerous faults of interest exist in an investigation area of several tens of square kilometers. It is, however, impossible to characterize all these faults within a limited time and budget. This raises the problem for repository design and safety assessment that we may have to accept unrealistic or overly conservative results by using a single model or parameter set for all the faults in the area. We therefore seek to develop an efficient and practical methodology to characterize the hydrologic properties of faults. This project is a five-year program started in 2007, and comprises the basic methodology development through a literature study and its verification through field investigations. The literature study tries to classify faults by correlating their geological features with hydraulic properties, to search for the most efficient technology for fault characterization, and to develop a work flow diagram. The field investigation starts from selection of a site and fault(s), followed by existing site data analyses, surface geophysics, geological mapping, trenching, water sampling, a series of borehole investigations, and modeling/analyses. Based on the results of the field investigations, we plan to develop a systematic hydrologic characterization methodology for faults. 
A classification method that correlates combinations of geological features (rock type, fault displacement, fault type, position in a fault zone, fracture zone width, damage zone width) with widths of high-permeability zones around a fault zone was proposed through a survey of available documents from the site characterization programs. The field investigation started in 2008, by selecting as the target the Wildcat Fault, which cuts across the Lawrence Berkeley National Laboratory (LBNL) site. Analyses of site-specific data, surface geophysics, geological mapping and trenching have confirmed the approximate location and characteristics of the fault (see Session H48, Onishi, et al). The plan for the remaining years includes borehole investigations at LBNL, and another series of investigations in the northern part of the Wildcat Fault.

  10. Greenhouse Gas Mitigation Options Database and Tool - Data repository of GHG mitigation technologies.

    EPA Science Inventory

    Industry and electricity production facilities generate over 50 percent of greenhouse gas (GHG) emissions in the United States. There is a growing consensus among scientists that the primary cause of climate change is anthropogenic greenhouse gas (GHG) emissions. Reducing GHG emi...

  11. High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung

    A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

  12. High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

    DOE PAGES

    Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung; ...

    2016-11-01

    A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

  13. Hydrologic and geologic characteristics of the Yucca Mountain site relevant to the performance of a potential repository

    USGS Publications Warehouse

    Levich, R.A.; Linden, R.M.; Patterson, R.L.; Stuckless, J.S.

    2000-01-01

    Yucca Mountain, located ~100 mi northwest of Las Vegas, Nevada, has been designated by Congress as a site to be characterized for a potential mined geologic repository for high-level radioactive waste. This field trip will examine the regional geologic and hydrologic setting for Yucca Mountain, as well as specific results of the site characterization program. The first day focuses on the regional setting with emphasis on current and paleo hydrology, which are both of critical concern for predicting future performance of a potential repository. Morning stops will be in southern Nevada and afternoon stops will be in Death Valley. The second day will be spent at Yucca Mountain. The field trip will visit the underground testing sites in the "Exploratory Studies Facility" and the "Busted Butte Unsaturated Zone Transport Field Test" plus several surface-based testing sites. Much of the work at the site has concentrated on studies of the unsaturated zone, an element of the hydrologic system that historically has received little attention. Discussions during the second day will cover selected topics in Yucca Mountain geology, hydrology and geochemistry, including the probabilistic volcanic hazard analysis and the seismicity and seismic hazard in the Yucca Mountain area. Evening discussions will address modeling of regional groundwater flow, the results of recent hydrologic studies by the Nye County Nuclear Waste Program Office, and the relationship of the geology and hydrology of Yucca Mountain to the performance of a potential repository. Day 3 will examine the geologic framework and hydrology of the Pahute Mesa-Oasis Valley Groundwater Basin and then will continue to Reno via Hawthorne, Nevada and the Walker Lake area.

  14. Building a genome database using an object-oriented approach.

    PubMed

    Barbasiewicz, Anna; Liu, Lin; Lang, B Franz; Burger, Gertraud

    2002-01-01

    GOBASE is a relational database that integrates data associated with mitochondria and chloroplasts. The most important data in GOBASE, i.e., molecular sequences and taxonomic information, are obtained from the public sequence data repository at the National Center for Biotechnology Information (NCBI), and are validated by our experts. Maintaining a curated genomic database comes with a towering labor cost, due to the sheer volume of available genomic sequences and the plethora of annotation errors and omissions in records retrieved from public repositories. Here we describe our approach to increasing automation of the database population process, thereby reducing manual intervention. As a first step, we used the Unified Modeling Language (UML) to construct a list of potential errors. Each case was evaluated independently, and an expert solution was devised and represented as a diagram. Subsequently, the UML diagrams were used as templates for writing object-oriented automation programs in the Java programming language.
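
    The approach of turning each catalogued error case into an automation program can be sketched as a set of rule functions applied to a record. The sketch below uses Python with invented rules, standing in for the Java programs the abstract describes; the field names and checks are hypothetical, not GOBASE's actual error catalogue.

```python
# Sketch of automated record validation: each catalogued error case becomes
# a rule function that flags problems in a sequence record. The rules and
# record fields are invented examples, not GOBASE's actual checks.
VALID_NUCLEOTIDES = set("ACGTUN")

def check_sequence_alphabet(record):
    """Flag characters outside the nucleotide alphabet (invented rule)."""
    bad = set(record["sequence"].upper()) - VALID_NUCLEOTIDES
    return f"invalid characters: {sorted(bad)}" if bad else None

def check_taxonomy_present(record):
    """Flag records with an empty or missing taxonomy field (invented rule)."""
    return "missing taxonomy" if not record.get("taxonomy") else None

RULES = [check_sequence_alphabet, check_taxonomy_present]

def validate(record):
    """Return the list of error messages produced by all failing rules."""
    return [msg for rule in RULES if (msg := rule(record)) is not None]

good = {"sequence": "ACGTACGT", "taxonomy": "Reclinomonas americana"}
bad = {"sequence": "ACGTXZ", "taxonomy": ""}
print(validate(good))  # []
print(validate(bad))
```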

  15. Centre for Research Infrastructure of Polish GNSS Data - response and possible contribution to EPOS

    NASA Astrophysics Data System (ADS)

    Araszkiewicz, Andrzej; Rohm, Witold; Bosy, Jaroslaw; Szolucha, Marcin; Kaplon, Jan; Kroszczynski, Krzysztof

    2017-04-01

    Under the first call of Action 4.2 (Development of modern research infrastructure of the science sector) in the Smart Growth Operational Programme 2014-2020, the "EPOS-PL" project was launched in late 2016. The following institutes are responsible for the implementation of this project: the Institute of Geophysics, Polish Academy of Sciences (Project Leader); the Academic Computer Centre Cyfronet, AGH University of Science and Technology; the Central Mining Institute; the Institute of Geodesy and Cartography; the Wrocław University of Environmental and Life Sciences; and the Military University of Technology. In addition, resources constituting the entrepreneur's own contribution will come from the Polish Mining Group. The EPOS-PL Research Infrastructure will integrate both existing and newly built National Research Infrastructures (Theme Centre for Research Infrastructures), which, under the premises of the EPOS program, are financed exclusively from national funds. In addition, an e-science platform will be developed. The Centre for Research Infrastructure of GNSS Data (CIBDG - Task 5) will be built based on the experience and facilities of two institutions: the Military University of Technology and the Wrocław University of Environmental and Life Sciences. The project includes the construction of the National GNSS Repository with data QC procedures and the adaptation of two Regional GNSS Analysis Centres for rapid and long-term geodynamical monitoring.

  16. Specification for the U.S. Geological Survey Historical Topographic Map Collection

    USGS Publications Warehouse

    Allord, Gregory J.; Walter, Jennifer L.; Fishburn, Kristin A.; Shea, Gale A.

    2014-01-01

    This document provides the detailed requirements for producing, archiving, and disseminating a comprehensive digital collection of topographic maps for the U.S. Geological Survey (USGS) Historical Topographic Map Collection (HTMC). The HTMC is a digital archive of about 190,000 printed topographic maps published by the USGS from the inception of the topographic mapping program in 1884 until the last paper topographic map using lithographic printing technology was published in 2006. The HTMC provides a comprehensive digital repository of all scales and all editions of USGS printed topographic maps that is easily discovered, browsed, and downloaded by the public at no cost. The HTMC provides ready access to maps that are no longer available for distribution in print. A digital file representing the original paper historical topographic map is produced for each historical map in the HTMC in georeferenced PDF (GeoPDF) format (a portable document format [PDF] with a geospatial extension).

  17. Electrical Resistance Tomography to Monitor Mitigation of Metal-Toxic Acid-Leachates Ruby Gulch Waste Rock Repository Gilt Edge Mine Superfund Site, South Dakota USA

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Heath, G.; Richardson, A.; Paul, D.; Wangerud, K.

    2003-12-01

    At a cyanide heap-leach open-pit mine, 15 million cubic yards of acid-generating sulfides were dumped at the head of a steep-walled mountain valley, where 30 inches/year of precipitation generates 60 gallons/minute of ARD leachate. Remediation has reshaped the dump to a 70-acre, 3.5:1-sloped geometry, installed drainage benches and runoff diversions, and capped the repository and lined the diversions with a polyethylene geomembrane and cover system. Monitoring was needed to evaluate (a) long-term geomembrane integrity, (b) diversion liner integrity and long-term effectiveness, (c) ARD geochemistry, kinetics, and pore-gas dynamics within the repository mass, and (d) groundwater interactions. Observation wells were paired with a 600-electrode resistivity survey system. Using near-surface and down-hole electrodes with automated data collection and post-processing, periodic two- and three-dimensional resistivity images are developed to reflect current and changed conditions in moisture, temperature, geochemical components, and flow direction. Examination of total resistivity values and time variances between images allows direct observation of liner and cap integrity, with precise identification and location of leaks; likewise, if runoff migrates from degraded diversion ditches into the repository zone, there is an accompanying and noticeable change in resistivity values. Used in combination with monitoring wells containing borehole resistivity electrodes (calibrated with direct sampling of dump water/moisture, temperature, and pore-gas composition), the resistivity arrays allow at-depth imaging of geochemical conditions within the repository mass. The information provides early indication of progress or deficiencies in the de-watering and ARD mitigation that are the intent of the remedy. If emerging technologies present opportunities for secondary treatment, deep resistivity images may assist in developing application methods and evaluating the effectiveness of any reagents introduced into the repository mass to further effect changes in oxidation/reduction reactions.
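    The leak-detection logic described above rests on comparing repeat resistivity images against a baseline: a sharp local drop in resistivity between surveys suggests moisture ingress. The sketch below illustrates that time-lapse differencing on a toy grid; it is not the site's actual inversion or processing chain, and the grid values and threshold are invented for illustration.

```python
# Illustrative time-lapse differencing: flag cells whose resistivity fell
# by more than a fractional threshold between two surveys on the same grid.

def leak_candidates(baseline, repeat, drop_threshold=0.3):
    """Return (row, col) cells whose resistivity dropped by more than
    drop_threshold (fractional change) between surveys."""
    hits = []
    for i, (row_a, row_b) in enumerate(zip(baseline, repeat)):
        for j, (a, b) in enumerate(zip(row_a, row_b)):
            if (a - b) / a > drop_threshold:
                hits.append((i, j))
    return hits

baseline = [[100.0, 95.0], [110.0, 90.0]]   # ohm-m, survey 1
repeat   = [[ 98.0, 50.0], [108.0, 88.0]]   # ohm-m, survey 2
print(leak_candidates(baseline, repeat))    # → [(0, 1)]
```

    In practice the same differencing idea is applied to full 2-D and 3-D inverted images, with the threshold calibrated against the borehole electrodes and direct moisture sampling mentioned above.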

  18. The Cancer Imaging Archive (TCIA) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    TCIA is NCI’s repository for publicly shared cancer imaging data. TCIA collections include radiology and pathology images, clinical and clinical trial data, image derived annotations and quantitative features and a growing collection of related ‘omics data both from clinical and pre-clinical studies.

  19. Measuring Trust: Standards for Trusted Digital Repositories

    ERIC Educational Resources Information Center

    Dryden, Jean

    2011-01-01

    Ensuring the long-term preservation and use of born-digital and digitized records of enduring value has preoccupied archivists and their cultural heritage siblings for several decades. The professional literature of the 1980s and 1990s bemoans the challenges posed by rapid technological change (and the concomitant obsolescence of hardware and…

  20. The Process of Designing for Learning: Understanding University Teachers' Design Work

    ERIC Educational Resources Information Center

    Bennett, Sue; Agostinho, Shirley; Lockyer, Lori

    2017-01-01

    Interest in how to support the design work of university teachers has led to research and development initiatives that include technology-based design-support tools, online repositories, and technical specifications. Despite these initiatives, remarkably little is known about the design work that university teachers actually do. This paper…

  1. Assessing the Application of Three-Dimensional Collaborative Technologies within an E-Learning Environment

    ERIC Educational Resources Information Center

    McArdle, Gavin; Bertolotto, Michela

    2012-01-01

    Today, the Internet plays a major role in distributing learning material within third level education. Multiple online facilities provide access to educational resources. While early systems relied on webpages, which acted as repositories for learning material, nowadays sophisticated online applications manage and deliver learning resources.…

  2. JiFUNzeni: A Blended Learning Approach for Sustainable Teachers' Professional Development

    ERIC Educational Resources Information Center

    Onguko, Brown Bully

    2014-01-01

    JiFUNzeni blended learning approach is a sustainable approach to provision of professional development (PD) for those in challenging educational contexts. JiFUNzeni approach emphasizes training regional experts to create blended learning content, working with appropriate technology while building content repositories. JiFUNzeni approach was…

  3. Utilizing the Antarctic Master Directory to find orphan datasets

    NASA Astrophysics Data System (ADS)

    Bonczkowski, J.; Carbotte, S. M.; Arko, R. A.; Grebas, S. K.

    2011-12-01

    While most Antarctic data are housed at an established discipline-specific data repository, there are data types for which no suitable repository exists. In some cases, these "orphan" data, lacking an appropriate national archive, are served from local servers by the principal investigators who produced them. There are many pitfalls to data served privately, including the frequent lack of adequate documentation to ensure the data can be understood by others for re-use, and the impermanence of personal web sites. For example, if an investigator leaves an institution and the data move, the published link is no longer accessible. To ensure continued availability of data, submission to long-term national data repositories is needed. As stated in the National Science Foundation Office of Polar Programs (NSF/OPP) Guidelines and Award Conditions for Scientific Data, investigators are obligated to submit their data for curation and long-term preservation; this includes the registration of a dataset description in the Antarctic Master Directory (AMD), http://gcmd.nasa.gov/Data/portals/amd/. The AMD is a Web-based, searchable directory of thousands of dataset descriptions, known as DIF records, submitted by scientists from over 20 countries. It serves as a node of the International Directory Network/Global Change Master Directory (IDN/GCMD). The US Antarctic Program Data Coordination Center (USAP-DCC), http://www.usap-data.org/, funded through NSF/OPP, was established in 2007 to help streamline the process of data submission and DIF record creation. When data do not fit within any existing disciplinary repository, they can be registered with the USAP-DCC as the fallback data repository. Within the scope of the USAP-DCC, we undertook the challenge of discovering and "rescuing" orphan datasets currently registered within the AMD. To find which DIF records led to data served privately, all records relating to US data within the AMD were parsed. After identifying the records containing a URL leading to a national data center or other disciplinary data repository, the remaining records were individually inspected for data type, format, and quality of metadata, and then assessed to determine how best to preserve them. Of the records reviewed, those for which appropriate repositories could be identified were submitted. An additional 35 were deemed of acceptable metadata quality to register in the USAP-DCC. The contents of these datasets varied in nature, ranging from penguin counts to paleo-geologic maps to results of meteorological models, all of which are discoverable through our search interface, http://www.usap-data.org/search.php. The remaining 40 records linked either to no data or to data with inadequate documentation for preservation, highlighting the danger of serving datasets on local servers where minimal metadata standards cannot be enforced and long-term access cannot be ensured.
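    The first triage pass described above, separating records whose URLs point at recognized long-term repositories from those served privately, can be sketched as a simple partition over record URLs. The host list, field names, and sample records below are illustrative assumptions, not the AMD's actual schema or the project's actual code.

```python
# Hypothetical triage pass: partition DIF-style records by whether their
# data URL points at a recognized long-term repository or a personal/
# institutional server (the trusted-host list is illustrative).

from urllib.parse import urlparse

TRUSTED_HOSTS = {"www.usap-data.org", "gcmd.nasa.gov", "www.ncdc.noaa.gov"}

def triage(records):
    """Split record ids into (archived, orphaned) by URL host."""
    archived, orphaned = [], []
    for rec in records:
        host = urlparse(rec.get("data_url", "")).netloc
        (archived if host in TRUSTED_HOSTS else orphaned).append(rec["id"])
    return archived, orphaned

records = [
    {"id": "DIF-001", "data_url": "http://www.usap-data.org/dataset/42"},
    {"id": "DIF-002", "data_url": "http://myserver.university.edu/~pi/data"},
]
print(triage(records))   # → (['DIF-001'], ['DIF-002'])
```

    Only the "orphaned" partition then needs the expensive manual inspection of data type, format, and metadata quality.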

  4. Information persistence using XML database technology

    NASA Astrophysics Data System (ADS)

    Clark, Thomas A.; Lipa, Brian E. G.; Macera, Anthony R.; Staskevich, Gennady R.

    2005-05-01

    The Joint Battlespace Infosphere (JBI) Information Management (IM) services provide information exchange and persistence capabilities that support tailored, dynamic, and timely access to required information, enabling near-real-time planning, control, and execution for DoD decision making. JBI IM services will be built on a substrate of network-centric core enterprise services and, when transitioned, will establish an interoperable information space that aggregates, integrates, fuses, and intelligently disseminates relevant information to support effective warfighter business processes. This virtual information space provides individual users with information tailored to their specific functional responsibilities and provides a highly tailored repository of, or access to, information designed to support a specific Community of Interest (COI), geographic area, or mission. Critical to effective operation of JBI IM services is the implementation of repositories, where data, represented as information, are persisted for quick and easy retrieval. This paper addresses information representation, persistence, and retrieval using existing database technologies to manage structured data in Extensible Markup Language (XML) format, as well as unstructured data, in an IM services-oriented environment. Three basic categories of database technologies are compared and contrasted: relational, XML-enabled, and native XML. These technologies have diverse properties such as maturity, performance, query language specifications, indexing, and retrieval methods. We describe our application of these evolving technologies within the context of a JBI Reference Implementation (RI), providing some hopefully insightful anecdotes and lessons learned along the way. This paper also outlines future directions, promising technologies, and emerging COTS products that can offer more powerful information management representations, better persistence mechanisms, and improved retrieval techniques.
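    The core contrast in the comparison above, shredding XML into relational rows versus querying the document structure directly, can be illustrated in a few lines. This is a toy sketch, not the JBI implementation: the message schema is invented, and Python's ElementTree stands in for a native XML database's query engine.

```python
# Contrast sketch: the same lookup done relationally (shred, then SQL)
# and natively (query the XML tree itself).

import sqlite3
import xml.etree.ElementTree as ET

doc = ("<msgs><msg id='1' topic='air'>alpha</msg>"
       "<msg id='2' topic='sea'>bravo</msg></msgs>")
tree = ET.fromstring(doc)

# Relational approach: shred elements into rows, then use SQL.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE msg (id TEXT, topic TEXT, body TEXT)")
con.executemany("INSERT INTO msg VALUES (?, ?, ?)",
                [(m.get("id"), m.get("topic"), m.text) for m in tree])
sql_hit = con.execute("SELECT body FROM msg WHERE topic='sea'").fetchone()[0]

# Native-XML-style approach: query the document structure directly.
xml_hit = tree.find("./msg[@topic='sea']").text

print(sql_hit, xml_hit)   # → bravo bravo
```

    The trade-off the paper examines follows from this difference: shredding buys mature indexing and SQL performance at the cost of an up-front mapping, while native storage preserves document structure and accepts whatever the XML query engine can deliver.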

  5. MODPATH-LGR; documentation of a computer program for particle tracking in shared-node locally refined grids by using MODFLOW-LGR

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.

    2011-01-01

    The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for downloading from the World Wide Web from a U.S. Geological Survey software repository. The repository is accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.
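    Particle trackers of the MODPATH family advance particles cell by cell using a semi-analytical step: with velocity varying linearly across a cell, travel time has a closed form. The one-dimensional sketch below illustrates that idea only; MODPATH-LGR itself handles three-dimensional, locally refined grids, and the numbers here are invented.

```python
# 1-D sketch of a semi-analytical (Pollock-type) cell-transit step:
# velocity varies linearly between the cell faces, so the crossing time
# is obtained in closed form rather than by small time steps.

import math

def time_to_exit(v_in, v_out, dx):
    """Travel time across a cell of width dx given the two face
    velocities (both positive, flow left to right)."""
    A = (v_out - v_in) / dx        # velocity gradient within the cell
    if abs(A) < 1e-12:             # uniform velocity: simple ratio
        return dx / v_in
    return math.log(v_out / v_in) / A

t = time_to_exit(v_in=1.0, v_out=2.0, dx=10.0)
print(round(t, 4))   # → 6.9315
```

    Because each cell transit is exact under the linear-velocity assumption, pathlines through a grid are built by chaining such steps, which is what makes the method attractive for refined grids where cell sizes change abruptly.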

  6. Field of genes: using Apache Kafka as a bioinformatic data repository.

    PubMed

    Lawlor, Brendan; Lynch, Richard; Mac Aogáin, Micheál; Walsh, Paul

    2018-04-01

    Bioinformatic research is increasingly dependent on large-scale datasets, accessed either from private or public repositories. An example of a public repository is National Center for Biotechnology Information's (NCBI's) Reference Sequence (RefSeq). These repositories must decide in what form to make their data available. Unstructured data can be put to almost any use but are limited in how access to them can be scaled. Highly structured data offer improved performance for specific algorithms but limit the wider usefulness of the data. We present an alternative: lightly structured data stored in Apache Kafka in a way that is amenable to parallel access and streamed processing, including subsequent transformations into more highly structured representations. We contend that this approach could provide a flexible and powerful nexus of bioinformatic data, bridging the gap between low structure on one hand, and high performance and scale on the other. To demonstrate this, we present a proof-of-concept version of NCBI's RefSeq database using this technology. We measure the performance and scalability characteristics of this alternative with respect to flat files. The proof of concept scales almost linearly as more compute nodes are added, outperforming the standard approach using files. Apache Kafka merits consideration as a fast and more scalable but general-purpose way to store and retrieve bioinformatic data, for public, centralized reference datasets such as RefSeq and for private clinical and experimental data.
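    A key property the authors exploit is that keyed records are spread deterministically across topic partitions, so many consumers can process a dataset like RefSeq in parallel. The sketch below mimics that keyed assignment in plain Python; it is not the paper's code, does not use a real Kafka client or Kafka's actual murmur2 partitioner, and the accession keys are merely examples.

```python
# Plain-Python imitation of Kafka-style keyed partitioning: every producer
# computes the same partition for the same key, so records for one
# sequence always land together while the load spreads across partitions.

from hashlib import md5

def partition_for(key, num_partitions):
    """Deterministic partition assignment for a record key."""
    digest = int(md5(key.encode()).hexdigest(), 16)
    return digest % num_partitions

accessions = ["NC_000913.3", "NC_012920.1", "NM_007294.4"]
assignment = {acc: partition_for(acc, 4) for acc in accessions}
print(assignment)

# Any replica computes the identical mapping:
assert all(partition_for(a, 4) == p for a, p in assignment.items())
```

    This determinism is what lets the proof of concept scale almost linearly: adding consumers adds partitions' worth of parallel read bandwidth without any coordination about which consumer owns which records.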

  7. Construction of a nasopharyngeal carcinoma 2D/MS repository with Open Source XML database--Xindice.

    PubMed

    Li, Feng; Li, Maoyu; Xiao, Zhiqiang; Zhang, Pengfei; Li, Jianling; Chen, Zhuchu

    2006-01-11

    Many proteomics initiatives require the integration of all information, under uniform criteria, from sample collection and data display through publication of experimental results. Integrating and exchanging these data of different formats and structures poses a great challenge. XML technology shows promise for this task because of its simplicity and flexibility. Nasopharyngeal carcinoma (NPC) is one of the most common cancers in southern China and Southeast Asia, with marked geographic and racial differences in incidence. Although some cancer proteome databases now exist, there is still no NPC proteome database. The raw NPC proteome experimental data were captured in one XML document with the Human Proteome Markup Language (HUP-ML) editor and imported into the native XML database Xindice. The 2D/MS repository of the NPC proteome was constructed with Apache, PHP, and Xindice to provide access to the database via the Internet. On our website, two access methods, keyword query and click query, are provided for the entries of the NPC proteome database. Our 2D/MS repository can be used to share the raw NPC proteomics data generated from gel-based proteomics experiments. The database, as well as the PHP source code for constructing users' own proteome repositories, can be accessed at http://www.xyproteomics.org/.

  8. Entrez Neuron RDFa: a pragmatic semantic web application for data integration in neuroscience research.

    PubMed

    Samwald, Matthias; Lim, Ernest; Masiar, Peter; Marenco, Luis; Chen, Huajun; Morse, Thomas; Mutalik, Pradeep; Shepherd, Gordon; Miller, Perry; Cheung, Kei-Hoi

    2009-01-01

    The amount of biomedical data available in Semantic Web formats has been rapidly growing in recent years. While these formats are machine-friendly, user-friendly web interfaces allowing easy querying of these data are typically lacking. We present "Entrez Neuron", a pilot neuron-centric interface that allows for keyword-based queries against a coherent repository of OWL ontologies. These ontologies describe neuronal structures, physiology, mathematical models and microscopy images. The returned query results are organized hierarchically according to brain architecture. Where possible, the application makes use of entities from the Open Biomedical Ontologies (OBO) and the 'HCLS knowledgebase' developed by the W3C Interest Group for Health Care and Life Science. It makes use of the emerging RDFa standard to embed ontology fragments and semantic annotations within its HTML-based user interface. The application and underlying ontologies demonstrate how Semantic Web technologies can be used for information integration within a curated information repository and between curated information repositories. It also demonstrates how information integration can be accomplished on the client side, through simple copying and pasting of portions of documents that contain RDFa markup.

  9. Yucca Mountain nuclear waste repository prompts heated congressional hearing

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2011-11-01

    Although the final report of the Blue Ribbon Commission on America's Nuclear Future is not expected until January 2012, the tentative conclusions of the commission's draft report were dissected during a recent joint hearing by two subcommittees of the House of Representatives' Committee on Science, Space, and Technology. Among the more heated issues debated at the hearing was the fate of the stalled Yucca Mountain nuclear waste repository in Nevada. The Blue Ribbon Commission's (BRC) draft report includes recommendations for managing nuclear waste and for developing one or more permanent deep geological repositories and interim storage facilities, but the report does not address the future of Yucca Mountain. The BRC charter indicates that the commission is to "conduct a comprehensive review of policies for managing the back end of the nuclear fuel cycle." However, the draft report states that the commission was not asked to consider, and therefore did not address, several key issues. "We have not rendered an opinion on the suitability of the Yucca Mountain site or on the request to withdraw the license application for Yucca Mountain," the draft report states.

  10. [The effects of various factors on the in vitro velocity of drug release from repository tablets. Part 4: Isoniazid (Rimicid) repository tablets (author's transl)].

    PubMed

    Tomassini, L; Michailova, D; Naplatanova, D; Slavtschev, P

    1979-12-01

    The authors investigated the release of isoniazid from repository tablets in relation to tablet form, processing technology, strength constant, and storage for 5 years. On determining the diffusion coefficient (D), the initial dissolution rate (Vo), and the time required for the releasing medium to diffuse to the middle of the tablet (t1/2), it was found that the difference in release rate between the flat and the biconvex tablets is small. Furthermore, the three-layer tablets were found to have very high D and Vo values and very low t1/2 values, and are therefore unsuited for repository tablets of the composition under investigation. Moreover, an increase of the strength constant was found not to affect the D, t1/2, and Vo values, and the release of isoniazid is retarded only in flat tablets with the highest strength constant. Storage exerts no effect on drug release from these tablets. The industrial production of these tablets is under way.

  11. Scoping review and evaluation of SMS/text messaging platforms for mHealth projects or clinical interventions.

    PubMed

    Iribarren, Sarah J; Brown, William; Giguere, Rebecca; Stone, Patricia; Schnall, Rebecca; Staggers, Nancy; Carballo-Diéguez, Alex

    2017-05-01

    Mobile technology supporting text messaging interventions (TMIs) continues to evolve, presenting challenges for researchers and healthcare professionals who need to choose software solutions to best meet their program needs. The objective of this review was to systematically identify and compare text messaging platforms and to summarize their advantages and disadvantages as described in peer-reviewed literature. A scoping review was conducted using four steps: 1) identify currently available platforms through online searches and in mHealth repositories; 2) expand evaluation criteria of an mHealth mobile messaging toolkit and integrate prior user experiences as researchers; 3) evaluate each platform's functions and features based on the expanded criteria and a vendor survey; and 4) assess the documentation of platform use in the peer-reviewed literature. Platforms meeting inclusion criteria were assessed independently by three reviewers and discussed until consensus was reached. The PRISMA guidelines were followed to report findings. Of the 1041 potentially relevant search results, 27 platforms met inclusion criteria. Most were excluded because they were not platforms (e.g., guides, toolkits, reports, or SMS gateways). Of the 27 platforms, only 12 were identified in existing mHealth repositories, 10 from Google searches, while five were found in both. The expanded evaluation criteria included 22 items. Results indicate no uniform presentation of platform features and functions, often making these difficult to discern. Fourteen of the platforms were reported as open source, 10 focused on health care, and 16 were tailored to meet needs of low-resource settings (not mutually exclusive). Fifteen platforms had do-it-yourself setup (programming not required), while the remainder required coding/programming skills or setups could be built to specification by the vendor. Frequently described features included data security and access to the platform via cloud-based systems. Pay structures and reported targeted end-users varied. Peer-reviewed publications listed only 6 of the 27 platforms across 21 publications. The majority of these articles reported the name of the platform used but did not describe advantages or disadvantages. Searching for and comparing mHealth platforms for TMIs remains a challenge. The results of this review can serve as a resource for researchers and healthcare professionals wanting to integrate TMIs into health interventions. Steps to identify, compare, and assess advantages and disadvantages are outlined for consideration. Expanded evaluation criteria can be used by future researchers. Continued and more comprehensive platform tools should be integrated into mHealth repositories. Detailed descriptions of platform advantages and disadvantages are needed when mHealth researchers publish findings to expand the body of research on TMI tools for healthcare. Standardized descriptions and features are recommended for vendor sites. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Scoping Review and Evaluation of SMS/text Messaging Platforms for mHealth Projects or Clinical Interventions

    PubMed Central

    Iribarren, Sarah; Brown, William; Giguere, Rebecca; Stone, Patricia; Schnall, Rebecca; Staggers, Nancy; Carballo-Diéguez, Alex

    2017-01-01

    Objectives Mobile technology supporting text messaging interventions (TMIs) continues to evolve, presenting challenges for researchers and healthcare professionals who need to choose software solutions to best meet their program needs. The objective of this review was to systematically identify and compare text messaging platforms and to summarize their advantages and disadvantages as described in peer-reviewed literature. Methods A scoping review was conducted using four steps: 1) identify currently available platforms through online searches and in mHealth repositories; 2) expand evaluation criteria of an mHealth mobile messaging toolkit and integrate prior user experiences as researchers; 3) evaluate each platform's functions and features based on the expanded criteria and a vendor survey; and 4) assess the documentation of platform use in the peer-reviewed literature. Platforms meeting inclusion criteria were assessed independently by three reviewers and discussed until consensus was reached. The PRISMA guidelines were followed to report findings. Results Of the 1041 potentially relevant search results, 27 platforms met inclusion criteria. Most were excluded because they were not platforms (e.g., guides, toolkits, reports, or SMS gateways). Of the 27 platforms, only 12 were identified in existing mHealth repositories, 10 from Google searches, while five were found in both. The expanded evaluation criteria included 22 items. Results indicate no uniform presentation of platform features and functions, often making these difficult to discern. Fourteen of the platforms were reported as open source, 10 focused on health care, and 16 were tailored to meet needs of low-resource settings (not mutually exclusive). Fifteen platforms had do-it-yourself setup (programming not required), while the remainder required coding/programming skills or setups could be built to specification by the vendor. Frequently described features included data security and access to the platform via cloud-based systems. Pay structures and reported targeted end-users varied. Peer-reviewed publications listed only 6 of the 27 platforms across 21 publications. The majority of these articles reported the name of the platform used but did not describe advantages or disadvantages. Conclusions Searching for and comparing mHealth platforms for TMIs remains a challenge. The results of this review can serve as a resource for researchers and healthcare professionals wanting to integrate TMIs into health interventions. Steps to identify, compare, and assess advantages and disadvantages are outlined for consideration. Expanded evaluation criteria can be used by future researchers. Continued and more comprehensive platform tools should be integrated into mHealth repositories. Detailed descriptions of platform advantages and disadvantages are needed when mHealth researchers publish findings to expand the body of research on texting-based tools for healthcare. Standardized descriptions and features are recommended for vendor sites. PMID:28347445

  13. Object linking in repositories

    NASA Technical Reports Server (NTRS)

    Eichmann, David (Editor); Beck, Jon; Atkins, John; Bailey, Bill

    1992-01-01

    This topic is covered in three sections. The first section explores some of the architectural ramifications of extending the Eichmann/Atkins lattice-based classification scheme to encompass the assets of the full life cycle of software development. A model is considered that provides explicit links between objects in addition to the edges connecting classification vertices in the standard lattice. The second section gives a description of the efforts to implement the repository architecture using a commercially available object-oriented database management system. Some of the features of this implementation are described, and some of the next steps to be taken to produce a working prototype of the repository are pointed out. In the final section, it is argued that design and instantiation of reusable components have competing criteria (design-for-reuse strives for generality, design-with-reuse strives for specificity) and that providing mechanisms for each can be complementary rather than antagonistic. In particular, it is demonstrated how program slicing techniques can be applied to customization of reusable components.

  14. Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources.

    PubMed

    Tsay, Ming-Yueh; Wu, Tai-Luan; Tseng, Ling-Li

    2017-01-01

    This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001-2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.
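    The sorting, comparison, and aggregation steps described above reduce, at their core, to set operations over deduplicated bibliographic identifiers. The sketch below illustrates that bookkeeping on invented records; the study's own program worked on real, normalized bibliographic data from the six systems.

```python
# Illustrative completeness/overlap bookkeeping over deduplicated record
# identifiers (toy data; real records need normalization before matching).

systems = {
    "Google Scholar":  {"doi:10.1/a", "doi:10.1/b", "doi:10.1/c"},
    "arXiv.org":       {"doi:10.1/b", "doi:10.1/c"},
    "OAIster":         {"doi:10.1/c", "doi:10.1/d"},
}

union = set().union(*systems.values())          # everything found anywhere
completeness = {name: len(recs) / len(union)    # share of the union covered
                for name, recs in systems.items()}
overlap_gs_arxiv = systems["Google Scholar"] & systems["arXiv.org"]

print(sorted(overlap_gs_arxiv), round(completeness["Google Scholar"], 2))
# → ['doi:10.1/b', 'doi:10.1/c'] 0.75
```

    The hard part in practice is not these set operations but the record matching that precedes them: the same publication must be recognized across systems despite differing metadata conventions.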

  15. Report to Congress on the potential use of lead in the waste packages for a geologic repository at Yucca Mountain, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1989-12-01

    In the Report of the Senate Committee on Appropriations accompanying the Energy and Water Appropriation Act for 1989, the Committee directed the Department of Energy (DOE) to evaluate the use of lead in the waste packages to be used in geologic repositories for spent nuclear fuel and high-level waste. The evaluation performed in response to this directive is presented in this report. The evaluation was based largely on a review of the technical literature on the behavior of lead, reports of work conducted in other countries, and work performed for the waste-management program being conducted by the DOE. The initial evaluation was limited to the potential use of lead in the packages to be used in the repository. The focus of this report is on post-closure performance, not on the retrievability and handling aspects of the waste package. 100 refs., 8 figs., 15 tabs.

  16. The United States Polar Rock Repository: A geological resource for the Earth science community

    USGS Publications Warehouse

    Grunow, Annie M.; Elliot, David H.; Codispoti, Julie E.

    2007-01-01

    The United States Polar Rock Repository (USPRR) is a U.S. national facility designed for the permanent curatorial preservation of rock samples, along with associated materials such as field notes, annotated air photos and maps, raw analytic data, paleomagnetic cores, ground rock and mineral residues, thin sections, and microfossil mounts, microslides, and residues from polar areas. This facility was established by the Office of Polar Programs at the U.S. National Science Foundation (NSF) to minimize redundant sample collecting, and also because the extreme cold and hazardous field conditions make fieldwork costly and difficult. The repository provides, along with an on-line database of sample information, an essential resource for proposal preparation, pilot studies, and other sample-based research that should make fieldwork more efficient and effective. This latter aspect should reduce the environmental impact of conducting research in sensitive polar regions. The USPRR also provides samples for educational outreach. Rock samples may be borrowed for research or educational purposes as well as for museum exhibits.

  17. Perceived risk, stigma, and potential economic impacts of a high-level nuclear waste repository in Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slovic, P.; Layman, M.; Kraus, N.N.

    1989-07-01

    This paper describes a program of research designed to assess the potential impacts of a high-level nuclear waste repository at Yucca Mountain, Nevada, upon tourism, retirement and job-related migration, and business development in Las Vegas and the state. Adverse economic impacts may be expected to result from two related social processes. One has to do with perceptions of risk and socially amplified reactions to "unfortunate events" associated with the repository (major and minor accidents, discoveries of radiation releases, evidence of mismanagement, attempts to sabotage or disrupt the facility, etc.). The second process that may trigger significant adverse impacts is that of stigmatization. The conceptual underpinnings of risk perception, social amplification, and stigmatization are discussed in this paper and empirical data are presented to demonstrate how nuclear images associated with Las Vegas and the State of Nevada might trigger adverse effects on tourism, migration, and business development.

  18. Site characterization plan: Yucca Mountain Site, Nevada Research and Development Area, Nevada: Volume 8, Part B: Chapter 8, Sections 8.3.5 through 8.3.5.20

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1988-12-01

    This site characterization plan (SCP) has been developed for the candidate repository site at Yucca Mountain in the State of Nevada. The SCP includes a description of the Yucca Mountain site (Chapters 1-5), a conceptual design for the repository (Chapter 6), a description of the packaging to be used for the waste to be emplaced in the repository (Chapter 7), and a description of the planned site characterization activities (Chapter 8). The schedules and milestones presented in Sections 8.3 and 8.5 of the SCP were developed to be consistent with the June 1988 draft Amendment to the DOE's Mission Plan for the Civilian Radioactive Waste Management Program. The five-month delay in the scheduled start of exploratory shaft construction that was announced recently is not reflected in these schedules. 68 figs., 102 tabs.

  19. Health care transformation through collaboration on open-source informatics projects: integrating a medical applications platform, research data repository, and patient summarization.

    PubMed

    Klann, Jeffrey G; McCoy, Allison B; Wright, Adam; Wattanasin, Nich; Sittig, Dean F; Murphy, Shawn N

    2013-05-30

    The Strategic Health IT Advanced Research Projects (SHARP) program seeks to conquer well-understood challenges in medical informatics through breakthrough research. Two SHARP centers have found alignment in their methodological needs: (1) members of the National Center for Cognitive Informatics and Decision-making (NCCD) have developed knowledge bases to support problem-oriented summarizations of patient data, and (2) the Substitutable Medical Apps, Reusable Technologies (SMART) center has developed a platform for reusable medical apps that can run on participating platforms connected to various electronic health records (EHRs). Combining the work of these two centers will ensure wide dissemination of new methods for synthesized views of patient data. Informatics for Integrating Biology and the Bedside (i2b2) is an NIH-funded clinical research data repository platform in use at over 100 sites worldwide. By also working with a co-occurring initiative to SMART-enable i2b2, we can confidently write one app that can be used extremely broadly. Our goal was to facilitate development of intuitive, problem-oriented views of the patient record using NCCD knowledge bases that would run in any EHR. To do this, we developed a collaboration between the two SHARPs and an NIH center, i2b2. First, we implemented collaborative tools to connect researchers at three institutions. Next, we developed a patient summarization app using the SMART platform and a previously validated NCCD problem-medication linkage knowledge base derived from the National Drug File-Reference Terminology (NDF-RT). Finally, to SMART-enable i2b2, we implemented two new Web service "cells" that expose the SMART application programming interface (API), and we made changes to the Web interface of i2b2 to host a "carousel" of SMART apps. We deployed our SMART-based, NDF-RT-derived patient summarization app in this SMART-i2b2 container. 
It displays a problem-oriented view of medications and presents a line-graph display of laboratory results. This summarization app can be run in any EHR environment that either supports SMART or runs SMART-enabled i2b2. This i2b2 "clinical bridge" demonstrates a pathway for reusable app development that does not require EHR vendors to immediately adopt the SMART API. Apps can be developed in SMART and run by clinicians in the i2b2 repository, reusing clinical data extracted from EHRs. This may encourage the adoption of SMART by supporting SMART app development until EHRs adopt the platform. It also allows a new variety of clinical SMART apps, fueled by the broad aggregation of data types available in research repositories. The app (including its knowledge base) and SMART-i2b2 are open-source and freely available for download.

  20. Evolving the Living With a Star Data System Definition

    NASA Astrophysics Data System (ADS)

    Otranto, J.; Dijoseph, M.; Worrall, W.

    2003-04-01

    NASA’s Living With a Star (LWS) Program is a space weather-focused and applications-driven research program. The LWS Program is soliciting input from the solar, space physics, space weather, and climate science communities to develop a system that enables access to science data associated with these disciplines, and advances the development of discipline and interdisciplinary findings. The LWS Program will implement a data system that builds upon the existing and planned data capture, processing, and storage components put in place by individual spacecraft missions and also inter-project data management systems, such as active archives, deep archives, and multi-mission repositories. It is technically feasible for the LWS Program to integrate data from a broad set of resources, assuming they are either publicly accessible or access is permitted by the system’s administrators. The LWS Program data system will work in coordination with spacecraft mission data systems and science data repositories, integrating them into a common data representation. This common representation relies on a robust metadata definition that provides journalistic and technical data descriptions, plus linkages to supporting data products and tools. The LWS Program intends to become an enabling resource to PIs, interdisciplinary scientists, researchers, and students facilitating both access to a broad collection of science data, as well as the necessary supporting components to understand and make productive use of the data. For the LWS Program to represent science data that is physically distributed across various ground system elements, information about the data products stored on each system is collected through a series of LWS-created active agents. These active agents are customized to interface or interact with each one of these data systems, collect information, and forward updates to a single LWS-developed metadata broker. 
This broker, in turn, updates a centralized repository of LWS-specific metadata. A populated LWS metadata database is a single point of contact that can serve all users (the science community) with a “one-stop-shop” for data access. While data may not be physically stored in an LWS-specific repository, the LWS system enables data access from wherever the data are stored. Moreover, LWS provides the user access to information for understanding the data source, format, and calibration; enables access to ancillary and correlative data products; and provides links to processing tools and models associated with the data, and any corresponding findings. The LWS may also support an active archive for solar, space physics, space weather, and climate data when these data would otherwise be discarded or archived off-line. This archive could potentially serve as a backup facility for LWS missions. This plan was developed based upon input already received from the science community; the architecture is based on systems developed to date that have worked well on a smaller scale. The LWS Program continues to seek constructive input from the science community, examples of both successes and failures in dealing with science data systems, and insights regarding the obstacles between the current state-of-the-practice and this vision for the LWS Program data system.
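The agent/broker pattern described in this record can be sketched in a few lines. This is a hypothetical illustration only, not the actual LWS design: each active agent interfaces with one data system and forwards product metadata to a single broker, which maintains the centralized catalog. All class names, record fields, and system names below are invented for the sketch.

```python
# Hypothetical sketch of the LWS active-agent/metadata-broker pattern.
# Names and record fields are illustrative, not the real LWS interfaces.

class MetadataBroker:
    """Maintains the centralized metadata catalog, keyed by product ID."""
    def __init__(self):
        self.catalog = {}

    def update(self, records):
        for rec in records:
            self.catalog[rec["product_id"]] = rec


class ActiveAgent:
    """Interfaces with one remote data system and forwards updates."""
    def __init__(self, system_name, broker):
        self.system_name = system_name
        self.broker = broker

    def collect(self):
        # A real agent would query the remote system's holdings;
        # here we return a canned example record.
        return [{"product_id": f"{self.system_name}-001",
                 "source": self.system_name,
                 "format": "CDF",
                 "access_url": "https://example.invalid/data/001"}]

    def forward(self):
        self.broker.update(self.collect())


broker = MetadataBroker()
for system in ["mission-archive", "deep-archive", "multi-mission-repo"]:
    ActiveAgent(system, broker).forward()

print(len(broker.catalog))  # 3 -- one record per data system
```

The key design point the record makes is that the data stay in place; only the metadata flow to the central point of contact, which then answers discovery queries.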

  1. Applying the institutional review board data repository approach to manage ethical considerations in evaluating and studying medical education

    PubMed Central

    Thayer, Erin K.; Rathkey, Daniel; Miller, Marissa Fuqua; Palmer, Ryan; Mejicano, George C.; Pusic, Martin; Kalet, Adina; Gillespie, Colleen; Carney, Patricia A.

    2016-01-01

    Issue Medical educators and educational researchers continue to improve their processes for managing medical student and program evaluation data using sound ethical principles. This is becoming even more important as curricular innovations are occurring across undergraduate and graduate medical education. Dissemination of findings from this work is critical, and peer-reviewed journals often require an institutional review board (IRB) determination. Approach IRB data repositories, originally designed for the longitudinal study of biological specimens, can be applied to medical education research. The benefits of such an approach include obtaining expedited review for multiple related studies within a single IRB application and allowing for more flexibility when conducting complex longitudinal studies involving large datasets from multiple data sources and/or institutions. In this paper, we inform educators and educational researchers on our analysis of the use of the IRB data repository approach to manage ethical considerations as part of best practices for amassing, pooling, and sharing data for educational research, evaluation, and improvement purposes. Implications Fostering multi-institutional studies while following sound ethical principles in the study of medical education is needed, and the IRB data repository approach has many benefits, especially for longitudinal assessment of complex multi-site data. PMID:27443407

  2. Oceanotron, Scalable Server for Marine Observations

    NASA Astrophysics Data System (ADS)

    Loubrieu, T.; Bregent, S.; Blower, J. D.; Griffiths, G.

    2013-12-01

    Ifremer, the French marine institute, is deeply involved in data management for different ocean in-situ observation programs (ARGO, OceanSites, GOSUD, ...) and for other European programs aiming at networking ocean in-situ observation data repositories (myOcean, seaDataNet, Emodnet). To capitalize on the effort of implementing advanced data dissemination services (visualization, download with subsetting) for these programs, and for water-column observation repositories generally, Ifremer decided in 2010 to develop the oceanotron server. Given the diversity of data repository formats (RDBMS, netCDF, ODV, ...) and the heterogeneity of the standard interoperability interface profiles (OGC/WMS, OGC/WFS, OGC/SOS, OPeNDAP, ...), the server is designed to manage plugins: - StorageUnits: which read specific data repository formats (netCDF/OceanSites, RDBMS schema, ODV binary format). - FrontDesks: which receive external requests and send results for interoperable protocols (OGC/WMS, OGC/SOS, OPeNDAP). In between, a third type of plugin may be inserted: - TransformationUnits: which perform ocean-business-related transformations of the features (for example, conversion of vertical coordinates from pressure in decibars to meters below the sea surface). The server is released under an open-source license so that partners can develop their own plugins. Within the MyOcean project, the University of Reading has plugged in a WMS implementation as an oceanotron FrontDesk. The modules are connected by sharing the same information model for marine observations (or sampling features: vertical profiles, point series, and trajectories), dataset metadata, and queries. The shared information model is based on the OGC Observations & Measurements and Unidata Common Data Model initiatives. The model is implemented in Java (http://www.ifremer.fr/isi/oceanotron/javadoc/). 
This inner interoperability layer makes it possible to capitalize on ocean-domain expertise in software development without being tied to specific data formats or protocols. Oceanotron is deployed at seven European data centres for marine in-situ observations within MyOcean. While additional extensions are still being developed, to promote new collaborative initiatives work is now under way on continuous and distributed integration (Jenkins, Maven), shared reference documentation (on Alfresco), and code and release dissemination (SourceForge, GitHub).
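The plugin layering this record describes (StorageUnit reads a native format into a shared feature model, a TransformationUnit reshapes the feature, a FrontDesk serves it over a protocol) can be sketched as follows. Oceanotron itself is written in Java; this is a minimal Python sketch with invented class names and an invented, deliberately crude unit conversion, shown only to make the layering concrete.

```python
# Minimal sketch of the StorageUnit -> TransformationUnit -> FrontDesk
# layering, all components sharing one feature model. Illustrative only.

class VerticalProfile:
    """Shared feature model: depth-ordered values for one variable."""
    def __init__(self, variable, coords, values):
        self.variable = variable
        self.coords = coords      # vertical coordinate values
        self.values = values


class StorageUnit:
    """Reads one repository format into the shared feature model."""
    def read(self):
        # Stand-in for e.g. a netCDF/OceanSites reader.
        return VerticalProfile("TEMP", [0, 10, 20], [15.2, 14.8, 13.9])


class PressureToDepth:
    """TransformationUnit: convert pressure coordinates to depth."""
    def apply(self, profile):
        # Very rough illustrative factor, not a real oceanographic formula.
        profile.coords = [round(p * 0.99, 2) for p in profile.coords]
        return profile


class FrontDesk:
    """Serves features through one interoperable protocol facade."""
    def respond(self, profile):
        return {"variable": profile.variable,
                "n_levels": len(profile.values)}


profile = PressureToDepth().apply(StorageUnit().read())
print(FrontDesk().respond(profile))  # {'variable': 'TEMP', 'n_levels': 3}
```

Because every plugin speaks the same feature model, a new storage format or protocol can be added without touching the other layers, which is the point the record makes about capitalizing domain expertise.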

  3. Waste Isolation Safety Assessment Program. Task 4. Third Contractor Information Meeting. [Adsorption-desorption on geological media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-06-01

    The study subject of this meeting was the adsorption and desorption of radionuclides on geologic media under repository conditions. This volume contains eight papers. Separate abstracts were prepared for all eight papers. (DLC)

  4. NCTN/NCORP Data Archive: Expanding Access to Clinical Trial Data

    Cancer.gov

    NCI is launching the NCTN/NCORP Data Archive, a centralized repository of patient-level data from phase III clinical trials conducted by NCI’s NCTN and NCORP trials programs and the National Cancer Institute of Canada-Clinical Trials Group.

  5. Dynameomics: design of a computational lab workflow and scientific data repository for protein simulations.

    PubMed

    Simms, Andrew M; Toofanny, Rudesh D; Kehl, Catherine; Benson, Noah C; Daggett, Valerie

    2008-06-01

    Dynameomics is a project to investigate and catalog the native-state dynamics and thermal unfolding pathways of representatives of all protein folds using solvated molecular dynamics simulations, as described in the preceding paper. Here we introduce the design of the molecular dynamics data warehouse, a scalable, reliable repository for simulation data that vastly simplifies management and access. In the succeeding paper, we describe the development of a complementary multidimensional database. A single protein unfolding or native-state simulation can take weeks to months to complete, and produces gigabytes of coordinate and analysis data. Mining information from over 3000 completed simulations is complicated and time-consuming. Even the simplest queries involve writing intricate programs that must be built from low-level file system access primitives and include significant logic to correctly locate and parse data of interest. As a result, programs to answer questions that require data from hundreds of simulations are very difficult to write. Thus, organization and access to simulation data have been major obstacles to the discovery of new knowledge in the Dynameomics project. This repository is used internally and is the foundation of the Dynameomics portal site http://www.dynameomics.org. By organizing simulation data into a scalable, manageable and accessible form, we can begin to address substantial questions that move us closer to solving biomedical and bioengineering problems.
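The contrast the abstract draws (custom file-parsing programs versus one declarative query over a warehouse) can be illustrated with a toy relational schema. The table layout and values below are entirely hypothetical, not the actual Dynameomics schema; the point is only that a cross-simulation question becomes a single SQL statement.

```python
# Toy illustration: a cross-simulation question asked as one SQL query
# against a relational warehouse. Hypothetical schema, not Dynameomics'.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE simulation (sim_id INTEGER PRIMARY KEY, fold TEXT, temp_k REAL);
CREATE TABLE rmsd (sim_id INTEGER, time_ns REAL, ca_rmsd REAL);
""")
conn.executemany("INSERT INTO simulation VALUES (?, ?, ?)",
                 [(1, "globin", 298), (2, "globin", 498), (3, "sh3", 298)])
conn.executemany("INSERT INTO rmsd VALUES (?, ?, ?)",
                 [(1, 1.0, 1.25), (1, 2.0, 1.75),   # native-state run
                  (2, 1.0, 4.5), (2, 2.0, 6.1),     # thermal unfolding run
                  (3, 1.0, 1.0)])

# Mean C-alpha RMSD per fold for native-state (298 K) runs, in one query,
# instead of a bespoke program that walks and parses per-simulation files.
rows = conn.execute("""
    SELECT s.fold, AVG(r.ca_rmsd)
    FROM simulation s JOIN rmsd r ON s.sim_id = r.sim_id
    WHERE s.temp_k = 298
    GROUP BY s.fold ORDER BY s.fold
""").fetchall()
print(rows)  # [('globin', 1.5), ('sh3', 1.0)]
```

Scaling the same pattern to thousands of simulations is what the warehouse design is meant to make routine.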

  6. Libraries program

    USGS Publications Warehouse

    2011-01-01

    The U.S. Congress authorized a library for the U.S. Geological Survey (USGS) in 1879. The library was formally established in 1882 with the naming of the first librarian and began with a staff of three and a collection of 1,400 books. Today, the USGS Libraries Program is one of the world's largest Earth and natural science repositories and a resource of national significance used by researchers and the public worldwide.

  7. 2012 best practices for repositories collection, storage, retrieval, and distribution of biological materials for research international society for biological and environmental repositories.

    PubMed

    2012-04-01

    Third Edition Printed with permission from the International Society for Biological and Environmental Repositories (ISBER) © 2011 ISBER All Rights Reserved Editor-in-Chief Lori D. Campbell, PhD Associate Editors Fay Betsou, PhD Debra Leiolani Garcia, MPA Judith G. Giri, PhD Karen E. Pitt, PhD Rebecca S. Pugh, MS Katherine C. Sexton, MBA Amy P.N. Skubitz, PhD Stella B. Somiari, PhD Individual Contributors to the Third Edition Jonas Astrin, Susan Baker, Thomas J. Barr, Erica Benson, Mark Cada, Lori Campbell, Antonio Hugo Jose Froes Marques Campos, David Carpentieri, Omoshile Clement, Domenico Coppola, Yvonne De Souza, Paul Fearn, Kelly Feil, Debra Garcia, Judith Giri, William E. Grizzle, Kathleen Groover, Keith Harding, Edward Kaercher, Joseph Kessler, Sarah Loud, Hannah Maynor, Kevin McCluskey, Kevin Meagher, Cheryl Michels, Lisa Miranda, Judy Muller-Cohn, Rolf Muller, James O'Sullivan, Karen Pitt, Rebecca Pugh, Rivka Ravid, Katherine Sexton, Ricardo Luis A. Silva, Frank Simione, Amy Skubitz, Stella Somiari, Frans van der Horst, Gavin Welch, Andy Zaayenga 2012 Best Practices for Repositories: Collection, Storage, Retrieval and Distribution of Biological Materials for Research INTERNATIONAL SOCIETY FOR BIOLOGICAL AND ENVIRONMENTAL REPOSITORIES (ISBER) INTRODUCTION The availability of high quality biological and environmental specimens for research purposes requires the development of standardized methods for collection, long-term storage, retrieval and distribution of specimens that will enable their future use. Sharing successful strategies for accomplishing this goal is one of the driving forces for the International Society for Biological and Environmental Repositories (ISBER). For more information about ISBER see www.isber.org . ISBER's Best Practices for Repositories (Best Practices) reflect the collective experience of its members and have received broad input from other repository professionals. 
Throughout this document effective practices are presented for the management of specimen collections and repositories. The term "Best Practice" is used in cases where a level of operation is indicated that is above the basic recommended practice or more specifically designates the most effective practice. It is understood that repositories in certain locations or with particular financial constraints may not be able to adhere to each of the items designated as "Best Practices". Repositories fitting into either of these categories will need to decide how they might best adhere to these recommendations within their particular circumstances. While adherence to ISBER Best Practices is strictly on a voluntary basis, it is important to note that some aspects of specimen management are governed by national/federal, regional and local regulations. The reader should refer directly to regulations for their national/federal, regional and local requirements, as appropriate. ISBER has strived to include terminology appropriate to the various specimen types covered under these practices, but here too, the reader should take steps to ensure the appropriateness of the recommendations to their particular repository type prior to the implementation of any new approaches. Important terms within the document are italicized when first used in a section and defined in the glossary. The ISBER Best Practices are periodically reviewed and revised to reflect advances in research and technology. The third edition of the Best Practices builds on the foundation established in the first and second editions which were published in 2005 and 2008, respectively.

  8. The C6H6 NMR repository: An integral solution to control the flow of your data from the magnet to the public.

    PubMed

    Patiny, Luc; Zasso, Michaël; Kostro, Daniel; Bernal, Andrés; Castillo, Andrés M; Bolaños, Alejandro; Asencio, Miguel A; Pellet, Norman; Todd, Matthew; Schloerer, Nils; Kuhn, Stefan; Holmes, Elaine; Javor, Sacha; Wist, Julien

    2017-10-05

    NMR is a mature technique that is well established and adopted in a wide range of research facilities from laboratories to hospitals. This accounts for large amounts of valuable experimental data that may be readily exported into a standard and open format. Yet the publication of these data faces an important issue: raw data are not made available; instead, the information is slimmed down to a string of characters (the list of peaks). Although historical limitations of technology explain this practice, it is not acceptable in the era of the Internet. The idea of modernizing the strategy for sharing NMR data is not new, and some repositories exist, but sharing raw data is still not an established practice. Here, we present a powerful toolbox built on recent technologies that runs inside the browser and provides a means to store, share, analyse, and interact with original NMR data. Stored spectra can be streamlined into the publication pipeline, to improve the revision process for instance. The set of tools is still basic but is intended to be extended. The project is open source under the Massachusetts Institute of Technology (MIT) licence. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Brave New World or "Plus Ca Change"?: Electronic Journals and the Academic Library

    ERIC Educational Resources Information Center

    Burrows, Toby

    2006-01-01

    The impact that electronic information technologies have had on scholarly communications and university libraries is assessed. Early predictions that the dominance of commercial publishers would decline and journal prices would fall have not been realised. The development of institutional repositories has had limited success in making the…

  10. IT Challenges for Space Medicine

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy

    2010-01-01

    This viewgraph presentation reviews the various Information Technology challenges for aerospace medicine. The contents include: 1) Space Medicine Activities; 2) Private Medical Information; 3) Lifetime Surveillance of Astronaut Health; 4) Mission Medical Support; 5) Data Repositories for Research; 6) Data Input and Output; 7) Finding Data/Information; 8) Summary of Challenges; and 9) Solutions and questions.

  11. Integrating a Learning Management System with a Student Assignments Digital Repository. A Case Study

    ERIC Educational Resources Information Center

    Díaz, Javier; Schiavoni, Alejandra; Osorio, María Alejandra; Amadeo, Ana Paola; Charnelli, María Emilia

    2013-01-01

    The integration of different platforms and information systems in the academic environment is highly important and quite a challenge within the field of Information Technology. This integration allows for higher resource availability and improved interaction among intervening actors. In the field of e-Learning, where Learning Management Systems…

  12. Facilitating Teachers' Reuse of Mobile Assisted Language Learning Resources Using Educational Metadata

    ERIC Educational Resources Information Center

    Zervas, Panagiotis; Sampson, Demetrios G.

    2014-01-01

    Mobile assisted language learning (MALL) and open access repositories for language learning resources are both topics that have attracted the interest of researchers and practitioners in technology enhanced learning (TeL). Yet, there is limited experimental evidence about possible factors that can influence and potentially enhance reuse of MALL…

  13. Development of Prime Cyber-Infrastructure for Combustion

    DTIC Science & Technology

    2009-07-14

    6 Technologies used. 6.1 PrIMe Portal. The PrIMe portal is executed using the PHP language with the help of the CMF Drupal 5. The standard... modules of the Drupal core set are developed by third parties and obtained from the repository drupal.org. Part of the modules was modified specifically for

  14. The Electronic Studio and the Intranet: Network-Based Learning.

    ERIC Educational Resources Information Center

    Solis, Carlos R.

    The Electronic Studio, developed by the Rice University (Texas) Center for Technology in Teaching and Learning (CTTL), serves a number of purposes related to the construction and development of learning projects. It is a workplace, a display area, and a repository for tools, data, multimedia, design projects, and personal papers. This paper…

  15. Online Concept Maps: Enhancing Collaborative Learning by Using Technology with Concept Maps.

    ERIC Educational Resources Information Center

    Canas, Alberto J.; Ford, Kenneth M.; Novak, Joseph D.; Hayes, Patrick; Reichherzer, Thomas R.; Suri, Niranjan

    2001-01-01

    Describes a collaborative software system that allows students from distant schools to share claims derived from their concept maps. Sharing takes place by accessing The Knowledge Soup, a repository of propositions submitted by students and stored on a computer server. Students can use propositions from other students to enhance their concept…

  16. Constant Aims, Changing Technologies: Photostat and Microfilm Publishing at the Massachusetts Historical Society.

    ERIC Educational Resources Information Center

    Wright, Conrad Edick

    1995-01-01

    Discusses the Massachusetts Historical Society's various photostat and microfilming projects and describes how they grew out of the philosophy of the society's founder, the Reverend Jeremy Belknap. Topics include the preservation of the papers of presidents John Adams and John Quincy Adams, and collaboration with other repositories. (Author/JKP)

  17. So You Want to Be Trustworthy: A Repository's Guide to Taking Reasonable Steps Towards Achieving ISO 16363

    NASA Astrophysics Data System (ADS)

    Stall, S.

    2016-12-01

    To be trustworthy is to be reliable, dependable, honest, principled, ethical, incorruptible, and more. A trustworthy person demonstrates these qualities over time and under all circumstances. A trustworthy repository demonstrates these qualities through the team that manages the repository and its responsible organization. The requirements of a Trusted Digital Repository (TDR) in ISO 16363 can be tough to reach and tough to maintain. Challenges include: limited funds, limited resources and/or skills, and an unclear path to successfully achieve the requirements. The ISO standard defines each requirement separately, but a successful certification recognizes that there are many cross-dependencies among the requirements. Understanding these dependencies leads to a more efficient path towards success. At AGU we recognize that reaching the goal of the TDR ISO standard, or any set of data management objectives defined by an organization, has a better chance at success if the organization clearly knows their current capability, the improvements that are needed, and the best way to make (and maintain) those changes. AGU has partnered with the CMMI® Institute to adapt their Data Management Maturity (DMM)SM model within the Earth and space sciences. Using the DMM, AGU developed a new Data Management Assessment Program aimed at helping data repositories, large and small, domain-specific to general, assess and improve data management practices to meet their goals - including becoming a Trustworthy Digital Repository. The requirements to achieve the TDR ISO standard are aligned to the data management best practices defined in the Data Management Maturity (DMM)SM model. Using the DMM as a process improvement tool in conjunction with the Data Management Assessment method, a team seeking the objective of the TDR ISO standard receives a clear road map to achieving their goal as an outcome of the assessment. 
Publishers and agencies are beginning to recommend or even require that repositories demonstrate that they are following best practices or meeting certain standards. Data preserved in a data facility that is working toward a TDR standard will have the level of care desired by both the publishing community and the science community. Better Data Management results in Better Science.

  18. 78 FR 63455 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-24

    ..., Building 23, Columbus, OH 43213-1152. Defense Manpower Data Center, 400 Gigling Road, Seaside CA 93955... web-based system providing a repository of military, Government civilian and contractor personnel and..., tracking, reporting, evaluating program effectiveness and conducting research. The Total Operational...

  19. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  20. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  1. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  2. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  3. 10 CFR 60.140 - General requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and it will continue until permanent closure. (c) The program shall include in situ monitoring, laboratory and field testing, and in situ experiments, as may be appropriate to accomplish the objective as... REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  4. Establishment and evolution of the Australian Inherited Retinal Disease Register and DNA Bank.

    PubMed

    De Roach, John N; McLaren, Terri L; Paterson, Rachel L; O'Brien, Emily C; Hoffmann, Ling; Mackey, David A; Hewitt, Alex W; Lamey, Tina M

    2013-07-01

    Inherited retinal disease represents a significant cause of blindness and visual morbidity worldwide. With the development of emerging molecular technologies, accessible and well-governed repositories of data characterising inherited retinal disease patients are becoming increasingly important. This manuscript introduces such a repository. Participants were recruited from the Retina Australia membership, through the Royal Australian and New Zealand College of Ophthalmologists, and by recruitment of suitable patients attending the Sir Charles Gairdner Hospital visual electrophysiology clinic. Four thousand one hundred ninety-three participants were recruited. All participants were members of families in which the proband was diagnosed with an inherited retinal disease (excluding age-related macular degeneration). Clinical and family information was collected by interview with the participant and by examination of medical records. In 2001, we began collecting DNA from Western Australian participants. In 2009 this activity was extended Australia-wide. Genetic analysis results were stored in the register as they were obtained. The main outcome measurement was the number of DNA samples (with associated phenotypic information) collected from Australian inherited retinal disease-affected families. DNA was obtained from 2873 participants. Retinitis pigmentosa, Stargardt disease and Usher syndrome participants comprised 61.0%, 9.9% and 6.4% of the register, respectively. This resource is a valuable tool for investigating the aetiology of inherited retinal diseases. As new molecular technologies are translated into clinical applications, this well-governed repository of clinical and genetic information will become increasingly relevant for tasks such as identifying candidates for gene-specific clinical trials. © 2012 The Authors. Clinical and Experimental Ophthalmology © 2012 Royal Australian and New Zealand College of Ophthalmologists.

  5. Using Controlled Vocabularies and Semantics to Improve Ocean Data Discovery (Invited)

    NASA Astrophysics Data System (ADS)

    Chandler, C. L.; Groman, R. C.; Shepherd, A.; Allison, M. D.; Kinkade, D.; Rauch, S.; Wiebe, P. H.; Glover, D. M.

    2013-12-01

The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in late 2006, by combining the formerly independent data management offices for the U.S. GLOBal Ocean ECosystems Dynamics (GLOBEC) and U.S. Joint Global Ocean Flux Study (JGOFS) programs. BCO-DMO staff members work with investigators to publish data from research projects funded by the NSF Geosciences Directorate (GEO) Division of Ocean Sciences (OCE) Biological and Chemical Oceanography Sections and Polar Programs (PLR) Antarctic Sciences Organisms & Ecosystems Program (ANT). Since 2006, researchers have been contributing new data to the BCO-DMO data system. As the data from new research efforts have been added to the data previously shared by U.S. GLOBEC and U.S. JGOFS researchers, the BCO-DMO system has developed into a rich repository of data from ocean, coastal, and Great Lakes research programs. The metadata records for the original research program data (prior to 2006) were stored in human-readable flat files of text, translated on-demand to Web-retrievable files. Beginning in 2006, the metadata records from multiple data systems managed by BCO-DMO were ingested into a relational database (MySQL). Since that time, efforts have been made to incorporate lists of controlled vocabulary terms for key information concepts stored in the MySQL database (e.g. names of research programs, deployments, instruments and measurements). This presents a challenge for a data system that includes legacy data and is continually expanding with the addition of new contributions. Over the years, BCO-DMO has developed a series of data delivery systems driven by the supporting metadata. Improved access to research data, a primary goal of the BCO-DMO project, is achieved through geospatial and text-based data access systems that support data discovery, access, display, assessment, integration, and export of data resources.
The addition of a semantically-enabled search capability improves data discovery options particularly for those investigators whose research interests are cross-domain and multi-disciplinary. Current efforts by BCO-DMO staff members are focused on identifying globally unique, persistent identifiers to unambiguously identify resources of interest curated by and available from BCO-DMO. The process involves several essential components: (1) identifying a trusted authoritative source of complementary content and the appropriate contact; (2) determining the globally unique, persistent identifier system for resources of interest and (3) negotiating the requisite syntactic and semantic exchange systems. A variety of technologies have been deployed including: (1) controlled vocabulary term lists for some of the essential concepts/classes; (2) the Ocean Data Ontology; (3) publishing content as Linked Open Data and (4) SPARQL queries and inference. The final results are emerging as a semantic layer comprising domain-specific controlled vocabularies typed to community standard definitions, an ontology with the concepts and relationships needed to describe ocean data, a semantically-enabled faceted search, and inferencing services. We are exploring use of these technologies to improve the accuracy of the BCO-DMO data collection and to facilitate exchange of information with complementary ocean data repositories. Integrating a semantic layer into the BCO-DMO data system architecture improves data and information resource discovery, access and integration.
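The combination described above (controlled vocabulary terms normalizing legacy free-text metadata, feeding a faceted search) can be sketched in miniature. This is an illustrative toy, not BCO-DMO code; the vocabulary terms, record fields, and values are invented for the example.

```python
# Toy sketch: a controlled vocabulary maps varied free-text instrument
# names to one canonical term, so legacy and new metadata records become
# comparable; a faceted search then filters on the canonical fields.
# All terms and records here are hypothetical.
INSTRUMENT_VOCAB = {
    "CTD": "CTD profiler",
    "ctd rosette": "CTD profiler",
    "fluorometer": "Fluorometer",
}

RECORDS = [
    {"program": "US JGOFS", "instrument": "ctd rosette", "param": "salinity"},
    {"program": "US GLOBEC", "instrument": "CTD", "param": "temperature"},
    {"program": "US GLOBEC", "instrument": "fluorometer", "param": "chlorophyll"},
]

def normalize(record):
    """Replace the raw instrument string with its canonical vocabulary term."""
    rec = dict(record)
    rec["instrument"] = INSTRUMENT_VOCAB.get(rec["instrument"], rec["instrument"])
    return rec

def faceted_search(records, **facets):
    """Return normalized records matching every requested facet value."""
    normalized = [normalize(r) for r in records]
    return [r for r in normalized
            if all(r.get(k) == v for k, v in facets.items())]

hits = faceted_search(RECORDS, instrument="CTD profiler")
print([r["program"] for r in hits])  # both the JGOFS and a GLOBEC record match
```

Without the vocabulary mapping, a search for "CTD profiler" would miss both legacy spellings; the semantic layer in the abstract plays this role at much larger scale, with vocabularies typed to community standard definitions.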

  6. A Predictive Approach to Eliminating Errors in Software Code

    NASA Technical Reports Server (NTRS)

    2006-01-01

NASA's Metrics Data Program Data Repository is a database that stores problem, product, and metrics data. The primary goal of this data repository is to provide project data to the software community. In doing so, the Metrics Data Program collects artifacts from a large NASA dataset, generates metrics on the artifacts, and then generates reports that are made available to the public at no cost. The data that are made available to general users have been sanitized and authorized for publication through the Metrics Data Program Web site by officials representing the projects from which the data originated. The data repository is operated by NASA's Independent Verification and Validation (IV&V) Facility, which is located in Fairmont, West Virginia, a high-tech hub for emerging innovation in the Mountain State. The IV&V Facility was founded in 1993, under the NASA Office of Safety and Mission Assurance, as a direct result of recommendations made by the National Research Council and the Report of the Presidential Commission on the Space Shuttle Challenger Accident. Today, under the direction of Goddard Space Flight Center, the IV&V Facility continues its mission to provide the highest achievable levels of safety and cost-effectiveness for mission-critical software. By extending its data to public users, the facility has helped improve the safety, reliability, and quality of complex software systems throughout private industry and other government agencies. Integrated Software Metrics, Inc., is one of the organizations that has benefited from studying the metrics data. As a result, the company has evolved into a leading developer of innovative software-error prediction tools that help organizations deliver better software, on time and on budget.

  7. Death Valley Lower Carbonate Aquifer Monitoring Program Wells Down Gradient of the Proposed Yucca Mountain Nuclear Waste Repository, U. S. Department of Energy Grant DE-RW0000233 2010 Project Report, prepared by The Hydrodynamics Group, LLC for Inyo County Yucca Mountain Repository Assessment Office

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Michael J; Bredehoeft, John D., Dr.

    2010-09-03

Inyo County completed the first year of the U.S. Department of Energy Grant Agreement No. DE-RW0000233. This report presents the results of research conducted within this Grant agreement in the context of Inyo County's Yucca Mountain oversight program goals and objectives. The Hydrodynamics Group, LLC prepared this report for the Inyo County Yucca Mountain Repository Assessment Office. The overall goal of Inyo County's Yucca Mountain research program is the evaluation of far-field issues related to the potential transport, by ground water, of radionuclides into Inyo County, including Death Valley, and the evaluation of a connection between the Lower Carbonate Aquifer (LCA) and the biosphere. Data collected within the Grant are included in interpretive illustrations and discussions of the results of our analysis. The central elements of this Grant program were the drilling of exploratory wells, geophysical surveys, and geological mapping of the Southern Funeral Mountain Range. The culmination of this research was 1) a numerical ground water model of the Southern Funeral Mountain Range demonstrating the potential of a hydraulic connection between the LCA and the major springs in the Furnace Creek area of Death Valley, and 2) a numerical ground water model of the Amargosa Valley to evaluate the potential for radionuclide transport from Yucca Mountain to Inyo County, California. The report provides a description of research and activities performed by The Hydrodynamics Group, LLC on behalf of Inyo County, and copies of key work products in attachments to this report.

  8. Interfaces to PeptideAtlas: a case study of standard data access systems

    PubMed Central

    Handcock, Jeremy; Robinson, Thomas; Deutsch, Eric W.; Boyle, John

    2012-01-01

Access to public data sets is important to the scientific community as a resource to develop new experiments or validate new data. Projects such as the PeptideAtlas, Ensembl and The Cancer Genome Atlas (TCGA) offer both access to public data and a repository to share their own data. Access to these data sets is often provided through a web page form and a web service API. Access technologies based on web protocols (e.g. HTTP) have been in use for over a decade and are widely adopted across the industry for a variety of functions (e.g. search, commercial transactions, and social media). Each architecture adapts these technologies to provide users with tools to access and share data. Both commonly used web service technologies (e.g. REST and SOAP) and custom-built solutions over HTTP are utilized in providing access to research data. Providing multiple access points ensures that the community can access the data in the simplest and most effective manner for their particular needs. This article examines three common access mechanisms for web-accessible data: BioMart, caBIG, and Google Data Sources. These are illustrated by implementing each over the PeptideAtlas repository and reviewed for their suitability based on specific usages common to research. BioMart, Google Data Sources, and caBIG are each suitable for certain uses. The tradeoffs made in the development of the technology are dependent on the uses each was designed for (e.g. security versus speed). This means that an understanding of specific requirements and tradeoffs is necessary before selecting the access technology. PMID:22941959
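The REST style of access the abstract compares can be sketched in a few lines: a query is simply an HTTP GET with parameters encoded into the URL. The endpoint and parameter names below are invented for illustration, not the actual PeptideAtlas API.

```python
# Hypothetical sketch of REST-style data access: query semantics live
# entirely in the URL, so any HTTP client can fetch the result.
# The base URL and parameter names are invented, not a real API.
from urllib.parse import urlencode

BASE = "https://example.org/peptideatlas/api/query"

def build_query_url(organism, min_prob, fmt="tsv"):
    """Encode query parameters into a GET URL."""
    params = {"organism": organism, "min_probability": min_prob, "format": fmt}
    return BASE + "?" + urlencode(params)

url = build_query_url("human", 0.9)
# A client would now fetch `url` with any HTTP library. The server's
# choice of REST vs. SOAP vs. a custom protocol determines how much of
# the request semantics live in the URL versus the request body.
print(url)
```

This transparency is one side of the tradeoff the article discusses: SOAP and custom protocols can carry richer typed payloads and security context, at the cost of the universal, cache-friendly simplicity shown here.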

  9. Tourism impacts of Three Mile Island and other adverse events: Implications for Lincoln County and other rural counties bisected by radioactive wastes intended for Yucca Mountain

    NASA Astrophysics Data System (ADS)

    Himmelberger, Jeffery J.; Baughman, Mike; Ogneva-Himmelberger, Yelena A.

    1995-11-01

Whether the proposed Yucca Mountain nuclear waste repository system will adversely impact tourism in southern Nevada is an open question of particular importance to visitor-oriented rural counties bisected by planned waste transportation corridors (highway or rail). As part of one such county's repository impact assessment program, the tourism implications of Three Mile Island (TMI) and other major hazard events have been revisited to inform ongoing county-wide socioeconomic assessments and contingency planning efforts. This paper summarizes the key implications of such research as applied to Lincoln County, Nevada. Implications for other rural counties are discussed in light of the research findings.

  10. Politics of nuclear waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colglazier, E.W. Jr.

    1982-01-01

In November of 1979, the Program in Science, Technology and Humanism and the Energy Committee of the Aspen Institute organized a conference on resolving the social, political, and institutional conflicts over the permanent siting of radioactive wastes. This book was written as a result of this conference. The chapters provide a comprehensive and up-to-date overview of the governance issues connected with radioactive waste management, as well as a sampling of the diverse views of the interested parties. Chapter 1 looks in depth at radioactive waste management in the United States, with special emphasis on the events of the Carter administration as well as on the issues with which the Reagan administration must deal. Chapter 2 compares waste management policies and programs among the industrialized countries. Chapter 3 examines the factional controversies in the last administration and Congress over nuclear waste issues. Chapter 4 examines the complex legal questions involved in the federal-state conflicts over nuclear waste management. Chapter 5 examines the concept of consultation and concurrence from the perspectives of a host state that is a candidate for a repository and an interested state that has special concerns regarding the demonstration of nuclear waste disposal technology. Chapter 6 examines US and European perspectives concerning public participation in nuclear waste management. Chapter 7 discusses the role of propaganda in these issues. The epilogue attempts to assess the prospects for consensus in the United States on national policies for radioactive waste management. All of the chapters in this book should be interpreted as personal assessments. (DP)

  11. 10 CFR 63.143 - Implementation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Implementation. 63.143 Section 63.143 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Quality Assurance § 63.143 Implementation. DOE shall implement a quality assurance program...

  12. Scrubchem: Building Bioactivity Datasets from Pubchem Bioassay Data (SOT)

    EPA Science Inventory

    The PubChem Bioassay database is a non-curated public repository with data from 64 sources, including: ChEMBL, BindingDb, DrugBank, EPA Tox21, NIH Molecular Libraries Screening Program, and various other academic, government, and industrial contributors. Methods for extracting th...

  13. 10 CFR 63.132 - Confirmation of geotechnical and design parameters.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Commission. (e) In situ monitoring of the thermomechanical response of the underground facility must be... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Performance Confirmation Program § 63.132... engineered systems and components, must be identified in the performance confirmation plan. (d) These...

  14. 10 CFR 63.132 - Confirmation of geotechnical and design parameters.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Commission. (e) In situ monitoring of the thermomechanical response of the underground facility must be... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Performance Confirmation Program § 63.132... engineered systems and components, must be identified in the performance confirmation plan. (d) These...

  15. 10 CFR 63.132 - Confirmation of geotechnical and design parameters.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Commission. (e) In situ monitoring of the thermomechanical response of the underground facility must be... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Performance Confirmation Program § 63.132... engineered systems and components, must be identified in the performance confirmation plan. (d) These...

  16. 10 CFR 60.141 - Confirmation of geotechnical and design parameters.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... reported to the Commission. (e) In situ monitoring of the thermomechanical response of the underground... IN GEOLOGIC REPOSITORIES Performance Confirmation Program § 60.141 Confirmation of geotechnical and... needed in design to accommodate actual field conditions encountered. (b) Subsurface conditions shall be...

  17. 10 CFR 60.141 - Confirmation of geotechnical and design parameters.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... reported to the Commission. (e) In situ monitoring of the thermomechanical response of the underground... IN GEOLOGIC REPOSITORIES Performance Confirmation Program § 60.141 Confirmation of geotechnical and... needed in design to accommodate actual field conditions encountered. (b) Subsurface conditions shall be...

  18. 10 CFR 60.141 - Confirmation of geotechnical and design parameters.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... reported to the Commission. (e) In situ monitoring of the thermomechanical response of the underground... IN GEOLOGIC REPOSITORIES Performance Confirmation Program § 60.141 Confirmation of geotechnical and... needed in design to accommodate actual field conditions encountered. (b) Subsurface conditions shall be...

  19. 10 CFR 63.132 - Confirmation of geotechnical and design parameters.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Commission. (e) In situ monitoring of the thermomechanical response of the underground facility must be... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Performance Confirmation Program § 63.132... engineered systems and components, must be identified in the performance confirmation plan. (d) These...

  20. 10 CFR 63.132 - Confirmation of geotechnical and design parameters.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Commission. (e) In situ monitoring of the thermomechanical response of the underground facility must be... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Performance Confirmation Program § 63.132... engineered systems and components, must be identified in the performance confirmation plan. (d) These...

  1. Field of genes: using Apache Kafka as a bioinformatic data repository

    PubMed Central

    Lynch, Richard; Walsh, Paul

    2018-01-01

    Abstract Background Bioinformatic research is increasingly dependent on large-scale datasets, accessed either from private or public repositories. An example of a public repository is National Center for Biotechnology Information's (NCBI’s) Reference Sequence (RefSeq). These repositories must decide in what form to make their data available. Unstructured data can be put to almost any use but are limited in how access to them can be scaled. Highly structured data offer improved performance for specific algorithms but limit the wider usefulness of the data. We present an alternative: lightly structured data stored in Apache Kafka in a way that is amenable to parallel access and streamed processing, including subsequent transformations into more highly structured representations. We contend that this approach could provide a flexible and powerful nexus of bioinformatic data, bridging the gap between low structure on one hand, and high performance and scale on the other. To demonstrate this, we present a proof-of-concept version of NCBI’s RefSeq database using this technology. We measure the performance and scalability characteristics of this alternative with respect to flat files. Results The proof of concept scales almost linearly as more compute nodes are added, outperforming the standard approach using files. Conclusions Apache Kafka merits consideration as a fast and more scalable but general-purpose way to store and retrieve bioinformatic data, for public, centralized reference datasets such as RefSeq and for private clinical and experimental data. PMID:29635394
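The near-linear scaling the authors report rests on a basic property of a partitioned log like Kafka's: each record key is deterministically hashed to one partition, and each consumer in a group reads only its own partitions, so throughput grows with the partition count. The sketch below mirrors that key-hashing idea in plain Python; the accessions, partition count, and hash choice are illustrative assumptions, not the authors' implementation.

```python
# Sketch of key-based partitioning, the property that lets a Kafka-backed
# repository be consumed in parallel. The partition rule below imitates a
# producer's default key hashing; accession names and the partition count
# are invented for illustration.
import zlib

NUM_PARTITIONS = 4

def partition_for(key: str) -> int:
    """Deterministically map a record key (e.g. a sequence accession)
    to a partition, as a keyed producer would."""
    return zlib.crc32(key.encode()) % NUM_PARTITIONS

accessions = [f"NC_{i:06d}" for i in range(1000)]
partitions = [partition_for(a) for a in accessions]

# One consumer per partition reads its share independently, so total
# read throughput scales with the number of partitions/consumers.
per_partition = [partitions.count(p) for p in range(NUM_PARTITIONS)]
print(per_partition)
```

Because the mapping is deterministic, all records for a given accession land in the same partition and are read in order there, which is what makes downstream streamed transformations into more structured representations straightforward.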

  2. U.S. Virgin Islands Petroleum Price-Spike Preparation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, C.

    2012-06-01

This NREL technical report details a plan for the U.S. Virgin Islands (USVI) to minimize the economic damage caused by major petroleum price increases. The assumptions for this plan are that the USVI will have very little time and money to implement it and that the population will be highly motivated to follow it because of high fuel prices. The plan's success, therefore, is highly dependent on behavior change. This plan was derived largely from a review of the actions taken and behavior changes made by companies and commuters throughout the United States in response to the oil price spike of 2008. Many of these solutions were coordinated by or reported through the 88 local representatives of the U.S. Department of Energy's Clean Cities program. The National Renewable Energy Laboratory provides technical and communications support for the Clean Cities program and therefore serves as a de facto repository of these solutions. This plan is the first publication that has tapped this repository.

  3. Case-oriented computer-based-training in radiology: concept, implementation and evaluation

    PubMed Central

    Dugas, Martin; Trumm, Christoph; Stäbler, Axel; Pander, Ernst; Hundt, Walter; Scheidler, Jurgen; Brüning, Roland; Helmberger, Thomas; Waggershauser, Tobias; Matzko, Matthias; Reiser, Maximillian

    2001-01-01

    Background Providing high-quality clinical cases is important for teaching radiology. We developed, implemented and evaluated a program for a university hospital to support this task. Methods The system was built with Intranet technology and connected to the Picture Archiving and Communications System (PACS). It contains cases for every user group from students to attendants and is structured according to the ACR-code (American College of Radiology) [2]. Each department member was given an individual account, could gather his teaching cases and put the completed cases into the common database. Results During 18 months 583 cases containing 4136 images involving all radiological techniques were compiled and 350 cases put into the common case repository. Workflow integration as well as individual interest influenced the personal efforts to participate but an increasing number of cases and minor modifications of the program improved user acceptance continuously. 101 students went through an evaluation which showed a high level of acceptance and a special interest in elaborate documentation. Conclusion Electronic access to reference cases for all department members anytime anywhere is feasible. Critical success factors are workflow integration, reliability, efficient retrieval strategies and incentives for case authoring. PMID:11686856

  4. Enabling long-term oceanographic research: Changing data practices, information management strategies and informatics

    NASA Astrophysics Data System (ADS)

    Baker, Karen S.; Chandler, Cynthia L.

    2008-09-01

    Interdisciplinary global ocean science requires new ways of thinking about data and data management. With new data policies and growing technological capabilities, datasets of increasing variety and complexity are being made available digitally and data management is coming to be recognized as an integral part of scientific research. To meet the changing expectations of scientists collecting data and of data reuse by others, collaborative strategies involving diverse teams of information professionals are developing. These changes are stimulating the growth of information infrastructures that support multi-scale sampling, data repositories, and data integration. Two examples of oceanographic projects incorporating data management in partnership with science programs are discussed: the Palmer Station Long-Term Ecological Research program (Palmer LTER) and the United States Joint Global Ocean Flux Study (US JGOFS). Lessons learned from a decade of data management within these communities provide an experience base from which to develop information management strategies—short-term and long-term. Ocean Informatics provides one example of a conceptual framework for managing the complexities inherent to sharing oceanographic data. Elements are introduced that address the economies-of-scale and the complexities-of-scale pertinent to a broader vision of information management and scientific research.

  5. BioSPICE: access to the most current computational tools for biologists.

    PubMed

    Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark

    2003-01-01

The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories integrated under the BioSPICE Dashboard, together with a methodology for continued software integration. These contributed software modules are integrated under the Dashboard, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.
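The workflow idea described above (independently contributed components chained into a pipeline: data source, model, display) can be sketched abstractly. The component names and behaviors below are invented stand-ins, not actual BioSPICE modules.

```python
# Toy sketch of dataflow composition as in a graphical workflow editor:
# each stage is an independently contributed component, and a workflow
# is just a chain of stages. All components here are hypothetical.
def data_source():
    """Stand-in for a contributed data-source component."""
    return [0.0, 0.5, 1.0]

def model(samples):
    """Stand-in for a contributed model/simulation component."""
    return [s * 2 for s in samples]

def display(values):
    """Stand-in for an output-display component."""
    return "plot(" + ", ".join(str(v) for v in values) + ")"

def run_workflow(*stages):
    """Chain components: the output of each stage feeds the next."""
    result = stages[0]()
    for stage in stages[1:]:
        result = stage(result)
    return result

print(run_workflow(data_source, model, display))  # -> "plot(0.0, 1.0, 2.0)"
```

The value of a framework like the Dashboard is that these stages can come from different laboratories and run on different machines; the chaining contract is what makes them interoperable.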

  6. Adaptive dynamic programming approach to experience-based systems identification and control.

    PubMed

    Lendaris, George G

    2009-01-01

Humans have the ability to make use of experience while selecting their control actions for distinct and changing situations, and this process speeds up and becomes more effective as more experience is gained. In contrast, current technological implementations slow down as more knowledge is stored. A novel way of employing Approximate (or Adaptive) Dynamic Programming (ADP) is described that shifts the underlying Adaptive Critic type of Reinforcement Learning method "up a level", away from designing individual (optimal) controllers to that of developing on-line algorithms that efficiently and effectively select designs from a repository of existing controller solutions (perhaps previously developed via application of ADP methods). The resulting approach is called the Higher-Level Learning Algorithm. The approach and its rationale are described and some examples of its application are given. The notions of context and context discernment are important to understanding the human abilities noted above. These are first defined, in a manner appropriate to controls and system identification, and, as a foundation relating to the application arena, a historical view of the various phases during development of the controls field is given, organized by how the notion of 'context' was, or was not, involved in each phase.
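The "select a design from a repository" idea can be sketched minimally: instead of synthesizing a controller online, pick the stored controller whose context descriptor is closest to the currently discerned context. This is a simplistic nearest-neighbor stand-in for the paper's learned selection algorithm; the context vectors and controller names are invented toy values.

```python
# Toy sketch: controller selection from a repository, keyed by context.
# A real Higher-Level Learning Algorithm would learn the selection
# policy; here a nearest-neighbor rule stands in for it. The contexts
# and controller labels are hypothetical.
import math

# Repository: context feature vector -> previously designed controller
REPOSITORY = {
    (1.0, 0.2): "low-load controller",
    (2.5, 0.8): "nominal controller",
    (4.0, 1.5): "high-load controller",
}

def select_controller(context):
    """Return the stored controller whose context is nearest (Euclidean)."""
    best = min(REPOSITORY, key=lambda c: math.dist(c, context))
    return REPOSITORY[best]

print(select_controller((2.4, 0.7)))  # -> "nominal controller"
```

Note the contrast the abstract draws: lookup in a repository like this gets cheaper relative to redesign as experience accumulates, whereas naive knowledge stores slow down as they grow.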

  7. Chemical Technology Division annual technical report, 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Battles, J.E.; Myles, K.M.; Laidler, J.J.

    1993-06-01

In this period, CMT conducted research and development in the following areas: (1) electrochemical technology, including advanced batteries and fuel cells; (2) technology for fluidized-bed combustion and coal-fired magnetohydrodynamics; (3) methods for treatment of hazardous waste, mixed hazardous/radioactive waste, and municipal solid waste; (4) the reaction of nuclear waste glass and spent fuel under conditions expected for an unsaturated repository; (5) processes for separating and recovering transuranic elements from nuclear waste streams, treating water contaminated with volatile organics, and concentrating radioactive waste streams; (6) recovery processes for discharged fuel and the uranium blanket in the Integral Fast Reactor (IFR); (7) processes for removal of actinides in spent fuel from commercial water-cooled nuclear reactors and burnup in IFRs; and (8) physical chemistry of selected materials (corium; Fe-U-Zr; tritium in LiAlO2) in environments simulating those of fission and fusion energy systems. The Division also conducts basic research in catalytic chemistry associated with molecular energy resources and novel ceramic precursors; materials chemistry of superconducting oxides, electrified metal/solution interfaces, and molecular sieve structures; and the geochemical processes involved in water-rock interactions occurring in active hydrothermal systems. In addition, the Analytical Chemistry Laboratory in CMT provides a broad range of analytical chemistry support services to the technical programs at Argonne National Laboratory (ANL).

  8. OERScout Technology Framework: A Novel Approach to Open Educational Resources Search

    ERIC Educational Resources Information Center

    Abeywardena, Ishan Sudeera; Chan, Chee Seng; Tham, Choy Yoong

    2013-01-01

    The open educational resources (OER) movement has gained momentum in the past few years. With this new drive towards making knowledge open and accessible, a large number of OER repositories have been established and made available online throughout the world. However, the inability of existing search engines such as Google, Yahoo!, and Bing to…

  9. "Different Strokes for Different Folks": Presenting EAD in Three UK Online Catalogues

    ERIC Educational Resources Information Center

    Hill, Amanda; Stockting, Bill; Higgins, Sarah

    2005-01-01

    This article discusses three different online services providing federated access to finding aids relating to archives found in a number of repositories: the Archives Hub, Access to Archives (A2A) and Navigational Aids for the History of Science, Technology and the Environment (NAHSTE). While the scale of the services is very different, a…

  10. Core Clinical Data Elements for Cancer Genomic Repositories: A Multi-stakeholder Consensus.

    PubMed

    Conley, Robert B; Dickson, Dane; Zenklusen, Jean Claude; Al Naber, Jennifer; Messner, Donna A; Atasoy, Ajlan; Chaihorsky, Lena; Collyar, Deborah; Compton, Carolyn; Ferguson, Martin; Khozin, Sean; Klein, Roger D; Kotte, Sri; Kurzrock, Razelle; Lin, C Jimmy; Liu, Frank; Marino, Ingrid; McDonough, Robert; McNeal, Amy; Miller, Vincent; Schilsky, Richard L; Wang, Lisa I

    2017-11-16

    The Center for Medical Technology Policy and the Molecular Evidence Development Consortium gathered a diverse group of more than 50 stakeholders to develop consensus on a core set of data elements and values essential to understanding the clinical utility of molecularly targeted therapies in oncology. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. ASK4Labs: A Web-Based Repository for Supporting Learning Design Driven Remote and Virtual Labs Recommendations

    ERIC Educational Resources Information Center

    Zervas, Panagiotis; Fiskilis, Stefanos; Sampson, Demetrios G.

    2014-01-01

    Over the past years, Remote and Virtual Labs (RVLs) have gained increased attention for their potential to support technology-enhanced science education by enabling science teachers to improve their day-to-day science teaching. Therefore, many educational institutions and scientific organizations have invested efforts for providing online access…

  12. Generation of openEHR Test Datasets for Benchmarking.

    PubMed

    El Helou, Samar; Karvonen, Tuukka; Yamamoto, Goshiro; Kume, Naoto; Kobayashi, Shinji; Kondo, Eiji; Hiragi, Shusuke; Okamoto, Kazuya; Tamura, Hiroshi; Kuroda, Tomohiro

    2017-01-01

    openEHR is a widely used EHR specification. Given its technology-independent nature, different approaches for implementing openEHR data repositories exist. Public openEHR datasets are needed to conduct benchmark analyses over different implementations. To address their current unavailability, we propose a method for generating openEHR test datasets that can be publicly shared and used.

  13. The Intranet: A New Concept for Corporate Information Handling.

    ERIC Educational Resources Information Center

    Barbera, Jose

    The World Wide Web model has evolved within companies from a repository for notice boards to a new tool that can improve work productivity. Intranets, the internal or corporate internets, are likely to be the key information technology revolution for the remainder of this century. The intranet concept is derived from the present Internet as a…

  14. Information Technology and the Evolution of the Library

    DTIC Science & Technology

    2009-03-01

Resource Commons/Repository/Federated Search ILS (GLADIS/Pathfinder - Millenium)/Catalog/Circulation/Acquisitions/Digital Object Content...content management services to help centralize and distribute digital content from across the institution, software to allow for seamless federated searching across multiple databases, and imaging software to allow for daily reimaging of terminals to reduce security concerns that otherwise

  15. 76 FR 81517 - Submission for Review and Comment: “The Menlo Report: Ethical Principles Guiding Information and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-28

    ...'' (``Menlo Report'') for the Department of Homeland Security (DHS), Science and Technology, Cyber Security Division (CSD), Protected Repository for the Defense of Infrastructure Against Cyber Threats (PREDICT... be found at: http://www.cyber.st.dhs.gov/wp-content/uploads/2011/12/MenloPrinciplesCORE-20110915-r560...

  16. The siting program of geological repository for spent fuel/high-level waste in Czech Republic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novotny, P.

    1993-12-31

    The management of high-level waste in the Czech Republic has a very short history, because before 1989 spent nuclear fuel was re-exported back to the USSR. The project "Geological research of a HLW repository in the Czech Republic" was initiated during 1990 by the Ministry of the Environment of the Czech Republic, which delegated the project to the Czech Geological Survey (CGU) Prague. The first CGU project, late in 1990, proposed a multibarrier-concept geological repository at a depth of about 500 m. Screening and studies of potential repository sites started in 1991. The first stage comprised regional siting across the Czech Republic for prospective rock types and massifs. In cooperation with GEOPHYSICS Co., the Geophysical Institute of the Czech Academy of Sciences, and Charles University Prague, 27 prospective regions were selected using IAEA criteria. This work was possible thanks to the detailed geological studies done in the past and to the numerous archive data concentrated in the central geological archive GEOFOND. Site selection also respected nature conservation areas and regions protecting water and mineral water resources. CGU established contact with countries with similar geological settings and started cooperation with SKB (Swedish Nuclear Fuel and Waste Management Co.). The project of geological research for the next 10 years is a result of these activities.

  17. The Environmental Data Initiative data repository: Trustworthy practices that foster preservation, fitness, and reuse for environmental and ecological data

    NASA Astrophysics Data System (ADS)

    Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.

    2017-12-01

    The Environmental Data Initiative (EDI) is an outgrowth of more than 30 years of information management experience and technology from LTER Network data practitioners. EDI builds upon the PASTA data repository software used by the LTER Network Information System and manages more than 42,000 data packages containing tabular data, imagery, and other formats. Development of the repository was a community process beginning in 2009 that included numerous working groups for generating use cases and system requirements and for testing completed software, thereby creating a vested interest in its success and transparency in its design. All software is available for review on GitHub, and refinements and new features are ongoing. Documentation is also available on Read-the-docs, including a comprehensive description of all web-service API methods. PASTA is metadata driven and uses the Ecological Metadata Language (EML) standard for describing environmental and ecological data; a simplified Dublin Core document is also available for each data package. Data are aggregated into packages consisting of metadata and other related content described by an OAI-ORE document. Once archived, each data package becomes immutable and permanent; updates are possible through the addition of new revisions. Components of each data package are accessible through a unique identifier, while the entire data package receives a DOI that is registered in DataCite. Preservation occurs through a combination of DataONE synchronization/replication and a series of local and remote backup strategies, including daily uploads to AWS Glacier storage. Checksums are computed for all data at initial upload, with random verification occurring on a continuous basis, thus ensuring the integrity of the data. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML before they are archived; data packages that fail any test are rejected by the repository. These tests are a measure of data fitness, which ultimately increases confidence in data reuse and synthesis. The EDI data repository is recognized by multiple organizations, including EarthCube's Council of Data Facilities, the United States Geological Survey, FAIRsharing.org, and re3data.org, and is a PLOS- and Nature-recommended data repository.
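    The checksum-at-upload-plus-random-spot-check pattern described above can be sketched in a few lines. This is a toy model under assumed names (the class and methods are hypothetical, not the PASTA API), using SHA-1 merely as an example digest:

```python
import hashlib
import random

def sha1_of(data: bytes) -> str:
    return hashlib.sha1(data).hexdigest()

class Archive:
    """Toy model of checksum-based integrity checking (names hypothetical)."""

    def __init__(self):
        self._store = {}      # package id -> bytes
        self._checksums = {}  # package id -> checksum recorded at ingest

    def upload(self, pkg_id: str, data: bytes):
        self._store[pkg_id] = data
        self._checksums[pkg_id] = sha1_of(data)  # computed once, at initial upload

    def verify_random(self, rng=random):
        """Spot-check one stored package against its recorded checksum."""
        pkg_id = rng.choice(list(self._store))
        return sha1_of(self._store[pkg_id]) == self._checksums[pkg_id]

archive = Archive()
archive.upload("pkg-1", b"temperature,depth\n12.1,5\n")
print(archive.verify_random())  # True while the stored bytes are intact
```

    Run continuously over random packages, such spot checks detect silent corruption without rehashing the whole repository on every pass.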

  18. JingleBells: A Repository of Immune-Related Single-Cell RNA-Sequencing Datasets.

    PubMed

    Ner-Gaon, Hadas; Melchior, Ariel; Golan, Nili; Ben-Haim, Yael; Shay, Tal

    2017-05-01

    Recent advances in single-cell RNA-sequencing (scRNA-seq) technology have increased our understanding of immune differentiation and activation processes, as well as of the heterogeneity of immune cell types. Although the number of available immune-related scRNA-seq datasets is increasing rapidly, their large size and varied formats make them hard for the wider immunology community to use, and read-level data are practically inaccessible to the non-computational immunologist. To facilitate dataset reuse, we created the JingleBells repository for immune-related scRNA-seq datasets ready for analysis and visualization of reads at the single-cell level (http://jinglebells.bgu.ac.il/). To this end, we collected the raw data of publicly available immune-related scRNA-seq datasets, aligned the reads to the relevant genome, and saved the aligned reads in a uniform format, annotated for cell of origin. We also added scripts and a step-by-step tutorial for visualizing each dataset at the single-cell level through the commonly used Integrated Genome Viewer (www.broadinstitute.org/igv/). The uniform scRNA-seq format used in JingleBells can facilitate reuse of scRNA-seq data by computational biologists. It also enables immunologists who are interested in a specific gene to visualize the reads aligned to that gene, to estimate cell-specific preferences for splicing, mutation load, or alleles. Thus JingleBells is a resource that will extend the usefulness of scRNA-seq datasets beyond the programming-aficionado realm. Copyright © 2017 by The American Association of Immunologists, Inc.

  19. Fundamentals of the NEA Thermochemical Database and its influence over national nuclear programs on the performance assessment of deep geological repositories.

    PubMed

    Ragoussi, Maria-Eleni; Costa, Davide

    2017-03-14

    For the last 30 years, the NEA Thermochemical Database (TDB) Project (www.oecd-nea.org/dbtdb/) has been developing a chemical thermodynamic database for elements relevant to the safety of radioactive waste repositories, providing data that are vital to support the geochemical modeling of such systems. The recommended data are selected on the basis of strict review procedures and are characterized by their consistency. The results of these efforts are freely available, and have become an international point of reference in the field. As a result, a number of important national initiatives with regard to waste management programs have used the NEA TDB as their basis, both in terms of recommended data and guidelines. In this article we describe the fundamentals and achievements of the project together with the characteristics of some databases developed in national nuclear waste disposal programs that have been influenced by the NEA TDB. We also give some insights on how this work could be seen as an approach to be used in broader areas of environmental interest. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. CellBase, a comprehensive collection of RESTful web services for retrieving relevant biological information from heterogeneous sources.

    PubMed

    Bleda, Marta; Tarraga, Joaquin; de Maria, Alejandro; Salavert, Francisco; Garcia-Alonso, Luz; Celma, Matilde; Martin, Ainoha; Dopazo, Joaquin; Medina, Ignacio

    2012-07-01

    In recent years, advances in high-throughput technologies have produced an unprecedented growth in the number and size of repositories and databases storing relevant biological data. Today, there is more biological information than ever but, unfortunately, the current status of many of these repositories is far from optimal. Some of the most common problems are that the information is spread out over many small databases, standards frequently differ among repositories, and some databases are no longer supported or contain overly specific and unconnected information. In addition, data size is increasingly becoming an obstacle when accessing or storing biological data. All these issues make it very difficult to extract and integrate information from different sources, to analyze experiments, or to access and query this information programmatically. CellBase provides a solution to the growing need for integration by easing access to biological data. CellBase implements a set of RESTful web services that query a centralized database containing the most relevant biological data sources. The database is hosted on our servers and is regularly updated. CellBase documentation can be found at http://docs.bioinfo.cipf.es/projects/cellbase.
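    A RESTful service of this kind is consumed by composing hierarchical resource URLs. The sketch below is illustrative only: the host is a placeholder and the path scheme and parameter names are assumptions in the spirit of CellBase's layout, not the exact CellBase API.

```python
from urllib.parse import urlencode

BASE = "http://example.org/cellbase/webservices/rest"  # placeholder host

def build_query(version, species, category, resource, ids, **params):
    """Compose a RESTful query URL of the version/species/category/id/resource
    shape (illustrative scheme, not the exact CellBase path grammar)."""
    path = f"{BASE}/{version}/{species}/{category}/{ids}/{resource}"
    return f"{path}?{urlencode(params)}" if params else path

# e.g. ask a hypothetical gene-info resource for BRCA2, results as JSON
url = build_query("v1", "hsapiens", "feature/gene", "info", "BRCA2", of="json")
print(url)
```

    Centralizing the path grammar in one helper keeps client code readable as the number of queried resources grows.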

  1. France's State of the Art Distributed Optical Fibre Sensors Qualified for the Monitoring of the French Underground Repository for High Level and Intermediate Level Long Lived Radioactive Wastes.

    PubMed

    Delepine-Lesoille, Sylvie; Girard, Sylvain; Landolt, Marcel; Bertrand, Johan; Planes, Isabelle; Boukenter, Aziz; Marin, Emmanuel; Humbert, Georges; Leparmentier, Stéphanie; Auguste, Jean-Louis; Ouerdane, Youcef

    2017-06-13

    This paper presents state-of-the-art distributed sensing systems, based on optical fibres, developed and qualified for the French Cigéo project, the underground repository for high level and intermediate level long-lived radioactive wastes. Four main parameters, namely strain, temperature, radiation, and hydrogen concentration, are currently investigated with optical fibre sensors, as are the tolerances of the selected technologies to the unique constraints of Cigéo's severe environment. Using fluorine-doped silica optical fibre surrounded by a carbon layer and a polyimide coating, it is possible to exploit its Raman, Brillouin and Rayleigh scattering signatures to achieve distributed sensing of the temperature and the strain inside the repository cells of radioactive wastes. Regarding dose measurement, promising solutions are proposed based on the Radiation Induced Attenuation (RIA) responses of sensitive fibres such as P-doped ones. For hydrogen measurements, the potential of specialty optical fibres with Pd particles embedded in their silica matrix is currently being studied, monitoring the gas through its impact on the evolution of the fibre's Brillouin signature.

  2. Entrez Neuron RDFa: a pragmatic Semantic Web application for data integration in neuroscience research

    PubMed Central

    Samwald, Matthias; Lim, Ernest; Masiar, Peter; Marenco, Luis; Chen, Huajun; Morse, Thomas; Mutalik, Pradeep; Shepherd, Gordon; Miller, Perry; Cheung, Kei-Hoi

    2013-01-01

    The amount of biomedical data available in Semantic Web formats has grown rapidly in recent years. While these formats are machine-friendly, user-friendly web interfaces allowing easy querying of these data are typically lacking. We present “Entrez Neuron”, a pilot neuron-centric interface that allows keyword-based queries against a coherent repository of OWL ontologies. These ontologies describe neuronal structures, physiology, mathematical models and microscopy images. The returned query results are organized hierarchically according to brain architecture. Where possible, the application makes use of entities from the Open Biomedical Ontologies (OBO) and the ‘HCLS knowledgebase’ developed by the W3C Interest Group for Health Care and Life Science. It makes use of the emerging RDFa standard to embed ontology fragments and semantic annotations within its HTML-based user interface. The application and underlying ontologies demonstrate how Semantic Web technologies can be used for information integration within and between curated information repositories. They also demonstrate how information integration can be accomplished on the client side, through simple copying and pasting of portions of documents that contain RDFa markup. PMID:19745321

  3. NCBI2RDF: enabling full RDF-based access to NCBI databases.

    PubMed

    Anguita, Alberto; García-Remesal, Miguel; de la Iglesia, Diana; Maojo, Victor

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to those databases, and, therefore, they cannot be integrated with other RDF-compliant databases or accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over the different NCBI repositories and presenting the query results to users in the SPARQL results format, thus enabling these data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves biomedical researchers time and effort. Our approach has been validated with a large number of SPARQL queries, proving its reliability and enhanced capabilities in biomedical environments.
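    The decomposition step described above, rewriting pieces of a SPARQL query as E-utilities calls, can be illustrated for a single triple pattern. The ESearch endpoint and its `db`/`term` parameters are real E-utilities conventions, but the predicate-to-field mapping table below is a hypothetical stand-in for the mediator's internal rewriting rules, not the NCBI2RDF implementation.

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

# Hypothetical mapping from RDF predicates to Entrez search field tags.
PREDICATE_TO_FIELD = {"hasTitle": "TITL", "hasAuthor": "AUTH"}

def triple_to_esearch(db: str, predicate: str, value: str) -> str:
    """Rewrite one triple pattern (?s <predicate> "value") as an ESearch URL.

    A toy version of the SPARQL-to-E-utilities rewriting a mediator must do:
    the object literal becomes a fielded Entrez term.
    """
    field = PREDICATE_TO_FIELD[predicate]
    term = f"{value}[{field}]"
    return f"{EUTILS}/esearch.fcgi?{urlencode({'db': db, 'term': term})}"

print(triple_to_esearch("pubmed", "hasAuthor", "Maojo V"))
```

    A full mediator would additionally join the ID lists returned by several such calls to resolve multi-pattern queries, which is where the cross-database expressiveness comes from.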

  4. France’s State of the Art Distributed Optical Fibre Sensors Qualified for the Monitoring of the French Underground Repository for High Level and Intermediate Level Long Lived Radioactive Wastes

    PubMed Central

    Delepine-Lesoille, Sylvie; Girard, Sylvain; Landolt, Marcel; Bertrand, Johan; Planes, Isabelle; Boukenter, Aziz; Marin, Emmanuel; Humbert, Georges; Leparmentier, Stéphanie; Auguste, Jean-Louis; Ouerdane, Youcef

    2017-01-01

    This paper presents state-of-the-art distributed sensing systems, based on optical fibres, developed and qualified for the French Cigéo project, the underground repository for high level and intermediate level long-lived radioactive wastes. Four main parameters, namely strain, temperature, radiation, and hydrogen concentration, are currently investigated with optical fibre sensors, as are the tolerances of the selected technologies to the unique constraints of Cigéo’s severe environment. Using fluorine-doped silica optical fibre surrounded by a carbon layer and a polyimide coating, it is possible to exploit its Raman, Brillouin and Rayleigh scattering signatures to achieve distributed sensing of the temperature and the strain inside the repository cells of radioactive wastes. Regarding dose measurement, promising solutions are proposed based on the Radiation Induced Attenuation (RIA) responses of sensitive fibres such as P-doped ones. For hydrogen measurements, the potential of specialty optical fibres with Pd particles embedded in their silica matrix is currently being studied, monitoring the gas through its impact on the evolution of the fibre’s Brillouin signature. PMID:28608831

  5. An Intelligent Cloud Storage Gateway for Medical Imaging.

    PubMed

    Viana-Ferreira, Carlos; Guerra, António; Silva, João F; Matos, Sérgio; Costa, Carlos

    2017-09-01

    Historically, medical imaging repositories have been supported by indoor infrastructures. However, the amount of diagnostic imaging procedures has continuously increased over the last decades, imposing several challenges associated with the storage volume, data redundancy and availability. Cloud platforms are focused on delivering hardware and software services over the Internet, becoming an appealing solution for repository outsourcing. Although this option may bring financial and technological benefits, it also presents new challenges. In medical imaging scenarios, communication latency is a critical issue that still hinders the adoption of this paradigm. This paper proposes an intelligent Cloud storage gateway that optimizes data access times. This is achieved through a new cache architecture that combines static rules and pattern recognition for eviction and prefetching. The evaluation results, obtained from experiments over a real-world dataset, show that cache hit ratios can reach around 80%, leading to reductions of image retrieval times by over 60%. The combined use of eviction and prefetching policies proposed can significantly reduce communication latency, even when using a small cache in comparison to the total size of the repository. Apart from the performance gains, the proposed system is capable of adjusting to specific workflows of different institutions.
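    The combination of an eviction policy with a prefetching rule, the core of the gateway cache above, can be sketched as follows. This is a minimal sketch under stated assumptions: the paper's policies combine static rules with learned access patterns, whereas here eviction is plain LRU and prefetching is a single static "next image in the series" rule; the class and key format (`series-index`) are invented for illustration.

```python
from collections import OrderedDict

class ImagingCache:
    """Gateway cache sketch: LRU eviction plus sequential prefetch."""

    def __init__(self, capacity, backend_fetch):
        self.capacity = capacity
        self.fetch = backend_fetch    # function: key -> image data (remote call)
        self.entries = OrderedDict()  # insertion order tracks recency

    def _put(self, key, value):
        self.entries[key] = value
        self.entries.move_to_end(key)
        while len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict least recently used

    def get(self, key):
        if key in self.entries:               # cache hit: no remote latency
            self.entries.move_to_end(key)
            return self.entries[key]
        value = self.fetch(key)               # miss: retrieve from repository
        self._put(key, value)
        self._prefetch(key)
        return value

    def _prefetch(self, key):
        # Static rule: study images are usually read in order, so warm the
        # next one. Assumes keys of the form "<series>-<index>".
        series, _, idx = key.rpartition("-")
        nxt = f"{series}-{int(idx) + 1}"
        if nxt not in self.entries:
            self._put(nxt, self.fetch(nxt))

cache = ImagingCache(capacity=8, backend_fetch=lambda k: f"pixels({k})")
cache.get("ct-1")               # miss: fetches ct-1, prefetches ct-2
print("ct-2" in cache.entries)  # True: the follow-up read will be a hit
```

    Even this crude rule shows why a small cache can cut retrieval latency sharply: sequential reads after the first image are served locally instead of over the WAN.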

  6. Current Status of the Nuclear Waste Management Programme in Finland - 13441

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehto, Kimmo; Vuorio, Petteri

    2013-07-01

    Pursuant to the Decision-in-Principle of 2001, the Finnish programme for geologic disposal of spent fuel has now moved to the phase of applying for a construction licence to build the encapsulation plant and underground repository. The main objective of the former programme phase, the underground characterisation phase, was to confirm - or refute - the suitability of the Olkiluoto site through investigations conducted underground at the actual depth of the repository. Construction of the access tunnel to the rock characterisation facility (ONKALO) started in the late summer of 2004. The site research and investigation work aimed at the maturity needed for submission of the construction licence application for the actual repository at the end of 2012. This requires, however, that the technology has also reached the needed maturity. The design and technical plans form the necessary platform for the development of the safety case for spent fuel disposal. A plan, a 'road map', has been produced for the portfolio of reports that demonstrates the safety of disposal as required by the criteria set by the government and further detailed by the safety authority, STUK. (authors)

  7. SINGLE HEATER TEST FINAL REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.B. Cho

    The Single Heater Test is the first of the in-situ thermal tests conducted by the U.S. Department of Energy as part of its program of characterizing Yucca Mountain in Nevada as the potential site for a proposed deep geologic repository for the disposal of spent nuclear fuel and high-level nuclear waste. The Site Characterization Plan (DOE 1988) contained an extensive plan of in-situ thermal tests aimed at understanding specific aspects of the response of the local rock mass around the potential repository to the heat from the radioactive decay of the emplaced waste. With the refocusing of the Site Characterization Plan by the "Civilian Radioactive Waste Management Program Plan" (DOE 1994), a consolidated thermal testing program emerged by 1995, as documented in the reports "In-Situ Thermal Testing Program Strategy" (DOE 1995) and "Updated In-Situ Thermal Testing Program Strategy" (CRWMS M&O 1997a). The concept of the Single Heater Test took shape in the summer of 1995, and detailed planning and design of the test started with the beginning of fiscal year 1996. The overall objective of the Single Heater Test was to gain an understanding of the coupled thermal, mechanical, hydrological, and chemical processes that are anticipated to occur in the local rock mass in the potential repository as a result of heat from radioactive decay of the emplaced waste. This included making a priori predictions of the test results using existing models and subsequently refining or modifying the models on the basis of comparative and interpretive analyses of the measurements and predictions. A second, no less important, objective was to try out, in a full-scale field setting, the various instruments and equipment to be employed in the future on a much larger, more complex thermal test of longer duration, such as the Drift Scale Test. This "shake down" or trial aspect of the Single Heater Test applied not just to the hardware, but also to the teamwork and cooperation between the multiple organizations performing their parts in the test.

  8. Images of a place and vacation preferences: Implications of the 1989 surveys for assessing the economic impacts of a nuclear waste repository in Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slovic, P.; Layman, M.; Flynn, J.H.

    1990-11-01

    In July 1989 the authors produced a report titled Perceived Risk, Stigma, and Potential Economic Impacts of a High-Level Nuclear-Waste Repository in Nevada (Slovic et al., 1989). That report described a program of research designed to assess the potential impacts of a high-level nuclear waste repository at Yucca Mountain, Nevada upon tourism, retirement and job-related migration, and business development in Las Vegas and the state. It concluded that adverse economic impacts may potentially result from two related social processes. Specifically, the study by Slovic et al. employed analyses of imagery in order to overcome concerns about the validity of direct questions regarding the influence of a nuclear-waste repository at Yucca Mountain upon a person's future behaviors. During the latter months of 1989, data were collected in three major telephone surveys designed to achieve the following objectives: (1) to replicate the results from the Phoenix, Arizona, surveys using samples from other populations that contribute to tourism, migration, and development in Nevada; (2) to retest the original Phoenix respondents to determine the stability of their images across an 18-month period and to determine whether their vacation choices subsequent to the first survey were predictable from the images they produced in that original survey; (3) to elicit additional word-association images for the stimulus "underground nuclear waste repository" in order to determine whether the extreme negative images generated by the Phoenix respondents would occur with other samples of respondents; and (4) to develop and test a new method for imagery elicitation, based upon a rating technique rather than on word associations. 2 refs., 8 figs., 13 tabs.

  9. 75 FR 71133 - National Institute of Mental Health; Notice of Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... Emphasis Panel; Competitive Revision for Stem Cell Repository Relevant to Mental Disorders. Date: December... Domestic Assistance Program Nos. 93.242, Mental Health Research Grants; 93.281, Scientist Development Award, Scientist Development Award for Clinicians, and Research Scientist Award; 93.282, Mental Health National...

  10. 10 CFR 2.1003 - Availability of material.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... months in advance of submitting its license application for a geologic repository, the NRC shall make... of privilege in § 2.1006, graphic-oriented documentary material that includes raw data, computer runs, computer programs and codes, field notes, laboratory notes, maps, diagrams and photographs, which have been...

  11. 10 CFR 2.1003 - Availability of material.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... months in advance of submitting its license application for a geologic repository, the NRC shall make... of privilege in § 2.1006, graphic-oriented documentary material that includes raw data, computer runs, computer programs and codes, field notes, laboratory notes, maps, diagrams and photographs, which have been...

  12. 10 CFR 60.142 - Design testing.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2010-01-01 2010-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  13. 10 CFR 60.142 - Design testing.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2013-01-01 2013-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  14. 10 CFR 60.142 - Design testing.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2012-01-01 2012-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  15. 10 CFR 60.142 - Design testing.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2014-01-01 2014-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  16. 10 CFR 60.142 - Design testing.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... construction, a program for in situ testing of such features as borehole and shaft seals, backfill, and the... 10 Energy 2 2011-01-01 2011-01-01 false Design testing. 60.142 Section 60.142 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES...

  17. Development of DKB ETL module in case of data conversion

    NASA Astrophysics Data System (ADS)

    Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.

    2018-05-01

    Modern scientific experiments produce huge volumes of data, which requires new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a substantial amount of additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy and nuclear physics (HENP) is a complex issue that requires unconventional solutions. One of the tasks is to integrate metadata from different repositories into a central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. The Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, aims to provide fast and easy access to significant information about LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of pipelines arranging data flow from data sources to the main DKB storage. The data transformation process represented by a single pipeline can be considered a sequence of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of the DKB.
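    The pipeline-of-modules idea above can be sketched as plain function composition. The step names and record fields below are invented for illustration; the real DKB modules are independent programs, not Python functions.

```python
def pipeline(*steps):
    """Compose transformation steps into one dataflow: each step consumes
    the previous step's output, mirroring a chain of ETL modules."""
    def run(records):
        for step in steps:
            records = step(records)
        return records
    return run

def aggregate(records):
    # Merge metadata fragments that describe the same dataset.
    merged = {}
    for rec in records:
        merged.setdefault(rec["dataset"], {}).update(rec["meta"])
    return merged

def transform(merged):
    # Adapt field names to the target data model (here: lowercase keys).
    return {ds: {k.lower(): v for k, v in meta.items()}
            for ds, meta in merged.items()}

def load(transformed):
    # Stand-in for writing to the central storage; returns the stored ids.
    return sorted(transformed)

etl = pipeline(aggregate, transform, load)
rows = [
    {"dataset": "data15", "meta": {"Events": 10}},
    {"dataset": "data15", "meta": {"Energy": "13TeV"}},
]
print(etl(rows))  # ['data15']
```

    Keeping each step as a self-contained module means a single step can be replaced or retested without touching the rest of the pipeline, which is the point of the design described in the article.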

  18. Brine and Gas Flow Patterns Between Excavated Areas and Disturbed Rock Zone in the 1996 Performance Assessment for the Waste Isolation Pilot Plant for a Single Drilling Intrusion that Penetrates Repository and Castile Brine Reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Economy, Kathleen M.; Helton, Jon Craig; Vaughn, Palmer

    1999-10-01

    The Waste Isolation Pilot Plant (WIPP), which is located in southeastern New Mexico, is being developed for the geologic disposal of transuranic (TRU) waste by the U.S. Department of Energy (DOE). Waste disposal will take place in panels excavated in a bedded salt formation approximately 2000 ft (610 m) below the land surface. The BRAGFLO computer program, which solves a system of nonlinear partial differential equations for two-phase flow, was used to investigate brine and gas flow patterns in the vicinity of the repository for the 1996 WIPP performance assessment (PA). The present study examines the implications of modeling assumptions used in conjunction with BRAGFLO in the 1996 WIPP PA that affect brine and gas flow patterns involving two waste regions in the repository (i.e., a single waste panel and the remaining nine waste panels), a disturbed rock zone (DRZ) that lies just above and below these two regions, and a borehole that penetrates the single waste panel and a brine pocket below this panel. The two waste regions are separated by a panel closure. The following insights were obtained from this study. First, the impediment to flow between the two waste regions provided by the panel closure model is reduced by the permeable and areally extensive nature of the DRZ adopted in the 1996 WIPP PA, which makes the DRZ an effective pathway for gas and brine movement around the panel closures and thus between the two waste regions. Brine and gas flow between the two waste regions via the DRZ causes pressures in the two to equilibrate rapidly, with the result that processes in the intruded waste panel are not isolated from the rest of the repository. Second, the connection between intruded and unintruded waste panels provided by the DRZ increases the time required for repository pressures to equilibrate with the overlying and/or underlying units subsequent to a drilling intrusion. Third, the large and areally extensive DRZ void volume is a significant source of brine to the repository; this brine is consumed in the corrosion of iron and thus contributes to increased repository pressures. Fourth, the DRZ itself lowers repository pressures by providing storage for gas and access to additional gas storage in areas of the repository. Fifth, given the pathway that the DRZ provides for gas and brine to flow around the panel closures, isolation of the waste panels by the panel closures was not essential to compliance with the U.S. Environmental Protection Agency's regulations in the 1996 WIPP PA.

  19. Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources

    PubMed Central

    Wu, Tai-luan; Tseng, Ling-li

    2017-01-01

    This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001–2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system. PMID:29267327
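    The sorting, comparison, and aggregation tasks the authors' program performed reduce, at their core, to set arithmetic over normalized record identifiers. The sketch below is a simplified stand-in (it assumes deduplication and identifier normalization have already been done upstream), not the study's actual program:

```python
def coverage_stats(systems):
    """Compute completeness and pairwise overlap of record coverage.

    `systems` maps system name -> set of normalized record identifiers
    (e.g. DOIs). Completeness is each system's share of the union of all
    records; overlap counts records shared by each pair of systems.
    """
    union = set().union(*systems.values())
    completeness = {name: len(ids) / len(union) for name, ids in systems.items()}
    overlap = {
        (a, b): len(systems[a] & systems[b])
        for a in systems for b in systems if a < b
    }
    return completeness, overlap

systems = {
    "arXiv": {"p1", "p2", "p3"},
    "ADS": {"p2", "p3", "p4"},
}
completeness, overlap = coverage_stats(systems)
print(completeness["arXiv"], overlap[("ADS", "arXiv")])  # 0.75 2
```

    With real bibliographic data, the hard part is the upstream normalization (matching records without shared identifiers), which is why the study needed elimination and cross-referencing steps before counting.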

  20. Cross-Cutting Risk Framework: Mining Data for Common Risks Across the Portfolio

    NASA Technical Reports Server (NTRS)

    Klein, Gerald A., Jr.; Ruark, Valerie

    2017-01-01

    The National Aeronautics and Space Administration (NASA) defines risk management as an integrated framework, combining risk-informed decision making and continuous risk management to foster forward-thinking and decision making from an integrated risk perspective. Therefore, decision makers must have access to risks outside of their own project to gain the knowledge that provides the integrated risk perspective. Through the Goddard Space Flight Center (GSFC) Flight Projects Directorate (FPD) Business Change Initiative (BCI), risks were integrated into one repository to facilitate access to risk data between projects. With the centralized repository, communications between the FPD, project managers, and risk managers improved, and GSFC created the cross-cutting risk framework (CCRF) team. The creation of the consolidated risk repository, in parallel with the initiation of monthly FPD risk manager and risk governance board meetings, is now providing a complete risk management picture spanning the entire directorate. This paper will describe the challenges, methodologies, tools, and techniques used to develop the CCRF, and the lessons learned as the team collectively worked to identify risks that FPD programs and projects had in common, both past and present.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forslund, D.W.; Cook, J.L.

    One of the most powerful tools available for telemedicine is a multimedia medical record accessible over a wide area and simultaneously editable by multiple physicians. The ability to do this through an intuitive interface linking multiple distributed data repositories while maintaining full data integrity is a fundamental enabling technology in healthcare. The authors discuss the role of distributed object technology using Java and CORBA in providing this capability including an example of such a system (TeleMed) which can be accessed through the World Wide Web. Issues of security, scalability, data integrity, and usability are emphasized.

  2. The role of CORBA in enabling telemedicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forslund, D.W.

    1997-07-01

    One of the most powerful tools available for telemedicine is a multimedia medical record accessible over a wide area and simultaneously editable by multiple physicians. The ability to do this through an intuitive interface linking multiple distributed data repositories while maintaining full data integrity is a fundamental enabling technology in healthcare. The author discusses the role of distributed object technology using CORBA in providing this capability including an example of such a system (TeleMed) which can be accessed through the World Wide Web. Issues of security, scalability, data integrity, and usability are emphasized.

  3. BigBWA: approaching the Burrows-Wheeler aligner to Big Data technologies.

    PubMed

    Abuín, José M; Pichel, Juan C; Pena, Tomás F; Amigo, Jorge

    2015-12-15

    BigBWA is a new tool that uses the Big Data technology Hadoop to boost the performance of the Burrows-Wheeler aligner (BWA). Important reductions in the execution times were observed when using this tool. In addition, BigBWA is fault tolerant and it does not require any modification of the original BWA source code. BigBWA is available at the project GitHub repository: https://github.com/citiususc/BigBWA. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
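
    BigBWA itself wraps the unmodified BWA binary in Hadoop map tasks. As a rough sketch of that split-align-merge pattern only (not BigBWA's actual code; `align_chunk` is a stand-in for invoking the real aligner), the structure looks like:

```python
# Illustrative sketch of the BigBWA pattern: split the input reads into
# chunks, align each chunk independently, then merge. In BigBWA the real
# BWA binary runs inside Hadoop map tasks; here a stand-in function and a
# local thread pool take its place.
from concurrent.futures import ThreadPoolExecutor

def chunks(reads, size):
    """Successive fixed-size chunks of the read list."""
    return [reads[i:i + size] for i in range(0, len(reads), size)]

def align_chunk(chunk):
    # Stand-in for invoking the aligner on one chunk of reads.
    return [(read, "aligned") for read in chunk]

def align_all(reads, size=2, workers=2):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(align_chunk, chunks(reads, size))
    return [hit for part in parts for hit in part]  # merge per-chunk output
```

    Fault tolerance in the real system comes from Hadoop re-running any failed map task on its chunk; nothing in BWA itself needs to change, which is why no modification of the BWA source code is required.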

  4. If We Share Data, Will Anyone Use Them? Data Sharing and Reuse in the Long Tail of Science and Technology

    PubMed Central

    Wallis, Jillian C.; Rolando, Elizabeth; Borgman, Christine L.

    2013-01-01

    Research on practices to share and reuse data will inform the design of infrastructure to support data collection, management, and discovery in the long tail of science and technology. These are research domains in which data tend to be local in character, minimally structured, and minimally documented. We report on a ten-year study of the Center for Embedded Network Sensing (CENS), a National Science Foundation Science and Technology Center. We found that CENS researchers are willing to share their data, but few are asked to do so, and in only a few domain areas do their funders or journals require them to deposit data. Few repositories exist to accept data in CENS research areas. Data sharing tends to occur only through interpersonal exchanges. CENS researchers obtain data from repositories, and occasionally from registries and individuals, to provide context, calibration, or other forms of background for their studies. Neither CENS researchers nor those who request access to CENS data appear to use external data for primary research questions or for replication of studies. CENS researchers are willing to share data if they receive credit and retain first rights to publish their results. Practices of releasing, sharing, and reusing of data in CENS reaffirm the gift culture of scholarship, in which goods are bartered between trusted colleagues rather than treated as commodities. PMID:23935830

  5. If we share data, will anyone use them? Data sharing and reuse in the long tail of science and technology.

    PubMed

    Wallis, Jillian C; Rolando, Elizabeth; Borgman, Christine L

    2013-01-01

    Research on practices to share and reuse data will inform the design of infrastructure to support data collection, management, and discovery in the long tail of science and technology. These are research domains in which data tend to be local in character, minimally structured, and minimally documented. We report on a ten-year study of the Center for Embedded Network Sensing (CENS), a National Science Foundation Science and Technology Center. We found that CENS researchers are willing to share their data, but few are asked to do so, and in only a few domain areas do their funders or journals require them to deposit data. Few repositories exist to accept data in CENS research areas. Data sharing tends to occur only through interpersonal exchanges. CENS researchers obtain data from repositories, and occasionally from registries and individuals, to provide context, calibration, or other forms of background for their studies. Neither CENS researchers nor those who request access to CENS data appear to use external data for primary research questions or for replication of studies. CENS researchers are willing to share data if they receive credit and retain first rights to publish their results. Practices of releasing, sharing, and reusing of data in CENS reaffirm the gift culture of scholarship, in which goods are bartered between trusted colleagues rather than treated as commodities.

  6. Information warehouse - a comprehensive informatics platform for business, clinical, and research applications.

    PubMed

    Kamal, Jyoti; Liu, Jianhua; Ostrander, Michael; Santangelo, Jennifer; Dyta, Ravi; Rogers, Patrick; Mekhjian, Hagop S

    2010-11-13

    Since its inception in 1997, the IW (Information Warehouse) at the Ohio State University Medical Center (OSUMC) has gradually transformed itself from a single-purpose business decision support system to a comprehensive informatics platform supporting basic, clinical, and translational research. The IW today is the combination of four integrated components: a clinical data repository containing over a million patients; a research data repository housing various research-specific data; an application development platform for building business and research enabling applications; a business intelligence environment assisting with reporting across all functional areas. The IW is structured and encoded using standard terminologies such as SNOMED-CT, ICD, and CPT. The IW is an important component of OSUMC's Clinical and Translational Science Award (CTSA) informatics program.

  7. The tropical germplasm repository program at the USDA-ARS, Tropical Agriculture Research Station, Mayaguez, Puerto Rico

    USDA-ARS?s Scientific Manuscript database

    The USDA-ARS Tropical Agriculture Research Station is the only research entity within the National Plant Germplasm system in the insular Caribbean region. It houses germplasm collections of cultivated tropical/subtropical germplasm of bananas/plantains, cacao, mamey sapote, sapodilla, Spanish lime,...

  8. USAF Hearing Conservation Program, DOEHRS Data Repository Annual Report: CY2014

    DTIC Science & Technology

    2016-02-01

    tinnitus. The goal was to align the DOEHRS-HC DR data with DoD Hearing Conservation and Readiness Working Group initiatives and Government...Accountability Office recommendations [3]. The data collected from the standardized tinnitus questions are projected to be mined by the DoD in future studies

  9. At the Creation: Chaos, Control, and Automation--Commercial Software Development for Archives.

    ERIC Educational Resources Information Center

    Dürr, W. Theodore

    1988-01-01

    An approach to the design of flexible text-based management systems for archives includes tiers for repository, software, and user management systems. Each tier has four layers--objective, program, result, and interface. Traps awaiting software development companies involve the market, competition, operations, and finance. (10 references) (MES)

  10. Jean C. Zenklusen, M.S., Ph.D., Discusses the NCI Genomics Data Commons at AACR 2014 - TCGA

    Cancer.gov

    At the AACR 2014 meeting, Dr. Jean C. Zenklusen, Director of The Cancer Genome Atlas Program Office, highlights the Genomics Data Commons, a harmonized data repository that will allow simultaneous access and analysis of NCI genomics data, including The Ca

  11. 75 FR 35087 - Violent Criminal Apprehension Program; Agency Information Collection Activities: Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-21

    ... in 1985, ViCAP serves as the national repository for violent crimes; specifically: Homicides and attempted homicides, especially those that (a) involve an abduction, (b) are apparently random, motiveless... homicide. Comprehensive case information submitted to ViCAP is maintained in the ViCAP Web National Crime...

  12. 75 FR 52027 - Violent Criminal Apprehension Program: Agency Information Collection Activities: Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-24

    ... 1985, ViCAP serves as the national repository for violent crimes; specifically: Homicides and attempted homicides, especially those that (a) involve an abduction, (b) are apparently random, motiveless, or... missing. Unidentified human remains, where the manner of death is known or suspected to be homicide...

  13. System Description and Status Report: California Education Information System.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento.

    The California Education Information System (CEIS) consists of two subsystems of computer programs designed to process business and pupil data for local school districts. Creating and maintaining records concerning the students in the schools, the pupil subsystem provides for a central repository of school district identification information and a…

  14. The safety improvement of Romanian radioactive waste facilities as an example for human and environmental protection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barariu, Gheorghe

    2013-07-01

    According to the IAEA classification, Romania, with two nuclear research centres, 2 Nuclear Power Units in operation at Cernavoda Town, and 2 new Units envisaged to be in operation soon, can be considered a country with an average level of nuclear activity. In Romania there has been extensive interest in the management of radioactive wastes generated by the use of nuclear technology in industry and research. Using the most advanced technologies of the respective time periods, Romania successfully solved all management issues related to radioactive wastes, addressing all safety concerns. Every step of nuclear activity development was accompanied by suitable waste management facilities. Thus, in order to improve the existing treatment and disposal capacities for institutional waste, the existing Radioactive Waste Treatment Facility (STDR) and the National Repository for Radioactive Wastes (DNDR) at Baita, Bihor, will be upgraded to current requirements on the occasion of the VVR-S Research Reactor decommissioning. This activity is under development within the framework of a nationally funded project on improving the filling of disposal galleries and repository closure for DNDR Baita, Bihor. All improvements will be approved by the Environmental Protection Authority and the Regulatory Body, guaranteeing human and environmental protection. Also, in accordance with national and international policies, and taking into account decommissioning activities related to the presently operating NPPs, all necessary measures were considered in order to avoid unnecessary generation of radioactive wastes, to minimize, as much as possible, waste production and accumulation, and to develop optimum solutions for a new repository with the assurance of improved nuclear safety. (authors)

  15. eXframe: reusable framework for storage, analysis and visualization of genomics experiments

    PubMed Central

    2011-01-01

    Background Genome-wide experiments are routinely conducted to measure gene expression, DNA-protein interactions and epigenetic status. Structured metadata for these experiments is imperative for a complete understanding of experimental conditions, to enable consistent data processing and to allow retrieval, comparison, and integration of experimental results. Even though several repositories have been developed for genomics data, only a few provide annotation of samples and assays using controlled vocabularies. Moreover, many of them are tailored for a single type of technology or measurement and do not support the integration of multiple data types. Results We have developed eXframe - a reusable web-based framework for genomics experiments that provides 1) the ability to publish structured data compliant with accepted standards; 2) support for multiple data types including microarrays and next generation sequencing; and 3) query, analysis and visualization integration tools (enabled by consistent processing of the raw data and annotation of samples), and is available as open-source software. We present two case studies where this software is currently being used to build repositories of genomics experiments - one contains data from hematopoietic stem cells and another from Parkinson's disease patients. Conclusion The web-based framework eXframe offers structured annotation of experiments as well as uniform processing and storage of molecular data from microarray and next generation sequencing platforms. The framework allows users to query and integrate information across species, technologies, measurement types and experimental conditions. Our framework is reusable and freely modifiable - other groups or institutions can deploy their own custom web-based repositories based on this software. It is interoperable with the most important data formats in this domain. We hope that other groups will not only use eXframe, but also contribute their own useful modifications. 
PMID:22103807

  16. The SpeX Prism Library for Ultracool Dwarfs: A Resource for Stellar, Exoplanet and Galactic Science and Student-Led Research

    NASA Astrophysics Data System (ADS)

    Burgasser, Adam

    The NASA Infrared Telescope Facility's (IRTF) SpeX spectrograph has been an essential tool in the discovery and characterization of ultracool dwarf (UCD) stars, brown dwarfs and exoplanets. Over ten years of SpeX data have been collected on these sources, and a repository of low-resolution (R ~ 100) SpeX prism spectra has been maintained by the PI at the SpeX Prism Spectral Libraries website since 2008. As the largest existing collection of NIR UCD spectra, this repository has facilitated a broad range of investigations in UCD, exoplanet, Galactic and extragalactic science, contributing to over 100 publications in the past 6 years. However, this repository remains highly incomplete, has not been uniformly calibrated, lacks sufficient contextual data for observations and sources, and most importantly provides no data visualization or analysis tools for the user. To fully realize the scientific potential of these data for community research, we propose a two-year program to (1) calibrate and expand existing repository and archival data, and make it virtual-observatory compliant; (2) serve the data through a searchable web archive with basic visualization tools; and (3) develop and distribute an open-source, Python-based analysis toolkit for users to analyze the data. These resources will be generated through an innovative, student-centered research model, with undergraduate and graduate students building and validating the analysis tools through carefully designed coding challenges and research validation activities. The resulting data archive, the SpeX Prism Library, will be a legacy resource for IRTF and SpeX, and will facilitate numerous investigations using current and future NASA capabilities. 
These include deep/wide surveys of UCDs to measure Galactic structure and chemical evolution, and probe UCD populations in satellite galaxies (e.g., JWST, WFIRST); characterization of directly imaged exoplanet spectra (e.g., FINESSE), and development of low-temperature theoretical models of UCD and exoplanet atmospheres. Our program will also serve to validate the IRTF data archive during its development, by reducing and disseminating non-proprietary archival observations of UCDs to the community. The proposed program directly addresses NASA's strategic goals of exploring the origin and evolution of stars and planets that make up our universe, and discovering and studying planets around other stars.

  17. Determination of Uncertainties for +III and +IV Actinide Solubilities in the WIPP Geochemistry Model for the 2009 Compliance Recertification Application

    NASA Astrophysics Data System (ADS)

    Ismail, A. E.; Xiong, Y.; Nowak, E. J.; Brush, L. H.

    2009-12-01

    The Waste Isolation Pilot Plant (WIPP) is a U.S. Department of Energy (DOE) repository in southeast New Mexico for defense-related transuranic (TRU) waste. Every five years, the DOE is required to submit an application to the Environmental Protection Agency (EPA) demonstrating the WIPP’s continuing compliance with the applicable EPA regulations governing the repository. Part of this recertification effort involves a performance assessment—a probabilistic evaluation of the repository performance with respect to regulatory limits on the amount of releases from the repository to the accessible environment. One of the models used as part of the performance assessment process is a geochemistry model, which predicts solubilities of the radionuclides in the brines that may enter the repository in the different scenarios considered by the performance assessment. The dissolved actinide source term comprises actinide solubilities, which are input parameters for modeling the transport of radionuclides as a result of brine flow through and from the repository. During a performance assessment, the solubilities are modeled as the product of a “base” solubility determined from calculations based on the chemical conditions expected in the repository, and an uncertainty factor that describes the potential deviations of the model from expected behavior. We will focus here on a discussion of the uncertainties. To compute a cumulative distribution function (CDF) for the uncertainties, we compare published, experimentally measured solubility data to predictions made using the established WIPP geochemistry model. The differences between the solubilities observed for a given experiment and the calculated solubilities from the model are used to form the overall CDF, which is then sampled as part of the performance assessment. 
We will discuss the methodology used to update the CDFs for the +III actinides, obtained from data for Nd, Am, and Cm, and the +IV actinides, obtained from data for Th, and present results for the calculations of the updated CDFs. We compare the CDFs to the distributions computed for the previous recertification, and discuss the potential impact of the changes on the geochemistry model. This research is funded by WIPP programs administered by the U.S. Department of Energy. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.
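
    As a hedged sketch of the uncertainty-factor construction described in this abstract (all numerical values below are invented, and the actual WIPP methodology is more involved), the differences between measured and model-calculated solubilities can be turned into an empirical CDF for sampling:

```python
# Hypothetical sketch: residuals are log10(observed/calculated) solubility
# differences across experiments; their empirical CDF gives the sampled
# uncertainty distribution. Values used here are purely illustrative.
import bisect
import math

def log10_residuals(measured, calculated):
    """Sorted log10(observed/calculated) ratio for each experiment."""
    return sorted(math.log10(m / c) for m, c in zip(measured, calculated))

def empirical_cdf(residuals, x):
    """Fraction of residuals <= x (residuals must be sorted)."""
    return bisect.bisect_right(residuals, x) / len(residuals)
```

    A performance assessment could then draw a uniform random number and invert this CDF to obtain an uncertainty factor multiplying the base solubility.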

  18. Excess plutonium disposition: The deep borehole option

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferguson, K.L.

    1994-08-09

    This report reviews the current status of technologies required for the disposition of plutonium in Very Deep Holes (VDH). It is in response to a recent National Academy of Sciences (NAS) report which addressed the management of excess weapons plutonium and recommended three approaches to the ultimate disposition of excess plutonium: (1) fabrication and use as a fuel in existing or modified reactors in a once-through cycle, (2) vitrification with high-level radioactive waste for repository disposition, (3) burial in deep boreholes. As indicated in the NAS report, substantial effort would be required to address the broad range of issues related to deep borehole emplacement. Subjects reviewed in this report include geology and hydrology, design and engineering, safety and licensing, policy decisions that can impact the viability of the concept, and applicable international programs. Key technical areas that would require attention should decisions be made to further develop the borehole emplacement option are identified.

  19. An Architecture Based on Linked Data Technologies for the Integration and Reuse of OER in MOOCs Context

    ERIC Educational Resources Information Center

    Piedra, Nelson; Chicaiza, Janneth Alexandra; López, Jorge; Tovar, Edmundo

    2014-01-01

    The Linked Data initiative is considered as one of the most effective alternatives for creating global shared information spaces, it has become an interesting approach for discovering and enriching open educational resources data, as well as achieving semantic interoperability and re-use between multiple OER repositories. The notion of Linked Data…

  20. The Classification and Evaluation of Computer-Aided Software Engineering Tools

    DTIC Science & Technology

    1990-09-01

    International Business Machines Corporation Customizer is a Registered Trademark of Index Technology Corporation Data Analyst is a Registered Trademark of...years, a rapid series of new approaches have been adopted including: information engineering, entity-relationship modeling, automatic code generation...support true information sharing among tools and automated consistency checking. Moreover, the repository must record and manage the relationships and

  1. Advanced Development and Dissemination of EMERSE for Cancer Phenotyping from Medical Records | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    The increasing use of electronic health records (EHRs) by cancer centers nationwide has led to the tremendous growth of repositories containing unstructured, free text notes. These notes include clinical concepts that cannot be found anywhere else in the EHR, and these concepts are needed to characterize a patient’s specific ‘phenotype’.

  2. User-Oriented Quality for OER: Understanding Teachers' Views on Re-Use, Quality, and Trust

    ERIC Educational Resources Information Center

    Clements, K. I.; Pawlowski, J. M.

    2012-01-01

    We analysed how teachers as users of open educational resources (OER) repositories act in the re-use process and how they perceive quality. Based on a quantitative empirical study, we also surveyed which quality requirements users have and how they would contribute to the quality process. Trust in resources, organizations, and technologies seem to…

  3. Datasets Reflecting Students' and Teachers' Views on the Use of Learning Technology in a UK University

    ERIC Educational Resources Information Center

    Limniou, Maria; Downes, John J.; Maskell, Simon

    2015-01-01

    Nowadays, the use of datasets is of crucial importance for the advancement of educational research. Specifically in the field of Higher Education, many researchers might share through online data repositories their research outputs in order for data to be reusable, accessible and accountable to educational community. The aim of this paper is to…

  4. Experiences with the BSCW Shared Workspace System as the Backbone of a Virtual Learning Environment for Students.

    ERIC Educational Resources Information Center

    Appelt, Wolfgang; Mambrey, Peter

    The GMD (German National Research Center for Information Technology) has developed the BSCW (Basic Support for Cooperative Work) Shared Workspace system within the last four years with the goal of transforming the Web from a primarily passive information repository to an active cooperation medium. The BSCW system is a Web-based groupware tool for…

  5. Learning Object Repositories in e-Learning: Challenges for Learners in Saudi Arabia

    ERIC Educational Resources Information Center

    AlMegren, Abdullah; Yassin, Siti Zuraiyni

    2013-01-01

    The advent of the millennium has seen the introduction of a new paradigm for ICT-enhanced education. Advances in ICT have led to the emergence of learning networks comprising people who want to discover and share various innovative technologies on a global scale. Over the past decade, there has been tremendous worldwide interest in the concept of…

  6. PGP repository: a plant phenomics and genomics data publication infrastructure

    PubMed Central

    Arend, Daniel; Junker, Astrid; Scholz, Uwe; Schüler, Danuta; Wylie, Juliane; Lange, Matthias

    2016-01-01

    Plant genomics and phenomics represent the most promising tools for accelerating yield gains and overcoming emerging crop productivity bottlenecks. However, accessing this wealth of plant diversity requires the characterization of this material using state-of-the-art genomic, phenomic and molecular technologies and the release of subsequent research data via a long-term stable, open-access portal. Although several international consortia and public resource centres offer services for plant research data management, valuable digital assets remain unpublished and thus inaccessible to the scientific community. Recently, the Leibniz Institute of Plant Genetics and Crop Plant Research and the German Plant Phenotyping Network have jointly initiated the Plant Genomics and Phenomics Research Data Repository (PGP) as infrastructure to comprehensively publish plant research data. This covers in particular cross-domain datasets that are not being published in central repositories because of their volume or unsupported data scope, such as image collections from plant phenotyping and microscopy, unfinished genomes, genotyping data, visualizations of morphological plant models, data from mass spectrometry, as well as software and documents. The repository is hosted at the Leibniz Institute of Plant Genetics and Crop Plant Research, using e!DAL as the software infrastructure and a Hierarchical Storage Management System as the data archival backend. A newly developed data submission tool was made available to the consortium; it features a high level of automation to lower the barriers to data publication. After an internal review process, data are published under citable digital object identifiers, and a core set of technical metadata is registered at DataCite. The e!DAL-embedded Web frontend generates a landing page for each dataset and supports interactive exploration. 
PGP is registered as a research data repository at BioSharing.org, re3data.org and OpenAIRE as a valid EU Horizon 2020 open data archive. These features, together with the programmatic interface and support for standard metadata formats, enable PGP to fulfil the FAIR data principles: findable, accessible, interoperable, reusable. Database URL: http://edal.ipk-gatersleben.de/repos/pgp/ PMID:27087305

  7. NCBI GEO: archive for functional genomics data sets--10 years on.

    PubMed

    Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Evangelista, Carlos; Kim, Irene F; Tomashevsky, Maxim; Marshall, Kimberly A; Phillippy, Katherine H; Sherman, Patti M; Muertter, Rolf N; Holko, Michelle; Ayanbule, Oluwabukunmi; Yefanov, Andrey; Soboleva, Alexandra

    2011-01-01

    A decade ago, the Gene Expression Omnibus (GEO) database was established at the National Center for Biotechnology Information (NCBI). The original objective of GEO was to serve as a public repository for high-throughput gene expression data generated mostly by microarray technology. However, the research community quickly applied microarrays to non-gene-expression studies, including examination of genome copy number variation and genome-wide profiling of DNA-binding proteins. Because the GEO database was designed with a flexible structure, it was possible to quickly adapt the repository to store these data types. More recently, as the microarray community switches to next-generation sequencing technologies, GEO has again adapted to host these data sets. Today, GEO stores over 20,000 microarray- and sequence-based functional genomics studies, and continues to handle the majority of direct high-throughput data submissions from the research community. Multiple mechanisms are provided to help users effectively search, browse, download and visualize the data at the level of individual genes or entire studies. This paper describes recent database enhancements, including new search and data representation tools, as well as a brief review of how the community uses GEO data. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/.

  8. Acute Kidney Injury and Big Data.

    PubMed

    Sutherland, Scott M; Goldstein, Stuart L; Bagshaw, Sean M

    2018-01-01

    The recognition of a standardized, consensus definition for acute kidney injury (AKI) has been an important milestone in critical care nephrology, which has facilitated innovation in prevention, quality of care, and outcomes research among the growing population of hospitalized patients susceptible to AKI. Concomitantly, there have been substantial advances in "big data" technologies in medicine, including electronic health records (EHR), data registries and repositories, and data management and analytic methodologies. EHRs are increasingly being adopted, clinical informatics is constantly being refined, and the field of EHR-enabled care improvement and research has grown exponentially. While these fields have matured independently, integrating the two has the potential to redefine and integrate AKI-related care and research. AKI is an ideal condition to exploit big data health care innovation for several reasons: AKI is common, increasingly encountered in hospitalized settings, imposes meaningful risk for adverse events and poor outcomes, has incremental cost implications, and has been plagued by suboptimal quality of care. In this concise review, we discuss the potential applications of big data technologies, particularly modern EHR platforms and health data repositories, to transform our capacity for AKI prediction, detection, and care quality. © 2018 S. Karger AG, Basel.
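
    As a toy example of the kind of EHR-enabled AKI detection this review discusses (the record format is invented; the creatinine thresholds follow the widely used KDIGO consensus definition, which is not specific to this paper), screening a patient's creatinine series might look like:

```python
# Minimal sketch of EHR-based AKI screening using the serum creatinine
# portion of the KDIGO consensus definition: a rise of >= 0.3 mg/dL within
# 48 hours, or a value >= 1.5x baseline. The record format is hypothetical,
# and baseline is simplified to the first measurement.
from datetime import datetime, timedelta

def flags_aki(creatinine):
    """creatinine: time-ordered list of (datetime, mg/dL); needs >= 1 entry."""
    baseline = creatinine[0][1]
    for i, (t_i, scr_i) in enumerate(creatinine):
        if scr_i >= 1.5 * baseline:  # relative-rise criterion
            return True
        for t_j, scr_j in creatinine[i + 1:]:
            # absolute-rise criterion within a 48-hour window
            if t_j - t_i <= timedelta(hours=48) and scr_j - scr_i >= 0.3:
                return True
    return False
```

    A real EHR pipeline would also handle baseline estimation from historical values, unit normalization, and the urine-output criteria, all of which this sketch omits.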

  9. TeleMed: An example of a new system developed with object technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forslund, D.; Phillips, R.; Tomlinson, B.

    1996-12-01

    Los Alamos National Laboratory has developed a virtual patient record system called TeleMed which is based on a distributed national radiographic and patient record repository located throughout the country. Without leaving their offices, participating doctors can view clinical drug and radiographic data via a sophisticated multimedia interface. For example, a doctor can match a patient's radiographic information with the data in the repository, review treatment history and success, and then determine the best treatment. Furthermore, the features of TeleMed that make it attractive to clinicians and diagnosticians make it valuable for teaching and presentation as well. Thus, a resident can use TeleMed for self-training in diagnostic techniques and a physician can use it to explain to a patient the course of their illness. In fact, the data can be viewed simultaneously by users at two or more distant locations for consultation with specialists in different fields. This capability is of enormous value to a wide spectrum of healthcare providers. It is made possible by the integration of multimedia information using commercial CORBA technology linking object-enabled databases with client interfaces using a three-tiered architecture.

  10. MASPECTRAS: a platform for management and analysis of proteomics LC-MS/MS data

    PubMed Central

    Hartler, Jürgen; Thallinger, Gerhard G; Stocker, Gernot; Sturn, Alexander; Burkard, Thomas R; Körner, Erik; Rader, Robert; Schmidt, Andreas; Mechtler, Karl; Trajanoski, Zlatko

    2007-01-01

    Background: The advancements of proteomics technologies have led to a rapid increase in the number, size and rate at which datasets are generated. Managing and extracting valuable information from such datasets requires the use of data management platforms and computational approaches. Results: We have developed the MAss SPECTRometry Analysis System (MASPECTRAS), a platform for management and analysis of proteomics LC-MS/MS data. MASPECTRAS is based on the Proteome Experimental Data Repository (PEDRo) relational database schema and follows the guidelines of the Proteomics Standards Initiative (PSI). Analysis modules include: 1) import and parsing of the results from the search engines SEQUEST, Mascot, Spectrum Mill, X! Tandem, and OMSSA; 2) peptide validation; 3) clustering of proteins based on Markov Clustering and multiple alignments; and 4) quantification using the Automated Statistical Analysis of Protein Abundance Ratios algorithm (ASAPRatio). The system provides customizable data retrieval and visualization tools, as well as export to the PRoteomics IDEntifications public repository (PRIDE). MASPECTRAS is freely available at . Conclusion: Given the unique features and the flexibility due to the use of standard software technology, our platform represents a significant advance and could be of great interest to the proteomics community. PMID:17567892

  11. NCI Program for Natural Product Discovery: A Publicly-Accessible Library of Natural Product Fractions for High-Throughput Screening.

    PubMed

    Thornburg, Christopher C; Britt, John R; Evans, Jason R; Akee, Rhone K; Whitt, James A; Trinh, Spencer K; Harris, Matthew J; Thompson, Jerell R; Ewing, Teresa L; Shipley, Suzanne M; Grothaus, Paul G; Newman, David J; Schneider, Joel P; Grkovic, Tanja; O'Keefe, Barry R

    2018-06-13

    The US National Cancer Institute's (NCI) Natural Product Repository is one of the world's largest, most diverse collections of natural products, containing over 230,000 unique extracts derived from plant, marine, and microbial organisms collected from biodiverse regions throughout the world. Importantly, this national resource is available to the research community for the screening of extracts and the isolation of bioactive natural products. However, despite the success of natural products in drug discovery, several factors have reduced enthusiasm for the high-throughput screening (HTS) of crude natural product extract libraries in targeted assay systems: compatibility issues that make extracts challenging for liquid handling systems, extended timelines that complicate natural product-based drug discovery efforts, and the presence of pan-assay interfering compounds. To address these limitations, the NCI Program for Natural Product Discovery (NPNPD), a newly launched national program to advance natural product discovery technologies and facilitate the discovery of structurally defined, validated lead molecules ready for translation, will create a prefractionated library from over 125,000 natural product extracts, with the aim of producing a publicly accessible, HTS-amenable library of >1,000,000 fractions. This library, representing perhaps the largest accumulation of natural product-based fractions in the world, will be made available free of charge in 384-well plates for screening against all disease states in an effort to reinvigorate natural product-based drug discovery.

  12. The Southeastern Minnesota Beacon Project for Community-driven Health Information Technology: Origins, Achievements, and Legacy.

    PubMed

    Chute, Christopher G; Hart, Lacey A; Alexander, Alex K; Jensen, Daniel W

    2014-01-01

    The Southeastern (SE) Minnesota Beacon organized all the health care providers, county public health organizations, and school districts in the deployment and integration of health information exchange (HIE) and targeted health communication around childhood asthma and diabetes. The community cooperated to establish a clinical data repository for all residents in the 11-county region. Through this community of practice approach that involved traditional and nontraditional providers, the SE Minnesota Beacon was able to realize unique applications of this technology. This manuscript overviews the associated organization and infrastructure of this community collaboration. The Office of the National Coordinator for Health Information Technology (ONC), as part of the American Recovery and Reinvestment Act of 2009 (ARRA) stimulus, established 17 projects throughout the United States targeting the introduction and meaningful use of health information technology (HIT). These 17 communities were intended to serve as an example of what could be accomplished. The SE Minnesota Beacon is one of these communities. The community ultimately opted for peer-to-peer HIE, using Nationwide Health Information Network (NwHIN) Connect software. The clinical data repository was established using the infrastructure developed by the Regenstrief Institute, which operated as a trusted third party. As an extension to HIE, the consortium of county public health departments created a patient data portal for use by school nurses and parents. Childhood asthma was addressed by creating, exchanging, and maintaining an "asthma action plan" for each affected child, shared throughout the community, including through the patient portal. Diabetes management introduced patient treatment decision tools and patient quality of life measures, facilitating care. Influenza vaccination was enhanced by large-scale community reporting in partnership with the state vaccination registry. 
The methodology and principles for arriving at these solutions included community engagement, sustainability, scalability, standards, and best practices that fit a variety of organizations, from large, robust providers to small ones. The SE Minnesota Beacon demonstrated that all providers for a geographically defined population can cooperate in the development and shared governance of a low-cost, sustainable HIE, and the operation of a community-managed clinical data repository. Furthermore, these infrastructures can be leveraged to collaboratively improve the care of patients, as demonstrated for childhood asthma and adult diabetes mellitus. The shared governance of HIT by a community can palpably change the scope and success of collaborations targeted to improve patient and community health care.

  13. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  14. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  15. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  16. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  17. Smoothing Data Friction through building Service Oriented Data Platforms

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Richards, C. J.; Evans, B. J. K.; Wang, J.; Druken, K. A.

    2017-12-01

    Data Friction has been commonly defined as the costs in time, energy and attention required to simply collect, check, store, move, receive, and access data. On average, researchers spend a significant fraction of their time finding the data for their research project and then reformatting it so that it can be used by the software application of their choice. There is an increasing role for both data repositories and software to be modernised to help reduce data friction in ways that support the better use of the data. Many generic data repositories simply accept data in the format as supplied: the key check is that the data have sufficient metadata to enable discovery and download. Few generic repositories have both the expertise and infrastructure to support the multiple domain-specific requirements that meet the increasing need for integration and reusability. In contrast, major science domain-focused repositories are increasingly able to implement and enforce community-endorsed best practices and guidelines that ensure reusability and harmonization of data for use within the community, offering semi-automated QC workflows to improve the quality of submitted data. The most advanced of these science repositories now operate as service-oriented data platforms that extend the use of data across domain silos and increasingly provide server-side, programmatically enabled access to data via network protocols and community-standard APIs. To provide this, more rigorous QA/QC procedures are needed to validate data against standards and community software and tools. This ensures that the data can be accessed in expected ways and also demonstrates that the data works across the different (non-domain-specific) packages, tools and programming languages deployed by the various user communities.
In Australia, the National Computational Infrastructure (NCI) has created such a service-oriented data platform, demonstrating how this approach can reduce data friction by serving individual domains as well as facilitating cross-domain collaboration. The approach requires additional effort and expertise from the repository, but the resulting capability and efficiency ultimately save time for individual researchers.

  18. The SeaView EarthCube project: Lessons Learned from Integrating Across Repositories

    NASA Astrophysics Data System (ADS)

    Diggs, S. C.; Stocks, K. I.; Arko, R. A.; Kinkade, D.; Shepherd, A.; Olson, C. J.; Pham, A.

    2017-12-01

    SeaView is an NSF-funded EarthCube Integrative Activity Project working with 5 existing data repositories* to provide oceanographers with highly integrated thematic data collections in user-requested formats. The project has three complementary goals: Supporting Scientists: SeaView targets scientists' need for easy access to data of interest that are ready to import into their preferred tool. Strengthening Repositories: By integrating data from multiple repositories for science use, SeaView is helping the ocean data repositories align their data and processes and make ocean data more accessible and easily integrated. Informing EarthCube (earthcube.org): SeaView's experience as an integration demonstration can inform the larger NSF EarthCube architecture and design effort. The challenges faced in this small-scale effort are informative to geosciences cyberinfrastructure more generally. Here we focus on the lessons learned that may inform other data facilities and integrative architecture projects. (The SeaView data collections will be presented at the Ocean Sciences 2018 meeting.) One example is the importance of shared semantics, with persistent identifiers, for key integration elements across the data sets (e.g., cruise, parameter, and project/program). These must allow for revision through time and should have an agreed authority or process for resolving conflicts: aligning identifiers and correcting errors were time-consuming and often required both deep domain knowledge and "back end" knowledge of the data facilities. Another example is the need for robust provenance, and tools that support automated or semi-automated data transform pipelines that capture provenance. Multiple copies and versions of data are now flowing into repositories, and onward to long-term archives such as NOAA NCEI and umbrella portals such as DataONE. 
Exact copies can be identified with hashes (for those that have the skills), but it can be painfully difficult to understand the processing or format changes that differentiate versions. As more sensors are deployed and data re-use increases, this will only become more challenging. We will discuss these and additional lessons learned, and invite discussion and solutions from others doing similar work. * BCO-DMO, CCHDO, OBIS, OOI, R2R
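
    The duplicate-detection step mentioned above, identifying exact copies with content hashes, can be sketched in a few lines of Python. This is an illustrative sketch, not SeaView code; the file and function names are hypothetical.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so large data granules need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_exact_copies(paths):
    """Group paths whose byte content is identical; returns a mapping of
    digest -> list of paths, for digests shared by more than one file."""
    by_digest = {}
    for path in paths:
        by_digest.setdefault(sha256_of(path), []).append(path)
    return {d: ps for d, ps in by_digest.items() if len(ps) > 1}
```

    Because the digest depends only on the file bytes, renamed or relocated copies are still grouped together; files that differ by even one byte hash differently, which is exactly why hashes alone cannot explain how two versions differ.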

  19. Opportunities and challenges provided by cloud repositories for bioinformatics-enabled drug discovery.

    PubMed

    Dalpé, Gratien; Joly, Yann

    2014-09-01

    Healthcare-related bioinformatics databases are increasingly offering the possibility to maintain, organize, and distribute DNA sequencing data. Different national and international institutions are currently hosting such databases that offer researchers website platforms where they can obtain sequencing data on which they can perform different types of analysis. Until recently, this process remained mostly one-dimensional, with most analysis concentrated on a limited amount of data. However, newer genome sequencing technology is producing a huge amount of data that current computer facilities are unable to handle. An alternative approach has been to start adopting cloud computing services for combining the information embedded in genomic and model system biology data, patient healthcare records, and clinical trials' data. In this new technological paradigm, researchers use virtual space and computing power from existing commercial or not-for-profit cloud service providers to access, store, and analyze data via different application programming interfaces. Cloud services are an alternative to the need of larger data storage; however, they raise different ethical, legal, and social issues. The purpose of this Commentary is to summarize how cloud computing can contribute to bioinformatics-based drug discovery and to highlight some of the outstanding legal, ethical, and social issues that are inherent in the use of cloud services. © 2014 Wiley Periodicals, Inc.

  20. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory.

    PubMed

    Martiník, Ivo

    2015-01-01

    Rich-media describes a broad range of digital interactive media that is increasingly used on the Internet, including in support of education. Last year, a special pilot audiovisual lecture room was built as part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project. It contains all the elements of a modern lecture room designed for recording presentations based on rich-media technologies and publishing them online or on demand, with access to all of its elements in automated mode, including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of Petri net processes. PPPA does not need to verify the composition of Petri net processes because all of its algebraic operators preserve the specified set of properties. In this paper, these original PPPA are significantly generalized for the newly introduced class of SNT Petri process and agent nets. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets, and their chosen properties are proved. The SNT Petri process and agent net theory was applied in the design, verification, and implementation of the programming system that ensures the pilot audiovisual lecture room functionality.
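
    The firing rule underlying place/transition nets, on which property-preserving algebras such as PPPA are built, can be illustrated with a minimal sketch in Python. This is a generic place/transition net, not the SNT formalism or the PLACE-SUBST/ASYNC-PROC operators themselves; the place and transition names are hypothetical.

```python
class PetriNet:
    """Minimal place/transition net. A transition is enabled when every input
    place holds at least as many tokens as its arc weight requires; firing
    consumes the input tokens and produces the output tokens."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # transition name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (dict(inputs), dict(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, w in inputs.items():
            self.marking[p] -= w
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w
```

    For example, a two-state recording lifecycle (names hypothetical) can be modeled with places idle and recording and transitions start and stop; firing start moves the single token from idle to recording. Composition operators like those in PPPA build larger nets from such components while preserving stated properties, so the composite need not be re-verified.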

  1. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory

    PubMed Central

    Martiník, Ivo

    2015-01-01

    Rich-media describes a broad range of digital interactive media that is increasingly used on the Internet, including in support of education. Last year, a special pilot audiovisual lecture room was built as part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project. It contains all the elements of a modern lecture room designed for recording presentations based on rich-media technologies and publishing them online or on demand, with access to all of its elements in automated mode, including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of Petri net processes. PPPA does not need to verify the composition of Petri net processes because all of its algebraic operators preserve the specified set of properties. In this paper, these original PPPA are significantly generalized for the newly introduced class of SNT Petri process and agent nets. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets, and their chosen properties are proved. The SNT Petri process and agent net theory was applied in the design, verification, and implementation of the programming system that ensures the pilot audiovisual lecture room functionality. PMID:26258164

  2. 40 CFR 124.33 - Information repository.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Information repository. 124.33 Section... FOR DECISIONMAKING Specific Procedures Applicable to RCRA Permits § 124.33 Information repository. (a... basis, for an information repository. When assessing the need for an information repository, the...

  3. 10 CFR 60.130 - General considerations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORIES Technical Criteria Design Criteria for the Geologic Repository Operations Area § 60.130 General... for a high-level radioactive waste repository at a geologic repository operations area, and an... geologic repository operations area, must include the principal design criteria for a proposed facility...

  4. Annual Historical Summary, Defense Documentation Center, 1 July 1968 to 30 June 1969.

    ERIC Educational Resources Information Center

    Defense Documentation Center, Alexandria, VA.

    This summary describes the more significant activities and achievements of the Defense Documentation Center (DDC) including: DDC and the scientific and technical community. The DDC role in the Department of Defense Scientific and Technical Information Program continued to shift from the traditional concept of an archival repository and a…

  5. 10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... retrieval throughout the period during which wastes are being emplaced and, thereafter, until the completion of a performance confirmation program and Commission review of the information obtained from such a... retrievability. (3) For purposes of this paragraph, a reasonable schedule for retrieval is one that would permit...

  6. 10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... retrieval throughout the period during which wastes are being emplaced and, thereafter, until the completion of a performance confirmation program and Commission review of the information obtained from such a... retrievability. (3) For purposes of this paragraph, a reasonable schedule for retrieval is one that would permit...

  7. 10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... retrieval throughout the period during which wastes are being emplaced and, thereafter, until the completion of a performance confirmation program and Commission review of the information obtained from such a... retrievability. (3) For purposes of this paragraph, a reasonable schedule for retrieval is one that would permit...

  8. 10 CFR 60.111 - Performance of the geologic repository operations area through permanent closure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... retrieval throughout the period during which wastes are being emplaced and, thereafter, until the completion of a performance confirmation program and Commission review of the information obtained from such a... retrievability. (3) For purposes of this paragraph, a reasonable schedule for retrieval is one that would permit...

  9. 75 FR 66136 - Agency Information Collection Activities: Reinstatement, With Change, of a Previously Approved...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-27

    ... Criminal History Information Systems. The Department of Justice (DOJ), Office of Justice Programs, Bureau... collection for which approval has expired. (2) Title of the Form/Collection: Survey of State Criminal History... history records and on the increasing number of operations and services provided by state repositories. (5...

  10. Harvesting Alternative Credit Transfer Students: Redefining Selectivity in Your Online Learning Program Enrollment Leads

    ERIC Educational Resources Information Center

    Corlett, Bradly

    2014-01-01

    Several recent issues and trends in online education have resulted in consolidation of efforts for Massive Open Online Courses (MOOCs), increased Open Educational Resources (OER) in the form of asynchronous course repositories, with noticeable increases in governance and policy amplification. These emerging enrollment trends in alternative online…

  11. Can shale safely host US nuclear waste?

    USGS Publications Warehouse

    Neuzil, C.E.

    2013-01-01

    "Even as cleanup efforts after Japan’s Fukushima disaster offer a stark reminder of the spent nuclear fuel (SNF) stored at nuclear plants worldwide, the decision in 2009 to scrap Yucca Mountain as a permanent disposal site has dimmed hope for a repository for SNF and other high-level nuclear waste (HLW) in the United States anytime soon. About 70,000 metric tons of SNF are now in pool or dry cask storage at 75 sites across the United States [Government Accountability Office, 2012], and uncertainty about its fate is hobbling future development of nuclear power, increasing costs for utilities, and creating a liability for American taxpayers [Blue Ribbon Commission on America’s Nuclear Future, 2012]. However, abandoning Yucca Mountain could also result in broadening geologic options for hosting America’s nuclear waste. Shales and other argillaceous formations (mudrocks, clays, and similar clay-rich media) have been absent from the U.S. repository program. In contrast, France, Switzerland, and Belgium are now planning repositories in argillaceous formations after extensive research in underground laboratories on the safety and feasibility of such an approach [Blue Ribbon Commission on America’s Nuclear Future, 2012; Nationale Genossenschaft für die Lagerung radioaktiver Abfälle (NAGRA), 2010; Organisme national des déchets radioactifs et des matières fissiles enrichies, 2011]. Other nations, notably Japan, Canada, and the United Kingdom, are studying argillaceous formations or may consider them in their siting programs [Japan Atomic Energy Agency, 2012; Nuclear Waste Management Organization (NWMO), 2011a; Powell et al., 2010]."

  12. Pretest characterization of WIPP experimental waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, J.; Davis, H.; Drez, P.E.

    The Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, is an underground repository designed for the storage and disposal of transuranic (TRU) wastes from US Department of Energy (DOE) facilities across the country. The Performance Assessment (PA) studies for WIPP address compliance of the repository with applicable regulations, and include full-scale experiments to be performed at the WIPP site. These experiments are the bin-scale and alcove tests to be conducted by Sandia National Laboratories (SNL). Prior to conducting these experiments, the waste to be used in these tests needs to be characterized to provide data on the initial conditions for these experiments. This characterization is referred to as the Pretest Characterization of WIPP Experimental Waste, and is also expected to provide input to other programmatic efforts related to waste characterization. The purpose of this paper is to describe the pretest waste characterization activities currently in progress for the WIPP bin-scale waste, and to discuss the program plan and specific analytical protocols being developed for this characterization. The relationship between different programs and documents related to waste characterization efforts is also highlighted in this paper.

  13. 48 CFR 227.7108 - Contractor data repositories.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Technical Data 227.7108 Contractor data repositories. (a) Contractor data repositories may be established... procedures for protecting technical data delivered to or stored at the repository from unauthorized release... disclosure of technical data from the repository to third parties consistent with the Government's rights in...

  14. Waste isolation safety assessment program. Task 4. Third contractor information meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-06-01

    The Contractor Information Meeting (October 14 to 17, 1979) was part of the FY-1979 effort of Task 4 of the Waste Isolation Safety Assessment Program (WISAP): Sorption/Desorption Analysis. The objectives of this task are to: evaluate sorption/desorption measurement methods and develop a standardized measurement procedure; produce a generic data bank of nuclide-geologic interactions using a wide variety of geologic media and groundwaters; perform statistical analysis and synthesis of these data; perform validation studies to compare short-term laboratory studies to long-term in situ behavior; develop a fundamental understanding of sorption/desorption processes; produce x-ray and gamma-emitting isotopes suitable for the study of actinides at tracer concentrations; disseminate resulting information to the international technical community; and provide input data support for repository safety assessment. Conference participants included those subcontracted to WISAP Task 4, representatives and independent subcontractors to the Office of Nuclear Waste Isolation, representatives from other waste disposal programs, and experts in the area of waste/geologic media interaction. Since the meeting, WISAP has been divided into two programs: Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) (modeling efforts) and Waste/Rock Interactions Technology (WRIT) (experimental work). The WRIT program encompasses the work conducted under Task 4. This report contains the information presented at the Task 4, Third Contractor Information Meeting. Technical reports from the subcontractors, as well as Pacific Northwest Laboratory (PNL), are provided along with transcripts of the question-and-answer sessions. The agenda and abstracts of the presentations are also included. Appendix A is a list of the participants. Appendix B gives an overview of the WRIT program and details the WRIT work breakdown structure for 1980.

  15. Tourism impacts of Three Mile Island and other adverse events: Implications for Lincoln County and other rural counties bisected by radioactive wastes intended for Yucca Mountain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Himmelberger, J.J.; Ogneva-Himmelberger, Y.A.; Baughman, M.

    Whether the proposed Yucca Mountain nuclear waste repository system will adversely impact tourism in southern Nevada is an open question of particular importance to visitor-oriented rural counties bisected by planned waste transportation corridors (highway or rail). As part of one such county's repository impact assessment program, tourism implications of Three Mile Island (TMI) and other major hazard events have been revisited to inform ongoing county-wide socioeconomic assessments and contingency planning efforts. This paper summarizes key implications of this research as applied to Lincoln County, Nevada. Implications for other rural counties are discussed in light of the research findings. 29 refs., 3 figs., 1 tab.

  16. A clinical data repository enhances hospital infection control.

    PubMed Central

    Samore, M.; Lichtenberg, D.; Saubermann, L.; Kawachi, C.; Carmeli, Y.

    1997-01-01

    We describe the benefits of a relational database of hospital clinical data (Clinical Data Repository; CDR) for an infection control program. The CDR consists of > 40 Sybase tables, and is directly accessible for ad hoc queries by members of the infection control unit who have been granted access privileges by the Information Systems Department. The data elements and functional requirements most useful for surveillance of nosocomial infections, antibiotic use, and resistant organisms are characterized. Specific applications of the CDR are presented, including the use of automated definitions of nosocomial infection, graphical monitoring of resistant organisms with quality control limits, and prospective detection of inappropriate antibiotic use. Hospital surveillance and quality improvement activities benefit significantly from the availability of a queryable set of tables containing diverse clinical data. PMID:9357588
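
    The "quality control limits" used here for monitoring resistant organisms suggest Shewhart-style control charting over periodic counts. A minimal sketch in Python, assuming simple 3-sigma limits computed from historical monthly isolate counts; the data and function names are hypothetical, not drawn from the CDR itself.

```python
import statistics

def control_limits(history, sigmas=3):
    """Shewhart-style limits from historical per-period counts:
    returns (lower, center, upper) = mean -/+ sigmas * population std dev."""
    center = statistics.mean(history)
    spread = sigmas * statistics.pstdev(history)
    return center - spread, center, center + spread

def flag_exceedances(counts, history):
    """Indices of periods whose count exceeds the upper control limit;
    for resistant-organism surveillance, only upward excursions matter."""
    _, _, upper = control_limits(history)
    return [i for i, count in enumerate(counts) if count > upper]
```

    For example, with a stable history of 4-6 isolates per month, a month with 12 isolates falls well above the upper limit and would be flagged for investigation, mirroring the graphical monitoring the abstract describes.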

  17. Microstructural and mineralogical characterization of selected shales in support of nuclear waste repository studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S.Y.; Hyder, L.K.; Alley, P.D.

    1988-01-01

    Five shales were examined as part of the Sedimentary Rock Program evaluation of this medium as a potential host for a US civilian nuclear waste repository. The units selected for characterization were the Chattanooga Shale from Fentress County, Tennessee; the Pierre Shale from Gregory County, South Dakota; the Green River Formation from Garfield County, Colorado; and the Nolichucky Shale and Pumpkin Valley Shale from Roane County, Tennessee. The micromorphology and structure of the shales were examined by petrographic, scanning electron, and high-resolution transmission electron microscopy. Chemical and mineralogical compositions were studied through the use of energy-dispersive x-ray, neutron activation, atomic absorption, thermal, and x-ray diffraction analysis techniques. 18 refs., 12 figs., 2 tabs.

  18. Cryopreservation in fish: current status and pathways to quality assurance and quality control in repository development

    PubMed Central

    Torres, Leticia; Hu, E.; Tiersch, Terrence R.

    2017-01-01

    Cryopreservation in aquatic species in general has been constrained to research activities for more than 60 years. Although the need for application and commercialisation pathways has become clear, the lack of comprehensive quality assurance and quality control programs has impeded the progress of the field, delaying the establishment of germplasm repositories and commercial-scale applications. In this review we focus on the opportunities for standardisation in the practices involved in the four main stages of the cryopreservation process: (1) source, housing and conditioning of fish; (2) sample collection and preparation; (3) freezing and cryogenic storage of samples; and (4) egg collection and use of thawed sperm samples. In addition, we introduce some key factors that would assist the transition to commercial-scale, high-throughput application. PMID:26739583

  19. Information Warehouse – A Comprehensive Informatics Platform for Business, Clinical, and Research Applications

    PubMed Central

    Kamal, Jyoti; Liu, Jianhua; Ostrander, Michael; Santangelo, Jennifer; Dyta, Ravi; Rogers, Patrick; Mekhjian, Hagop S.

    2010-01-01

    Since its inception in 1997, the IW (Information Warehouse) at the Ohio State University Medical Center (OSUMC) has gradually transformed itself from a single-purpose business decision support system to a comprehensive informatics platform supporting basic, clinical, and translational research. The IW today is the combination of four integrated components: a clinical data repository containing over a million patients; a research data repository housing various research-specific data; an application development platform for building business- and research-enabling applications; and a business intelligence environment supporting reporting across all functional areas. The IW is structured and encoded using standard terminologies such as SNOMED-CT, ICD, and CPT. The IW is an important component of OSUMC’s Clinical and Translational Science Award (CTSA) informatics program. PMID:21347019

  20. Variable thickness transient ground-water flow model. Volume 3. Program listings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reisenauer, A.E.

    1979-12-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This is the third of three volumes describing the VTT (Variable Thickness Transient) Groundwater Hydrologic Model: a second-level (intermediate complexity) two-dimensional saturated groundwater flow model.

  1. Technology for NPP decantate treatment realized at Kola NPP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stakhiv, Michael; Avezniyazov, Slava; Savkin, Alexander

    2007-07-01

    At Moscow SIA 'Radon', jointly with JSC 'Alliance Gamma', a technology for NPP decantate treatment was developed, tested, and implemented at Kola NPP. The technology consists of dissolving the salt residue, treating the solution by ozonization, separating the deposits formed during ozonization, and selectively cleaning the solution with ferro-cyanide sorbents. The non-active salt solution goes to an industrial waste disposal site or to a repository specially developed at NPP sites for 'exempt waste' products under the IAEA classification. The technology was put into operation at Kola NPP in December 2006. Since then, more than 1000 m³ of long-stored decantates have been treated. This solves the long-standing problem of emptying decantate tanks at NPPs in an environmentally safe manner and with a high volume reduction factor. (authors)

  2. NCBI2RDF: Enabling Full RDF-Based Access to NCBI Databases

    PubMed Central

    Anguita, Alberto; García-Remesal, Miguel; de la Iglesia, Diana; Maojo, Victor

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting the query results to users in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves biomedical researchers time and effort. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments. PMID:23984425
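    The last step of the decomposition described above, turning a resolved sub-query into an E-utilities call, can be sketched as follows (the query-to-database mapping shown is illustrative, not the actual NCBI2RDF implementation):

```python
from urllib.parse import urlencode

# Base URL of the real NCBI E-utilities esearch endpoint.
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def esearch_url(db: str, term: str, retmax: int = 20) -> str:
    """Build an esearch request URL for one decomposed sub-query."""
    return EUTILS + "?" + urlencode(
        {"db": db, "term": term, "retmax": retmax, "retmode": "json"})

# A SPARQL triple pattern over the (hypothetical) pubmed graph might
# ultimately be forwarded as:
url = esearch_url("pubmed", "BRCA1[Gene] AND human[Organism]")
print(url)
```

    The gateway would issue many such requests, one per decomposed sub-query, and merge the results into a single SPARQL result set.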

  3. A cloud-based information repository for bridge monitoring applications

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Zhang, Yilan; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2016-04-01

    This paper describes an information repository to support bridge monitoring applications on a cloud computing platform. Bridge monitoring, particularly with instrumented sensors, collects a significant amount of data. In addition to sensor data, a wide variety of information, such as bridge geometry, analysis models, and sensor descriptions, needs to be stored. Data management plays an important role in facilitating data utilization and data sharing. While bridge information modeling (BrIM) technologies and standards have been proposed and provide a means to enable integration and facilitate interoperability, current BrIM standards cover mostly information about bridge geometry. In this study, we extend the BrIM schema to include analysis models and sensor information. Specifically, using the OpenBrIM standards as the base, we draw on CSI Bridge, a commercial software package widely used for bridge analysis and design, and SensorML, a standard schema for sensor definition, to define the data entities necessary for bridge monitoring applications. NoSQL database systems are employed for the data repository. A cloud service infrastructure is deployed to enhance the scalability, flexibility, and accessibility of the data management system. The data model and systems are tested using the bridge model and the sensor data collected at the Telegraph Road Bridge, Monroe, Michigan.
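    A document-style record of the sort a NoSQL repository would hold for such a system might look like the sketch below (the field names are illustrative assumptions, not the OpenBrIM or SensorML element names):

```python
import json

# One self-describing monitoring record combining bridge geometry,
# analysis-model, and sensor information alongside the time series itself.
sensor_doc = {
    "bridge": {"name": "Telegraph Road Bridge", "location": "Monroe, MI"},
    "sensor": {"id": "ACC-01", "type": "accelerometer", "units": "g",
               "position": {"girder": 3, "station_m": 12.5}},
    "analysis_model": {"tool": "CSI Bridge", "model_id": "TRB-v2"},
    "timeseries": [{"t": 0.00, "v": 0.012}, {"t": 0.01, "v": 0.015}],
}

# A schemaless (NoSQL-style) store can persist the record as-is; here we
# round-trip it through JSON to show that it carries its own structure.
stored = json.dumps(sensor_doc)
restored = json.loads(stored)
assert restored["sensor"]["id"] == "ACC-01"
```

    Because each document carries its own structure, new sensor types or model attributes can be added without a schema migration.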

  4. Semantic Web repositories for genomics data using the eXframe platform.

    PubMed

    Merrill, Emily; Corlosquet, Stéphane; Ciccarese, Paolo; Clark, Tim; Das, Sudeshna

    2014-01-01

    With the advent of inexpensive assay technologies, there has been an unprecedented growth in genomics data as well as the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse, or integration with other knowledge bases, very difficult. To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second-generation model now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in SPARQL (SPARQL Protocol and RDF Query Language) endpoint. Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate them with heterogeneous resources, and make them interoperable with the vast Semantic Web of biomedical knowledge.

  5. Designing for Change: Interoperability in a scaling and adapting environment

    NASA Astrophysics Data System (ADS)

    Yarmey, L.

    2015-12-01

    The Earth Science cyberinfrastructure landscape is constantly changing. Technologies advance and technical implementations are refined or replaced. Data types, volumes, packaging, and use cases evolve. Scientific requirements emerge and mature. Standards shift while systems scale and adapt. In this complex and dynamic environment, interoperability remains a critical component of successful cyberinfrastructure. Through the resource- and priority-driven iterations on systems, interfaces, and content, questions fundamental to stable and useful Earth Science cyberinfrastructure arise. For instance, how are sociotechnical changes planned, tracked, and communicated? How should operational stability balance against 'new and shiny'? How can ongoing maintenance and mitigation of technical debt be managed in an often short-term resource environment? The Arctic Data Explorer is a metadata brokering application developed to enable discovery of international, interdisciplinary Arctic data across distributed repositories. Completely dependent on interoperable third party systems, the Arctic Data Explorer publicly launched in 2013 with an original 3000+ data records from four Arctic repositories. Since then the search has scaled to 25,000+ data records from thirteen repositories at the time of writing. In the final months of original project funding, priorities shift to lean operations with a strategic eye on the future. Here we present lessons learned from four years of Arctic Data Explorer design, development, communication, and maintenance work along with remaining questions and potential directions.

  6. A proposed concept for a crustal dynamics information management network

    NASA Technical Reports Server (NTRS)

    Lohman, G. M.; Renfrow, J. T.

    1980-01-01

    The findings of a requirements and feasibility analysis of the present and potential producers, users, and repositories of space-derived geodetic information are summarized. A proposed concept is presented for a crustal dynamics information management network that would apply state of the art concepts of information management technology to meet the expanding needs of the producers, users, and archivists of this geodetic information.

  7. Cryptanalysis of the Sodark Family of Cipher Algorithms

    DTIC Science & Technology

    2017-09-01

    ...software project for building three-bit LUT circuit representations of S-boxes is available as a GitHub repository [40]. It contains several improvements... The... second- and third-generation automatic link establishment (ALE) systems for high frequency radios. Radios utilizing ALE technology are in use by a...

  8. Biomedical Data Sharing and Reuse: Attitudes and Practices of Clinical and Scientific Research Staff.

    PubMed

    Federer, Lisa M; Lu, Ya-Ling; Joubert, Douglas J; Welsh, Judith; Brandys, Barbara

    2015-01-01

    Significant efforts are underway within the biomedical research community to encourage sharing and reuse of research data in order to enhance research reproducibility and enable scientific discovery. While some technological challenges do exist, many of the barriers to sharing and reuse are social in nature, arising from researchers' concerns about and attitudes toward sharing their data. In addition, clinical and basic science researchers face their own unique sets of challenges to sharing data within their communities. This study investigates these differences in experiences with and perceptions about sharing data, as well as barriers to sharing among clinical and basic science researchers. Clinical and basic science researchers in the Intramural Research Program at the National Institutes of Health were surveyed about their attitudes toward and experiences with sharing and reusing research data. Of 190 respondents to the survey, the 135 respondents who identified themselves as clinical or basic science researchers were included in this analysis. Odds ratio and Fisher's exact tests were the primary methods to examine potential relationships between variables. Worst-case scenario sensitivity tests were conducted when necessary. While most respondents considered data sharing and reuse important to their work, they generally rated their expertise as low. Sharing data directly with other researchers was common, but most respondents did not have experience with uploading data to a repository. A number of significant differences exist between the attitudes and practices of clinical and basic science researchers, including their motivations for sharing, their reasons for not sharing, and the amount of work required to prepare their data. Even within the scope of biomedical research, addressing the unique concerns of diverse research communities is important to encouraging researchers to share and reuse data. Efforts at promoting data sharing and reuse should be aimed at solving not only technological problems, but also addressing researchers' concerns about sharing their data. Given the varied practices of individual researchers and research communities, standardizing data practices like data citation and repository upload could make sharing and reuse easier.

  9. Preservation Environments

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.

    2004-01-01

    The long-term preservation of digital entities requires mechanisms to manage the authenticity of massive data collections that are written to archival storage systems. Preservation environments impose authenticity constraints and manage the evolution of storage system technology by building infrastructure-independent solutions. This seeming paradox, the need for large archives while avoiding dependence upon vendor-specific solutions, is resolved through use of data grid technology. Data grids provide the storage repository abstractions that make it possible to migrate collections between vendor-specific products, while ensuring the authenticity of the archived data. Data grids provide the software infrastructure that interfaces vendor-specific storage archives to preservation environments.

  10. Improving accessibility to mathematical formulas: the Wikipedia Math Accessor

    NASA Astrophysics Data System (ADS)

    Fuentes Sepúlveda, J.; Ferres, L.

    2012-09-01

    Mathematics accessibility is an important topic for inclusive education. In this paper, we make Wikipedia's repository of mathematical formulas accessible by providing a natural language description of its more than 420,000 formulas using a well-researched sub-language. We also contribute by targeting Spanish speakers, for whom assistive technologies, particularly domain-specific technologies like the one described here, are scarce. Our focus on the semantics of formulas (rather than their visual appearance) allowed us to generate verbalizations, approximately 80% of which were judged understandable in an evaluation with sighted users.
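    A toy verbalizer conveys the idea of mapping formula semantics, rather than visual appearance, to natural language (the grammar fragment below is an illustrative assumption, not the paper's sub-language, and uses English in place of Spanish):

```python
import re

def verbalize(latex: str) -> str:
    """Describe a very small subset of LaTeX expressions in words."""
    # A fraction is described by its role (numerator over denominator),
    # not by how it is drawn.
    frac = re.fullmatch(r"\\frac\{(\w+)\}\{(\w+)\}", latex.strip())
    if frac:
        return f"the fraction {frac.group(1)} over {frac.group(2)}"
    # Simple exponentiation, with or without braces.
    sup = re.fullmatch(r"(\w+)\^\{?(\w+)\}?", latex.strip())
    if sup:
        return f"{sup.group(1)} raised to the power {sup.group(2)}"
    return latex  # fall back to the raw input for unsupported forms

print(verbalize(r"\frac{a}{b}"))  # the fraction a over b
print(verbalize("x^2"))           # x raised to the power 2
```

    A production system would of course need a full parser and a carefully designed sub-language rather than two regular expressions.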

  11. Recommendation of standardized health learning contents using archetypes and semantic web technologies.

    PubMed

    Legaz-García, María del Carmen; Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2012-01-01

    Linking Electronic Healthcare Record (EHR) content to educational materials has been considered a key international recommendation to enable clinical engagement and to promote patient safety. It would help citizens access reliable information available on the web and guide them properly. In this paper, we describe an approach in that direction, based on the use of dual-model EHR standards and standardized educational contents. The recommendation method is based on the semantic coverage of the learning content repository for a particular archetype, which is calculated by applying Semantic Web technologies such as ontologies and semantic annotations.

  12. Digital data preservation for scholarly publications in astronomy

    NASA Astrophysics Data System (ADS)

    Choudhury, Sayeed; di Lauro, Tim; Szalay, Alex; Vishniac, Ethan; Hanisch, Robert; Steffen, Julie; Milkey, Robert; Ehling, Teresa; Plante, Ray

    2007-11-01

    Astronomy is similar to other scientific disciplines in that scholarly publication relies on the presentation and interpretation of data. But although astronomy now has archives for its primary research telescopes and associated surveys, the highly processed data that is presented in the peer-reviewed journals and is the basis for final analysis and interpretation is generally not archived and has no permanent repository. We have initiated a project whose goal is to implement an end-to-end prototype system which, through a partnership of a professional society, that society's scholarly publications/publishers, research libraries, and an information technology substrate provided by the Virtual Observatory, will capture high-level digital data as part of the publication process and establish a distributed network of curated, permanent data repositories. The data in this network will be accessible through the research journals, astronomy data centers, and Virtual Observatory data discovery portals.

  13. The Geothermal Data Repository: Five Years of Open Geothermal Data, Benefits to the Community: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weers, Jonathan D; Taverna, Nicole; Anderson, Arlene

    In the five years since its inception, the Department of Energy's (DOE) Geothermal Data Repository (GDR) has grown from the simple idea of storing public data in a centralized location to a valuable tool at the center of the DOE open data movement, where it provides a tangible benefit to the geothermal scientific community. Throughout this time, the GDR project team has been working closely with the community to refine the data submission process, improve the quality of submitted data, and embrace modern data management strategies to maximize the value and utility of submitted data. This paper explores some of the motivations behind various improvements to the GDR over the last 5 years, changes in data submission trends, and the ways in which these improvements have helped to drive research, fuel innovation, and accelerate the adoption of geothermal technologies.

  14. A Routing Mechanism for Cloud Outsourcing of Medical Imaging Repositories.

    PubMed

    Godinho, Tiago Marques; Viana-Ferreira, Carlos; Bastião Silva, Luís A; Costa, Carlos

    2016-01-01

    Web-based technologies have been increasingly used in picture archive and communication systems (PACS), in services related to storage, distribution, and visualization of medical images. Nowadays, many healthcare institutions are outsourcing their repositories to the cloud. However, managing communications between multiple geo-distributed locations is still challenging due to the complexity of dealing with huge volumes of data and bandwidth requirements. Moreover, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. In order to improve the performance of distributed medical imaging networks, a smart routing mechanism was developed. This includes an innovative cache system based on splitting and dynamic management of Digital Imaging and Communications in Medicine (DICOM) objects. The proposed solution was successfully deployed in a regional PACS archive. The results obtained proved that it is better than conventional approaches, as it reduces remote access latency and also the required cache storage space.
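    The splitting-plus-caching idea can be sketched as follows (the chunk size and LRU eviction policy are illustrative assumptions, not the paper's exact design):

```python
from collections import OrderedDict

CHUNK = 4  # bytes per chunk (tiny, for demonstration only)

def split(obj_id: str, payload: bytes):
    """Yield (key, chunk) pairs for one stored imaging object."""
    for i in range(0, len(payload), CHUNK):
        yield f"{obj_id}/{i // CHUNK}", payload[i:i + CHUNK]

class LRUCache:
    """Fixed-capacity cache that evicts the least recently used chunk."""
    def __init__(self, capacity: int):
        self.capacity, self.d = capacity, OrderedDict()
    def put(self, key, value):
        self.d[key] = value
        self.d.move_to_end(key)
        while len(self.d) > self.capacity:
            self.d.popitem(last=False)  # drop least recently used
    def get(self, key):
        if key in self.d:
            self.d.move_to_end(key)
            return self.d[key]
        return None  # cache miss -> would fetch from the remote archive

cache = LRUCache(capacity=3)
for key, chunk in split("study1/image9", b"0123456789abcdef"):
    cache.put(key, chunk)
print(list(cache.d))  # only the 3 most recently inserted chunk keys remain
```

    Caching at chunk rather than whole-object granularity is what lets frequently viewed slices stay local while cold data remains in the cloud.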

  15. A RESTful image gateway for multiple medical image repositories.

    PubMed

    Valente, Frederico; Viana-Ferreira, Carlos; Costa, Carlos; Oliveira, José Luis

    2012-05-01

    Mobile technologies are increasingly important components in telemedicine systems and are becoming powerful decision support tools. Universal access to data may already be achieved by resorting to the latest generation of tablet devices and smartphones. However, the protocols employed for communicating with image repositories are not suited to exchange data with mobile devices. In this paper, we present an extensible approach to solving the problem of querying and delivering data in a format that is suitable for the bandwidth and graphic capacities of mobile devices. We describe a three-tiered component-based gateway that acts as an intermediary between medical applications and a number of Picture Archiving and Communication Systems (PACS). The interface with the gateway is accomplished using Hypertext Transfer Protocol (HTTP) requests following a Representational State Transfer (REST) methodology, which relieves developers from dealing with complex medical imaging protocols and allows the processing of data on the server side.

  17. The development of digital library system for drug research information.

    PubMed

    Kim, H J; Kim, S R; Yoo, D S; Lee, S H; Suh, O K; Cho, J H; Shin, H T; Yoon, J P

    1998-01-01

    The sophistication of computer technology and information transmission over the Internet has made various cyber information repositories available to information consumers. In the era of the information superhighway, a digital library that can be accessed from remote sites at any time is considered the prototype of an information repository. Using an object-oriented DBMS, the first model of a digital library for pharmaceutical researchers and related professionals in Korea has been developed. Published research papers and researchers' personal information were included in the database. For the research-paper database, 13 domestic journals were abstracted and scanned into full-text image files that can be viewed in Internet web browsers. The database of researchers' personal information was also developed and interlinked to the database of research papers. These databases will be continuously updated and will be combined with worldwide information as a unique digital library in the field of pharmacy.

  18. The RAND Online Measure Repository for Evaluating Psychological Health and Traumatic Brain Injury Programs. The RAND Toolkit, Volume 2

    DTIC Science & Technology

    2014-01-01

    ...tempo may raise the risk for mental health challenges. During this time, the U.S. Department of Defense (DoD) has implemented numerous programs to... and were based on the constraints of each electronic database. However, most searches were variations on a basic three-category format: The first...

  19. Site characterization report for the basalt waste isolation project. Volume II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1982-11-01

    The reference location for a repository in basalt for the terminal storage of nuclear wastes on the Hanford Site and the candidate horizons within this reference repository location have been identified, and the preliminary characterization work in support of the site screening process has been completed. Fifteen technical questions regarding the qualification of the site were identified to be addressed during the detailed site characterization phase of the US Department of Energy-National Waste Terminal Storage Program site selection process. Resolution of these questions will be provided in the final site characterization progress report, currently planned to be issued in 1987, and in the safety analysis report to be submitted with the License Application. The additional information needed to resolve these questions and the plans for obtaining the information have been identified. This Site Characterization Report documents the results of the site screening process, the preliminary site characterization data, the technical issues that need to be addressed, and the plans for resolving these issues. Volume 2 contains chapters 6 through 12: geochemistry; surface hydrology; climatology, meteorology, and air quality; environmental, land-use, and socioeconomic characteristics; repository design; waste package; and performance assessment.

  20. Development of a Korean reference HLW disposal system under the Korean representative geologic conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Heui-Joo; Lee, Jong Youl; Choi, Jongwon

    2007-07-01

    The development of a Korean Reference Disposal System for the spent fuels from PWR and CANDU reactors is outlined in this paper. Around 36,000 tU of spent fuel is projected based on the lifetimes of 28 nuclear power reactors in Korea. Since the site for geological disposal has not yet been decided, a hypothetical site with representative Korean geologic conditions is proposed for the conceptual design of the repository. The disposal rates of the spent fuels are determined according to the total operation time of 55 years. The canisters are optimized by considering natural Korean conditions, and the buffer is designed with domestic Ca-bentonite. The depth of the repository is set at 500 m below the ground surface. The canister separation distances are determined through a thermal analysis. The main features of the repository are presented, from the layout to the closure. A computer program has been developed to calculate and analyze the volume and the area of the disposal system to support the cost analysis. The final output of the design is presented as a unit disposal cost of US$315/kgU. (authors)
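    The layout arithmetic such a program performs can be sketched in a few lines (every parameter below other than the 36,000 tU inventory quoted in the abstract is an illustrative assumption, not the paper's design value):

```python
# Back-of-the-envelope repository footprint from canister counts and
# separation distances -- the kind of quantity fed into a cost analysis.
TONNES_U = 36_000        # projected spent-fuel inventory (tU), from the abstract
TU_PER_CANISTER = 2.0    # assumed loading per canister (tU)
TUNNEL_SPACING_M = 40.0  # assumed separation between disposal tunnels (m)
PITCH_M = 6.0            # assumed canister pitch along a tunnel (m)

canisters = TONNES_U / TU_PER_CANISTER
# Each canister claims one tunnel-spacing x pitch cell of plan area.
area_m2 = canisters * TUNNEL_SPACING_M * PITCH_M
print(f"{canisters:.0f} canisters, ~{area_m2 / 1e6:.1f} km^2 repository footprint")
```

    The real program additionally tracks excavated volumes per tunnel and shaft, since excavation dominates the unit disposal cost.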

  1. Wisconsin’s Environmental Public Health Tracking Network: Information Systems Design for Childhood Cancer Surveillance

    PubMed Central

    Hanrahan, Lawrence P.; Anderson, Henry A.; Busby, Brian; Bekkedal, Marni; Sieger, Thomas; Stephenson, Laura; Knobeloch, Lynda; Werner, Mark; Imm, Pamela; Olson, Joseph

    2004-01-01

    In this article we describe the development of an information system for environmental childhood cancer surveillance. The Wisconsin Cancer Registry annually receives more than 25,000 incident case reports. Approximately 269 cases per year involve children. Over time, there has been considerable community interest in understanding the role the environment plays as a cause of these cancer cases. Wisconsin’s Public Health Information Network (WI-PHIN) is a robust web portal integrating both Health Alert Network and National Electronic Disease Surveillance System components. WI-PHIN is the information technology platform for all public health surveillance programs. Functions include the secure, automated exchange of cancer case data between public health–based and hospital-based cancer registrars; web-based supplemental data entry for environmental exposure confirmation and hypothesis testing; automated data analysis, visualization, and exposure–outcome record linkage; directories of public health and clinical personnel for role-based access control of sensitive surveillance information; public health information dissemination and alerting; and information technology security and critical infrastructure protection. For hypothesis generation, cancer case data are sent electronically to WI-PHIN and populate the integrated data repository. Environmental data are linked and the exposure–disease relationships are explored using statistical tools for ecologic exposure risk assessment. For hypothesis testing, case–control interviews collect exposure histories, including parental employment and residential histories. This information technology approach can thus serve as the basis for building a comprehensive system to assess environmental cancer etiology. PMID:15471739
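    The exposure-outcome record linkage step described above can be sketched as a simple join on residential geography (the field names and values below are hypothetical):

```python
# Cancer case records from the registry and environmental exposure
# records keyed by census tract; linkage joins them by geography.
cases = [
    {"case_id": "C1", "diagnosis": "leukemia", "census_tract": "55025-0012"},
    {"case_id": "C2", "diagnosis": "lymphoma", "census_tract": "55025-0099"},
]
exposures = {
    "55025-0012": {"atrazine_ug_l": 2.4, "radon_pci_l": 5.1},
}

# Merge exposure fields into each case; tracts with no environmental
# record simply contribute no exposure fields.
linked = [
    {**case, **exposures.get(case["census_tract"], {})}
    for case in cases
]
print(linked[0]["atrazine_ug_l"])  # 2.4
```

    Ecologic risk assessment then proceeds on the linked records, with case-control interviews supplying individual-level exposure histories for hypothesis testing.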

  2. Genome Maps, a new generation genome browser.

    PubMed

    Medina, Ignacio; Salavert, Francisco; Sanchez, Rubén; de Maria, Alejandro; Alonso, Roberto; Escobar, Pablo; Bleda, Marta; Dopazo, Joaquín

    2013-07-01

    Genome browsers have gained importance as more genomes and related genomic information become available. However, the increase of information brought about by new generation sequencing technologies is, at the same time, causing a subtle but continuous decrease in the efficiency of conventional genome browsers. Here, we present Genome Maps, a genome browser that implements an innovative model of data transfer and management. The program uses highly efficient technologies from the new HTML5 standard, such as scalable vector graphics, that optimize workloads at both server and client sides and ensure future scalability. Thus, data management and representation are entirely carried out by the browser, without the need of any Java Applet, Flash or other plug-in technology installation. Relevant biological data on genes, transcripts, exons, regulatory features, single-nucleotide polymorphisms, karyotype and so forth, are imported from web services and are available as tracks. In addition, several DAS servers are already included in Genome Maps. As a novelty, this web-based genome browser allows the local upload of huge genomic data files (e.g. VCF or BAM) that can be dynamically visualized in real time at the client side, thus facilitating the management of medical data affected by privacy restrictions. Finally, Genome Maps can easily be integrated in any web application by including only a few lines of code. Genome Maps is an open source collaborative initiative available in the GitHub repository (https://github.com/compbio-bigdata-viz/genome-maps). Genome Maps is available at: http://www.genomemaps.org.

  3. Genome Maps, a new generation genome browser

    PubMed Central

    Medina, Ignacio; Salavert, Francisco; Sanchez, Rubén; de Maria, Alejandro; Alonso, Roberto; Escobar, Pablo; Bleda, Marta; Dopazo, Joaquín

    2013-01-01

    Genome browsers have gained importance as more genomes and related genomic information become available. However, the increase of information brought about by new generation sequencing technologies is, at the same time, causing a subtle but continuous decrease in the efficiency of conventional genome browsers. Here, we present Genome Maps, a genome browser that implements an innovative model of data transfer and management. The program uses highly efficient technologies from the new HTML5 standard, such as scalable vector graphics, that optimize workloads at both server and client sides and ensure future scalability. Thus, data management and representation are entirely carried out by the browser, without the need of any Java Applet, Flash or other plug-in technology installation. Relevant biological data on genes, transcripts, exons, regulatory features, single-nucleotide polymorphisms, karyotype and so forth, are imported from web services and are available as tracks. In addition, several DAS servers are already included in Genome Maps. As a novelty, this web-based genome browser allows the local upload of huge genomic data files (e.g. VCF or BAM) that can be dynamically visualized in real time at the client side, thus facilitating the management of medical data affected by privacy restrictions. Finally, Genome Maps can easily be integrated in any web application by including only a few lines of code. Genome Maps is an open source collaborative initiative available in the GitHub repository (https://github.com/compbio-bigdata-viz/genome-maps). Genome Maps is available at: http://www.genomemaps.org. PMID:23748955

  4. IMPLEMENTATION AND OPERATION OF THE REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcus Milling

    2003-10-01

The NGDRS has facilitated the transfer of 85% of the cores, cuttings, and other data identified as available for transfer to the public sector. Over 12 million linear feet of cores and cuttings, in addition to large numbers of paleontological samples, are now available for public use. To date, with industry contributions for program operations and data transfers, the NGDRS project has realized a 6.5 to 1 return on investment to Department of Energy funds. Large-scale transfers of seismic data have been evaluated, but based on the recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of the seismic data problem relative to the available funding. The rapidly changing industry conditions have required that the primary core and cuttings preservation strategy evolve as well. Additionally, the NGDRS clearinghouse is evaluating the viability of transferring seismic data covering the western shelf of the Florida Gulf Coast. AGI remains actively involved in working to realize the vision of the National Research Council's report on geoscience data preservation. GeoTrek has been ported to Linux and MySQL, ensuring a purely open-source version of the software. This effort is key to ensuring the long-term viability of the software so that it can continue basic operation regardless of specific funding levels. Work has begun on a major revision of GeoTrek, using the open-source MapServer project and its related MapScript language. This effort will address a number of key technology issues that appear to be arising for 2003, including the discontinuation of the use of Java in future Microsoft operating systems. The recent donation of BPAmoco's Houston core facility to the Texas Bureau of Economic Geology has provided substantial short-term relief of the constraints on public repository space.

  5. Exploratory Application of Augmented Reality/Mixed Reality Devices for Acute Care Procedure Training.

    PubMed

    Kobayashi, Leo; Zhang, Xiao Chi; Collins, Scott A; Karim, Naz; Merck, Derek L

    2018-01-01

    Augmented reality (AR), mixed reality (MR), and virtual reality devices are enabling technologies that may facilitate effective communication in healthcare between those with information and knowledge (clinician/specialist; expert; educator) and those seeking understanding and insight (patient/family; non-expert; learner). Investigators initiated an exploratory program to enable the study of AR/MR use-cases in acute care clinical and instructional settings. Academic clinician educators, computer scientists, and diagnostic imaging specialists conducted a proof-of-concept project to 1) implement a core holoimaging pipeline infrastructure and open-access repository at the study institution, and 2) use novel AR/MR techniques on off-the-shelf devices with holoimages generated by the infrastructure to demonstrate their potential role in the instructive communication of complex medical information. The study team successfully developed a medical holoimaging infrastructure methodology to identify, retrieve, and manipulate real patients' de-identified computed tomography and magnetic resonance imagesets for rendering, packaging, transfer, and display of modular holoimages onto AR/MR headset devices and connected displays. Holoimages containing key segmentations of cervical and thoracic anatomic structures and pathology were overlaid and registered onto physical task trainers for simulation-based "blind insertion" invasive procedural training. During the session, learners experienced and used task-relevant anatomic holoimages for central venous catheter and tube thoracostomy insertion training with enhanced visual cues and haptic feedback. Direct instructor access into the learner's AR/MR headset view of the task trainer was achieved for visual-axis interactive instructional guidance. 
Investigators implemented a core holoimaging pipeline infrastructure and modular open-access repository to generate and enable access to modular holoimages during exploratory pilot stage applications for invasive procedure training that featured innovative AR/MR techniques on off-the-shelf headset devices.

  6. Exploratory Application of Augmented Reality/Mixed Reality Devices for Acute Care Procedure Training

    PubMed Central

    Kobayashi, Leo; Zhang, Xiao Chi; Collins, Scott A.; Karim, Naz; Merck, Derek L.

    2018-01-01

    Introduction Augmented reality (AR), mixed reality (MR), and virtual reality devices are enabling technologies that may facilitate effective communication in healthcare between those with information and knowledge (clinician/specialist; expert; educator) and those seeking understanding and insight (patient/family; non-expert; learner). Investigators initiated an exploratory program to enable the study of AR/MR use-cases in acute care clinical and instructional settings. Methods Academic clinician educators, computer scientists, and diagnostic imaging specialists conducted a proof-of-concept project to 1) implement a core holoimaging pipeline infrastructure and open-access repository at the study institution, and 2) use novel AR/MR techniques on off-the-shelf devices with holoimages generated by the infrastructure to demonstrate their potential role in the instructive communication of complex medical information. Results The study team successfully developed a medical holoimaging infrastructure methodology to identify, retrieve, and manipulate real patients’ de-identified computed tomography and magnetic resonance imagesets for rendering, packaging, transfer, and display of modular holoimages onto AR/MR headset devices and connected displays. Holoimages containing key segmentations of cervical and thoracic anatomic structures and pathology were overlaid and registered onto physical task trainers for simulation-based “blind insertion” invasive procedural training. During the session, learners experienced and used task-relevant anatomic holoimages for central venous catheter and tube thoracostomy insertion training with enhanced visual cues and haptic feedback. Direct instructor access into the learner’s AR/MR headset view of the task trainer was achieved for visual-axis interactive instructional guidance. 
Conclusion Investigators implemented a core holoimaging pipeline infrastructure and modular open-access repository to generate and enable access to modular holoimages during exploratory pilot stage applications for invasive procedure training that featured innovative AR/MR techniques on off-the-shelf headset devices. PMID:29383074

  7. 17 CFR 49.12 - Swap data repository recordkeeping requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 17 Commodity and Securities Exchanges 1 2012-04-01 2012-04-01 false Swap data repository... COMMISSION SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of part...

  8. 17 CFR 49.12 - Swap data repository recordkeeping requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 17 Commodity and Securities Exchanges 1 2013-04-01 2013-04-01 false Swap data repository... COMMISSION SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of part...

  9. 17 CFR 49.12 - Swap data repository recordkeeping requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 17 Commodity and Securities Exchanges 2 2014-04-01 2014-04-01 false Swap data repository... COMMISSION (CONTINUED) SWAP DATA REPOSITORIES § 49.12 Swap data repository recordkeeping requirements. (a) A registered swap data repository shall maintain its books and records in accordance with the requirements of...

  10. 10 CFR 63.112 - Requirements for preclosure safety analysis of the geologic repository operations area.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... geologic repository operations area. 63.112 Section 63.112 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Technical... repository operations area. The preclosure safety analysis of the geologic repository operations area must...

  11. Managing and Evaluating Digital Repositories

    ERIC Educational Resources Information Center

    Zuccala, Alesia; Oppenheim, Charles; Dhiensa, Rajveen

    2008-01-01

    Introduction: We examine the role of the digital repository manager, discuss the future of repository management and evaluation and suggest that library and information science schools develop new repository management curricula. Method: Face-to-face interviews were carried out with managers of five different types of repositories and a Web-based…

  12. jPOSTrepo: an international standard data repository for proteomes

    PubMed Central

    Okuda, Shujiro; Watanabe, Yu; Moriya, Yuki; Kawano, Shin; Yamamoto, Tadashi; Matsumoto, Masaki; Takami, Tomoyo; Kobayashi, Daiki; Araki, Norie; Yoshizawa, Akiyasu C.; Tabata, Tsuyoshi; Sugiyama, Naoyuki; Goto, Susumu; Ishihama, Yasushi

    2017-01-01

    Major advancements have recently been made in mass spectrometry-based proteomics, yielding an increasing number of datasets from various proteomics projects worldwide. In order to facilitate the sharing and reuse of promising datasets, it is important to construct appropriate, high-quality public data repositories. jPOSTrepo (https://repository.jpostdb.org/) has successfully implemented several unique features, including high-speed file uploading, flexible file management and easy-to-use interfaces. This repository has been launched as a public repository containing various proteomic datasets and is available for researchers worldwide. In addition, our repository has joined the ProteomeXchange consortium, which includes the most popular public repositories such as PRIDE in Europe for MS/MS datasets and PASSEL for SRM datasets in the USA. Later MassIVE was introduced in the USA and accepted into the ProteomeXchange, as was our repository in July 2016, providing important datasets from Asia/Oceania. Accordingly, this repository thus contributes to a global alliance to share and store all datasets from a wide variety of proteomics experiments. Thus, the repository is expected to become a major repository, particularly for data collected in the Asia/Oceania region. PMID:27899654

  13. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    NASA Technical Reports Server (NTRS)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
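The abstract describes CARE as a repository of redundancy-scheme equations that are interrelated under program control and then evaluated with ground instances of their variables. A minimal sketch of that idea, using two standard textbook reliability formulas (illustrative only, not CARE's actual model library):

```python
# Two standard reliability-theoretic functions of the kind a repository
# like CARE's would hold (textbook formulas, not CARE's actual models).

def r_series(reliabilities):
    """A series system works only if every module works."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def r_tmr(r):
    """Triple modular redundancy with a perfect voter: the system works
    if at least 2 of 3 identical modules work:
    R_TMR = 3*R^2*(1 - R) + R^3 = 3*R^2 - 2*R^3."""
    return 3 * r**2 - 2 * r**3

# Supply ground instances of the variables, then evaluate:
print(r_series([0.9, 0.9]))  # ~0.81
print(r_tmr(0.9))            # ~0.972: TMR improves on a single 0.9 module
```

Note that TMR helps only when the module reliability exceeds 0.5; below that, majority voting actually lowers system reliability.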

  14. The Wind and Beyond: A Documentary Journey into the History of Aerodynamics in America. Volume 1; The Ascent of the Airplane

    NASA Technical Reports Server (NTRS)

    Hansen, James R. (Editor); Taylor, D. Bryan; Kinney, Jeremy; Lee, J. Lawrence

    2003-01-01

    This first volume, plus the succeeding five now in preparation, covers the impact of aerodynamic development on the evolution of the airplane in America. As the six-volume series will ultimately demonstrate, just as the airplane is a defining technology of the twentieth century, aerodynamics has been the defining element of the airplane. Volumes two through six will proceed in roughly chronological order, covering such developments as the biplane, the advent of commercial airliners, flying boats, rotary aircraft, supersonic flight, and hypersonic flight. This series is designed as an aeronautics companion to the Exploring the Unknown: Selected Documents in the History of the U.S. Civil Space Program (NASA SP-4407) series of books. As with Exploring the Unknown, the documents collected during this research project were assembled from a diverse number of public and private sources. A major repository of primary source materials relative to the history of the civil space program is the NASA Historical Reference Collection in the NASA Headquarters History Office. Historical materials housed at NASA field centers, academic institutions, and Presidential libraries were other sources of documents considered for inclusion, as were papers in the archives of private individuals and corporations.

  15. Performance Confirmation Data Aquisition System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.W. Markman

    2000-10-27

The purpose of this analysis is to identify and analyze concepts for the acquisition of data in support of the Performance Confirmation (PC) program at the potential subsurface nuclear waste repository at Yucca Mountain. The scope and primary objectives of this analysis are to: (1) Review the criteria for design as presented in the Performance Confirmation Data Acquisition/Monitoring System Description Document, by way of the Input Transmittal, Performance Confirmation Input Criteria (CRWMS M&O 1999c). (2) Identify and describe existing and potential new trends in data acquisition system software and hardware that would support the PC plan. The data acquisition software and hardware will support the field instruments and equipment that will be installed for the observation and perimeter drift borehole monitoring, and in-situ monitoring within the emplacement drifts. The exhaust air monitoring requirements will be supported by a data communication network interface with the ventilation monitoring system database. (3) Identify the concepts and features that a data acquisition system should have in order to support the PC process and its activities. (4) Based on PC monitoring needs and available technologies, further develop concepts of a potential data acquisition system network in support of the PC program and the Site Recommendation and License Application.

  16. Where Will All Your Samples Go?

    NASA Astrophysics Data System (ADS)

    Lehnert, K.

    2017-12-01

    Even in the digital age, physical samples remain an essential component of Earth and space science research. Geoscientists collect samples, sometimes locally, often in remote locations during expensive field expeditions, or at sample repositories and museums. They take these samples to their labs to describe and analyze them. When the analyses are completed and the results are published, the samples get stored away in sheds, basements, or desk drawers, where they remain unknown and inaccessible to the broad science community. In some cases, they will get re-analyzed or shared with other researchers, who know of their existence through personal connections. The sad end comes when the researcher retires: There are many stories of samples and entire collections being discarded to free up space for new samples or other purposes, even though these samples may be unique and irreplaceable. Institutions do not feel obligated and do not have the resources to store samples in perpetuity. Only samples collected in large sampling campaigns such as the Ocean Discovery Program or cores taken on ships find a home in repositories that curate and preserve them for reuse in future science endeavors. In the era of open, transparent, and reproducible science, preservation and persistent access to samples must be considered a mandate. Policies need to be developed that guide investigators, institutions, and funding agencies to plan and implement solutions for reliably and persistently curating and providing access to samples. Registration of samples in online catalogs and use of persistent identifiers such as the International Geo Sample Number are first steps to ensure discovery and access of samples. But digital discovery and access loses its value if the physical objects are not preserved and accessible. It is unreasonable to expect that every sample ever collected can be archived. Selection of those samples that are worth preserving requires guidelines and policies. 
We also need to define standards that institutions must comply with to function as a trustworthy sample repository similar to trustworthy digital repositories. The iSamples Research Coordination Network of the EarthCube program aims to address some of these questions in workshops planned for 2018. This panel session offers an opportunity to ignite the discussion.

  17. Towards a distributed infrastructure for research drilling in Europe

    NASA Astrophysics Data System (ADS)

    Mevel, C.; Gatliff, R.; Ludden, J.; Camoin, G.; Horsfield, B.; Kopf, A.

    2012-04-01

The EC-funded project "Deep Sea and Sub-Seafloor Frontier" (DS3F) aims at developing seafloor and sub-seafloor sampling strategies for enhanced understanding of deep-sea and sub-seafloor processes by connecting marine research in life and geosciences, climate and environmental change, with socio-economic issues and policy building. DS3F has identified access to sub-seafloor sampling and instrumentation as a key element of this approach. There is strong expertise in Europe concerning direct access to the sub-seafloor. Within the international program IODP (Integrated Ocean Drilling Program), ECORD (European Consortium for Ocean Research Drilling) has successfully developed the concept of mission-specific platforms (MSPs), contracted on a project basis to drill in ice-covered and shallow-water areas. The ECORD Science Operator, led by the British Geological Survey (BGS), has built internationally recognized expertise in scientific ocean drilling, from coring in challenging environments, through downhole measurements and laboratory analysis, to core curation and data management. MARUM, at Bremen University in Germany, is one of the three IODP core repositories. Europe is also at the forefront of scientific seabed drills, with the MeBo developed by MARUM as well as the BGS seabed rock drills. Europe also plays an important role in continental scientific drilling, and the European component of ICDP (International Continental Scientific Drilling Program) is strengthening, with the recent addition of France and the foreseen addition of the UK. Oceanic and continental drilling have very similar scientific objectives. Moreover, they share not only common technologies but also common data handling systems. To develop an integrated approach to technology development and usage, a move towards a distributed infrastructure for research drilling in Europe has been initiated by these different groups.
Built on existing research & operational groups across Europe, it will facilitate the sharing of technological and scientific expertise for the benefit of the science community. It will link with other relevant infrastructure initiatives such as EMSO (European Marine Seafloor Observatories). It will raise the profile of scientific drilling in Europe and hopefully lead to better funding opportunities.

  18. Ontology Design Patterns: Bridging the Gap Between Local Semantic Use Cases and Large-Scale, Long-Term Data Integration

    NASA Astrophysics Data System (ADS)

    Shepherd, Adam; Arko, Robert; Krisnadhi, Adila; Hitzler, Pascal; Janowicz, Krzysztof; Chandler, Cyndy; Narock, Tom; Cheatham, Michelle; Schildhauer, Mark; Jones, Matt; Raymond, Lisa; Mickle, Audrey; Finin, Tim; Fils, Doug; Carbotte, Suzanne; Lehnert, Kerstin

    2015-04-01

Integrating datasets for new use cases is one of the common drivers for adopting semantic web technologies. Even though linked data principles enable this type of activity over time, the task of reconciling new ontological commitments for newer use cases can be daunting. This situation was faced by the Biological and Chemical Oceanography Data Management Office (BCO-DMO) as it sought to integrate its existing linked data with other data repositories to address newer scientific use cases as a partner in the GeoLink Project. To achieve a successful integration with other GeoLink partners, BCO-DMO's metadata would need to be described using the new ontologies developed by the GeoLink partners - a situation that could impact semantic inferencing, pre-existing software and external users of BCO-DMO's linked data. This presentation describes the process of how GeoLink is bridging the gap between local, pre-existing ontologies to achieve scientific metadata integration for all its partners through the use of ontology design patterns. GeoLink, an NSF EarthCube Building Block, brings together experts from the geosciences, computer science, and library science in an effort to improve discovery and reuse of data and knowledge. Its participating repositories include content from field expeditions, laboratory analyses, journal publications, conference presentations, theses/reports, and funding awards that span scientific studies from marine geology to marine ecology and biogeochemistry to paleoclimatology. GeoLink's outcomes include a set of reusable ontology design patterns (ODPs) that describe core geoscience concepts, a network of Linked Data published by participating repositories using those ODPs, and tools to facilitate discovery of related content in multiple repositories.
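As a toy illustration of how a shared ontology design pattern lets two repositories describe the same concept compatibly, the sketch below has both emit triples against one invented "Cruise" pattern so that a single query spans them; the identifiers and predicate names are hypothetical, not GeoLink's actual ODPs.

```python
# Toy triple store illustrating pattern-based integration: two
# repositories publish (subject, predicate, object) triples using a
# shared, hypothetical "Cruise" pattern, so one query covers both.

triples = [
    # ...from repository A
    ("bco-dmo:cruise-42", "rdf:type", "odp:Cruise"),
    ("bco-dmo:cruise-42", "odp:hasVessel", "vessel:Knorr"),
    # ...from repository B, committed to the same pattern
    ("r2r:cruise-AT26", "rdf:type", "odp:Cruise"),
    ("r2r:cruise-AT26", "odp:hasVessel", "vessel:Atlantis"),
]

def query(store, predicate, obj):
    """Return subjects of all triples matching (?, predicate, obj)."""
    return [s for s, p, o in store if p == predicate and o == obj]

# One query now discovers cruise records across both repositories:
print(query(triples, "rdf:type", "odp:Cruise"))
```

The design point is that integration cost is paid once, in agreeing on the pattern, rather than per pair of repositories.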

  19. A strategy to seal exploratory boreholes in unsaturated tuff; Yucca Mountain Site Characterization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernandez, J.A.; Case, J.B.; Givens, C.A.

    1994-04-01

This report presents a strategy for sealing exploratory boreholes associated with the Yucca Mountain Site Characterization Project. Over 500 existing and proposed boreholes have been considered in the development of this strategy, ranging from shallow (penetrating into alluvium only) to deep (penetrating into the groundwater table). Among the comprehensive list of recommendations are the following: Those boreholes within the potential repository boundary and penetrating through the potential repository horizon are the most significant boreholes from a performance standpoint and should be sealed. Shallow boreholes are comparatively insignificant and require only nominal sealing. The primary areas in which to place seals are away from high-temperature zones at a distance from the potential repository horizon in the Paintbrush nonwelded tuff and the upper portion of the Topopah Spring Member and in the tuffaceous beds of the Calico Hills Unit. Seals should be placed prior to waste emplacement. Performance goals for borehole seals both above and below the potential repository are proposed. Detailed construction information on the boreholes that could be used for future design specifications is provided along with a description of the environmental setting, i.e., the geology, hydrology, and the in situ and thermal stress states. A borehole classification scheme based on the condition of the borehole wall in different tuffaceous units is also proposed. In addition, calculations are presented to assess the significance of the boreholes acting as preferential pathways for the release of radionuclides. Design calculations are presented to answer the concerns of when, where, and how to seal. As part of the strategy development, available technologies to seal exploratory boreholes (including casing removal, borehole wall reconditioning, and seal emplacement) are reviewed.

  20. Glasses for immobilization of low- and intermediate-level radioactive waste

    NASA Astrophysics Data System (ADS)

    Laverov, N. P.; Omel'yanenko, B. I.; Yudintsev, S. V.; Stefanovsky, S. V.; Nikonov, B. S.

    2013-03-01

Reprocessing of spent nuclear fuel (SNF) for recovery of fissionable elements is a precondition of long-term development of nuclear power. Solution of this problem is hindered by the production of a great amount of liquid waste; 99% of its volume is low- and intermediate-level radioactive waste (LILW). The volume of high-level radioactive waste (HLW), which is characterized by high heat release, does not exceed a fraction of a percent. The solubility of glasses at elevated temperature makes them unfit for immobilization of HLW, the isolation of which is ensured only by mineral-like matrices. At the same time, glasses are a perfect matrix for LILW, which is distinguished by low heat release. The solubility of borosilicate glass at low temperature is so low that even a glass with relatively low resistance can retain the safety of underground LILW repositories without additional engineering barriers. The optimal technology for confining liquid LILW is concentration and immobilization in borosilicate glasses, which are disposed of in shallow-seated geological repositories. The vitrification of 1 m3 of liquid LILW with a salt concentration of ~300 kg/m3 leaves behind only 0.2 m3 of waste, that is, 4-6 times less than bitumen impregnation and 10 times less than cementation. Environmental and economic advantages of LILW vitrification result from (1) low solubility of the vitrified LILW in natural water; (2) significant reduction of LILW volume; (3) the possibility of disposing of the vitrified waste without additional engineering barriers under shallow conditions and in diverse geological media; (4) the strength of glass, which makes its transportation and storage possible; and finally (5) reliable long-term safety of repositories. When the composition of the glass matrix for LILW is chosen, attention should be paid to the factors that ensure high technological and economic efficiency of vitrification.
The study of vitrified LILW from the Kursk nuclear power plant with high-power channel reactors (HPCR; equivalent Russian acronym, RBMK) and the Kalinin nuclear power plant with pressurized water reactors (PWR; equivalent Russian acronym, VVER) after 14 years of storage in the shallow-seated repository at the MosNPO Radon testing ground has confirmed the safety of repositories ensured by the confinement properties of the borosilicate matrix. The most efficient vitrification technology is based on cold crucible induction melting. If the content of a chemical element in the waste exceeds its solubility in glass, a crystalline phase forms in the course of vitrification, so that the resulting glass ceramic becomes the matrix for such waste. Vitrified wastes with high contents of Fe; Na and Al; Na, Fe, and Al; and Na and B are characterized. The composition of the frit and its proportion to the waste depend on the waste composition. This procedure requires careful laboratory testing.
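The volume figures quoted above can be checked with simple arithmetic; the sketch below merely encodes the stated numbers (1 m3 of liquid LILW yielding roughly 0.2 m3 of glass) and derives the comparison factors.

```python
# Back-of-the-envelope check of the volume reduction figures quoted in
# the abstract (all inputs are the stated values, not measurements).

LIQUID_VOLUME_M3 = 1.0   # 1 m^3 of liquid LILW, ~300 kg/m^3 salt content
GLASS_VOLUME_M3 = 0.2    # residual vitrified waste volume

# Bitumen impregnation leaves 4-6x more waste than glass; cementation 10x.
bitumen_low_m3 = 4 * GLASS_VOLUME_M3   # ~0.8 m^3
bitumen_high_m3 = 6 * GLASS_VOLUME_M3  # ~1.2 m^3
cement_m3 = 10 * GLASS_VOLUME_M3       # ~2.0 m^3

reduction_factor = LIQUID_VOLUME_M3 / GLASS_VOLUME_M3
print(f"vitrification shrinks the liquid volume {reduction_factor:.0f}x")
```

So vitrification gives a 5x reduction against the original liquid volume, while cementation would actually double it.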

  1. Good Data Can Be Better Data - How Data Management Maturity Can Help Repositories Improve Operations, Data Quality, And Usability, Helping Researchers

    NASA Astrophysics Data System (ADS)

    Stall, S.

    2015-12-01

Much earth and space science data and metadata are managed and supported by an infrastructure of repositories, ranging from large agency or instrument facilities, to institutions, to smaller repositories including labs. Scientists face many challenges in this ecosystem, both in storing their data and in accessing data from others for new research. Critical for all uses is ensuring the credibility and integrity of the data and conveying that, along with provenance information, now and in the future. Accurate information is essential for future researchers to find (or discover) the data, evaluate the data for use (content, temporal, geolocation, precision) and finally select (or discard) that data as meeting a "fit-for-purpose" criterion. We also need to optimize the effort it takes to describe the data for these determinations, which means making it efficient for the researchers who collect the data. At AGU we are developing a program aimed at helping repositories, and thereby researchers, improve data quality and data usability toward these goals. AGU has partnered with the CMMI Institute to develop their Data Management Maturity (DMM) framework within the Earth and space sciences. The CMMI DMM framework guides best practices in a range of data operations, and the application of the DMM, through an assessment, reveals how repositories and institutions can best optimize efforts to improve operations and functionality throughout the data lifecycle and elevate best practices across a variety of data management operations. Supporting processes like data operations, data governance, and data architecture are included. An assessment involves identifying accomplishments and weaknesses compared to leading practices for data management. Broad application of the DMM can help improve quality in data and operations, and consistency across the community that will facilitate interoperability, discovery, preservation, and reuse. Good data can be better data, and consistency results in sustainability.

  2. Granite disposal of U.S. high-level radioactive waste.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, Geoffrey A.; Mariner, Paul E.; Lee, Joon H.

    This report evaluates the feasibility of disposing of U.S. high-level radioactive waste in granite several hundred meters below the surface of the earth. The U.S. has many granite formations with positive attributes for permanent disposal. Similar crystalline formations have been extensively studied by international programs, two of which, in Sweden and Finland, are the host rocks of submitted or imminent repository license applications. This report is enabled by the advanced work of the international community to establish functional and operational requirements for disposal of a range of waste forms in granite media. In this report we develop scoping performance analyses, based on the applicable features, events, and processes (FEPs) identified by international investigators, to support generic conclusions regarding post-closure safety. Unlike the safety analyses for disposal in salt, shale/clay, or deep boreholes, the safety analysis for a mined granite repository depends largely on waste package preservation. In crystalline rock, waste packages are preserved by the high mechanical stability of the excavations, the diffusive barrier of the buffer, and favorable chemical conditions. The buffer is preserved by low groundwater fluxes, favorable chemical conditions, backfill, and the rigid confines of the host rock. An added advantage of a mined granite repository is that waste packages would be fairly easy to retrieve, should retrievability be an important objective. The results of the safety analyses performed in this study are consistent with the results of comprehensive safety assessments performed for sites in Sweden, Finland, and Canada. They indicate that a granite repository would satisfy established safety criteria and suggest that a small number of FEPs would largely control the release and transport of radionuclides. In the event the U.S. decides to pursue a potential repository in granite, a detailed evaluation of these FEPs would be needed to inform site selection and safety assessment.

  3. Organizing the present, looking to the future: an online knowledge repository to facilitate collaboration.

    PubMed

    Burchill, C; Roos, L L; Fergusson, P; Jebamani, L; Turner, K; Dueck, S

    2000-01-01

    Comprehensive data available in the Canadian province of Manitoba since 1970 have aided study of the interaction between population health, health care utilization, and structural features of the health care system. Given a complex linked database and many ongoing projects, better organization of available epidemiological, institutional, and technical information was needed. The Manitoba Centre for Health Policy and Evaluation wished to develop a knowledge repository to handle data, document research methods, and facilitate both internal communication and collaboration with other sites. This evolving knowledge repository consists of both public and internal (restricted access) pages on the World Wide Web (WWW). Information can be accessed using an indexed logical format or queried to allow entry at user-defined points. The main topics are: Concept Dictionary, Research Definitions, Meta-Index, and Glossary. The Concept Dictionary operationalizes concepts used in health research using administrative data, outlining the creation of complex variables. Research Definitions specify the codes for common surgical procedures, tests, and diagnoses. The Meta-Index organizes concepts and definitions according to the Medical Subject Headings (MeSH) system developed by the National Library of Medicine. The Glossary facilitates navigation through the research terms and abbreviations in the knowledge repository. An Education Resources heading presents a web-based graduate course using substantial amounts of material in the Concept Dictionary, a lecture in the Epidemiology Supercourse, and material for Manitoba's Regional Health Authorities. Confidential information (including Data Dictionaries) is available on the Centre's internal website. Use of the public pages has increased dramatically since January 1998, with almost 6,000 page hits from 250 different hosts in May 1999. More recently, the number of page hits has averaged around 4,000 per month, while the number of unique hosts has climbed to around 400. This knowledge repository promotes standardization and increases efficiency by placing concepts and associated programming in the Centre's collective memory. Collaboration and project management are facilitated.

  4. Organizing the Present, Looking to the Future: An Online Knowledge Repository to Facilitate Collaboration

    PubMed Central

    Burchill, Charles; Fergusson, Patricia; Jebamani, Laurel; Turner, Ken; Dueck, Stephen

    2000-01-01

    Background Comprehensive data available in the Canadian province of Manitoba since 1970 have aided study of the interaction between population health, health care utilization, and structural features of the health care system. Given a complex linked database and many ongoing projects, better organization of available epidemiological, institutional, and technical information was needed. Objective The Manitoba Centre for Health Policy and Evaluation wished to develop a knowledge repository to handle data, document research methods, and facilitate both internal communication and collaboration with other sites. Methods This evolving knowledge repository consists of both public and internal (restricted access) pages on the World Wide Web (WWW). Information can be accessed using an indexed logical format or queried to allow entry at user-defined points. The main topics are: Concept Dictionary, Research Definitions, Meta-Index, and Glossary. The Concept Dictionary operationalizes concepts used in health research using administrative data, outlining the creation of complex variables. Research Definitions specify the codes for common surgical procedures, tests, and diagnoses. The Meta-Index organizes concepts and definitions according to the Medical Subject Headings (MeSH) system developed by the National Library of Medicine. The Glossary facilitates navigation through the research terms and abbreviations in the knowledge repository. An Education Resources heading presents a web-based graduate course using substantial amounts of material in the Concept Dictionary, a lecture in the Epidemiology Supercourse, and material for Manitoba's Regional Health Authorities. Confidential information (including Data Dictionaries) is available on the Centre's internal website. Results Use of the public pages has increased dramatically since January 1998, with almost 6,000 page hits from 250 different hosts in May 1999. More recently, the number of page hits has averaged around 4,000 per month, while the number of unique hosts has climbed to around 400. Conclusions This knowledge repository promotes standardization and increases efficiency by placing concepts and associated programming in the Centre's collective memory. Collaboration and project management are facilitated. PMID:11720929
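    The Concept Dictionary and MeSH-based Meta-Index described in the record above can be sketched as a small data model. This is a minimal illustration only; the class and field names are assumptions for the sketch, not the Centre's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Concept:
    """One concept-dictionary entry: an operationalized research concept."""
    name: str
    definition: str
    mesh_headings: list = field(default_factory=list)  # MeSH terms for the meta-index
    codes: list = field(default_factory=list)          # e.g., procedure/diagnosis codes

class KnowledgeRepository:
    """Toy registry supporting lookup by concept name and by MeSH heading."""
    def __init__(self):
        self._by_name = {}
        self._by_mesh = {}

    def add(self, concept):
        # Index the concept by its name and under each MeSH heading it carries.
        self._by_name[concept.name] = concept
        for heading in concept.mesh_headings:
            self._by_mesh.setdefault(heading, []).append(concept)

    def by_name(self, name):
        return self._by_name.get(name)

    def by_mesh(self, heading):
        return self._by_mesh.get(heading, [])

repo = KnowledgeRepository()
repo.add(Concept(
    name="hysterectomy",
    definition="Surgical removal of the uterus, identified from hospital abstracts.",
    mesh_headings=["Hysterectomy"],
    codes=["ICD-9-CM 68.3-68.9"],
))
```

    A lookup such as `repo.by_mesh("Hysterectomy")` then returns every concept filed under that heading, which is the essence of the Meta-Index navigation the abstract describes.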

  5. Virtual patient repositories--a comparative analysis.

    PubMed

    Küfner, Julia; Kononowicz, Andrzej A; Hege, Inga

    2014-01-01

    Virtual Patients (VPs) are an important component of medical education. One way to reduce the costs of creating VPs is sharing them through repositories. We conducted a literature review to identify existing repositories and analyzed the 17 included repositories with regard to the search functions and metadata they provide. Most repositories provided some metadata, such as title or description, whereas other data, such as educational objectives, were less frequent. Future research could, in cooperation with the repository providers, investigate user expectations and usage patterns.

  6. Learning Object Repositories

    ERIC Educational Resources Information Center

    Lehman, Rosemary

    2007-01-01

    This chapter looks at the development and nature of learning objects, meta-tagging standards and taxonomies, learning object repositories, learning object repository characteristics, and types of learning object repositories, with type examples. (Contains 1 table.)

  7. Copper Corrosion in Nuclear Waste Disposal: A Swedish Case Study on Stakeholder Insight

    ERIC Educational Resources Information Center

    Andersson, Kjell

    2013-01-01

    The article describes the founding principles, work program, and accomplishments of a Reference Group with both expert and layperson stakeholders for the corrosion of copper canisters in a proposed deep repository in Sweden for spent nuclear fuel. The article sets the Reference Group as a participatory effort within a broader context of…

  8. Streaming the Archives: Repurposing Systems to Advance a Small Media Digitization and Dissemination Program

    ERIC Educational Resources Information Center

    Anderson, Talea

    2015-01-01

    In 2013-2014, Brooks Library at Central Washington University (CWU) launched library content in three systems: a digital asset-management system, an institutional repository (IR), and a web-based discovery layer. In early 2014, the archives at the library began to use these systems to disseminate media recently digitized from legacy formats. As…

  9. Library and Archival Security: Policies and Procedures To Protect Holdings from Theft and Damage.

    ERIC Educational Resources Information Center

    Trinkaus-Randall, Gregor

    1998-01-01

    Firm policies and procedures that address the environment, patron/staff behavior, general attitude, and care and handling of materials need to be at the core of the library/archival security program. Discussion includes evaluating a repository's security needs, collections security, security in non-public areas, security in the reading room,…

  10. The South Florida Avocado Breeding Program at USDA-Agricultural Research Service Subtropical Horticulture Research Station (USDA-ARS SHRS)

    USDA-ARS?s Scientific Manuscript database

    USDA-ARS SHRS is part of the USDA National Germplasm Repository system and houses collections of tropical and subtropical fruit trees such as mango, lychee, and avocado. In addition to maintaining the germplasm collections, our mission is to also identify genetic diversity in the collections, to ev...

  11. Phase Stability Determinations of DWPF Waste Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marra, S.L.

    1999-10-22

    Liquid high-level nuclear waste will be immobilized at the Savannah River Site (SRS) by vitrification in borosilicate glass. To assess the phase stability of these glasses, samples were heat treated at various times and temperatures. These results will provide guidance to the repository program about conditions to be avoided during shipping, handling, and storage of DWPF canistered waste forms.

  12. ScrubChem: Cleaning of PubChem Bioassay Data to Create Diverse and Massive Bioactivity Datasets for Use in Modeling Applications (SOT)

    EPA Science Inventory

    The PubChem Bioassay database is a non-curated public repository with bioactivity data from 64 sources, including: ChEMBL, BindingDb, DrugBank, Tox21, NIH Molecular Libraries Screening Program, and various academic, government, and industrial contributors. However, this data is d...

  13. Transportation plan repository and archive.

    DOT National Transportation Integrated Search

    2011-04-01

    This project created a repository and archive for transportation planning documents in Texas within the established Texas A&M Repository (http://digital.library.tamu.edu). This transportation planning archive and repository provides ready access ...

  14. Hydrologic and geologic characteristics of the Yucca Mountain site relevant to the performance of a potential repository: Day 1, Las Vegas, Nevada to Pahrump, Nevada: Stop 6A. Keane Wonder Spring and regional groundwater flow in the Death Valley region

    USGS Publications Warehouse

    Steinkampf, W.C.

    2000-01-01

    Yucca Mountain, located ~100 mi northwest of Las Vegas, Nevada, has been designated by Congress as a site to be characterized for a potential mined geologic repository for high-level radioactive waste. This field trip will examine the regional geologic and hydrologic setting for Yucca Mountain, as well as specific results of the site characterization program. The first day focuses on the regional setting, with emphasis on current and paleohydrology, both of which are of critical concern for predicting future performance of a potential repository. Morning stops will be in southern Nevada and afternoon stops will be in Death Valley. The second day will be spent at Yucca Mountain. The field trip will visit the underground testing sites in the "Exploratory Studies Facility" and the "Busted Butte Unsaturated Zone Transport Field Test," plus several surface-based testing sites. Much of the work at the site has concentrated on studies of the unsaturated zone, an element of the hydrologic system that historically has received little attention. Discussions during the second day will comprise selected topics of Yucca Mountain geology, including seismic hazard in the Yucca Mountain area. Evening discussions will address modeling of regional groundwater flow and the relevance of the geology and hydrology of Yucca Mountain to the performance of a potential repository. Day 3 will examine the geologic framework and hydrology of the Pahute Mesa-Oasis Valley Groundwater Basin and then will continue to Reno via Hawthorne, Nevada, and the Walker Lake area.

  15. Software Hardware Asset Reuse Enterprise (SHARE) Repository Framework: Related Work and Development Plan

    DTIC Science & Technology

    2009-08-19

    ... designed to collect the data and assist the analyst in drawing relationships between the data. Palantir Technologies has created one such software application to support the DoD intelligence community by providing robust capabilities for managing data from various sources. The Palantir tool (www.palantirtech.com/) ... [Figure 17: Palantir Graphical Interface (Gordon-Schlosberg, 2008)] Similar examples of the use of ontologies to support data ...

  16. Traumatic Brain Injury Diffusion Magnetic Resonance Imaging Research Roadmap Development Project

    DTIC Science & Technology

    2011-10-01

    ... promising technology on the horizon is Diffusion Tensor Imaging (DTI). Diffusion tensor imaging (DTI) is a magnetic resonance imaging (MRI)-based ... in the brain. The potential for DTI to improve our understanding of TBI has not been fully explored, and challenges associated with non-existent ... processing tools, quality control standards, and a shared image repository. The recommendations will be disseminated and pilot tested. A DTI of TBI ...

  17. Extensible Probabilistic Repository Technology (XPRT)

    DTIC Science & Technology

    2004-10-01

    ... projects, such as Centaurus, Evidence Data Base (EDB), etc.; others were fabricated, such as INS and FED, while others contain data from the open ... [table residue: Google, Web Report, unlimited, SOAP API; News, BBC News, unlimited, Web RSS 1.0; Centaurus, Person Demographics, 204,402 people from 240 countries] ... objects of the domain ontology map to the various simulated data sources. For example, the PersonDemographics are stored in the Centaurus database, while ...

  18. Review of the MDF-LSA 100 Spray Decontamination System

    DTIC Science & Technology

    2011-12-01

    ... decontamination technology. In October 2000, SNL received funding from the U.S. Department of Energy's and National Nuclear Security Administration's ... The MDF-LSA 200 is supplied or created as a foam, liquid, or aerosol. The foam can be sprayed from handheld canisters. When the foam ...

  19. Data Stream Mining

    NASA Astrophysics Data System (ADS)

    Gaber, Mohamed Medhat; Zaslavsky, Arkady; Krishnaswamy, Shonali

    Data mining is concerned with the process of computationally extracting hidden knowledge structures represented in models and patterns from large data repositories. It is an interdisciplinary field of study that has its roots in databases, statistics, machine learning, and data visualization. Data mining has emerged as a direct outcome of the data explosion that resulted from the success in database and data warehousing technologies over the past two decades (Fayyad, 1997; Fayyad, 1998; Kantardzic, 2003).
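    A defining constraint of stream mining, as opposed to mining a static repository, is extracting patterns in one pass under bounded memory. As a concrete illustration, here is a minimal Misra-Gries frequent-items sketch; this is a generic textbook technique offered as an example, not code from the chapter itself.

```python
class MisraGries:
    """Approximate heavy-hitters over a stream using at most k-1 counters.

    Guarantee: any item occurring more than n/k times in a stream of
    length n is retained as a candidate.
    """
    def __init__(self, k):
        self.k = k
        self.counters = {}

    def update(self, item):
        if item in self.counters:
            self.counters[item] += 1
        elif len(self.counters) < self.k - 1:
            self.counters[item] = 1
        else:
            # No free counter: decrement all, dropping any that hit zero.
            for key in list(self.counters):
                self.counters[key] -= 1
                if self.counters[key] == 0:
                    del self.counters[key]

    def candidates(self):
        return set(self.counters)

# One pass over a stream where "a" dominates (10 of 12 items).
stream = ["a"] * 10 + ["b", "c"]
mg = MisraGries(k=2)
for item in stream:
    mg.update(item)
```

    After the pass, `mg.candidates()` contains only `"a"`: memory stays fixed at k-1 counters regardless of stream length, which is exactly the trade-off (approximate answers for bounded space) that distinguishes stream mining from repository mining.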

  20. NCBI GEO: archive for functional genomics data sets—10 years on

    PubMed Central

    Barrett, Tanya; Troup, Dennis B.; Wilhite, Stephen E.; Ledoux, Pierre; Evangelista, Carlos; Kim, Irene F.; Tomashevsky, Maxim; Marshall, Kimberly A.; Phillippy, Katherine H.; Sherman, Patti M.; Muertter, Rolf N.; Holko, Michelle; Ayanbule, Oluwabukunmi; Yefanov, Andrey; Soboleva, Alexandra

    2011-01-01

    A decade ago, the Gene Expression Omnibus (GEO) database was established at the National Center for Biotechnology Information (NCBI). The original objective of GEO was to serve as a public repository for high-throughput gene expression data generated mostly by microarray technology. However, the research community quickly applied microarrays to non-gene-expression studies, including examination of genome copy number variation and genome-wide profiling of DNA-binding proteins. Because the GEO database was designed with a flexible structure, it was possible to quickly adapt the repository to store these data types. More recently, as the microarray community switches to next-generation sequencing technologies, GEO has again adapted to host these data sets. Today, GEO stores over 20,000 microarray- and sequence-based functional genomics studies, and continues to handle the majority of direct high-throughput data submissions from the research community. Multiple mechanisms are provided to help users effectively search, browse, download and visualize the data at the level of individual genes or entire studies. This paper describes recent database enhancements, including new search and data representation tools, as well as a brief review of how the community uses GEO data. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/. PMID:21097893
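    One of the search mechanisms GEO provides is programmatic access through NCBI's Entrez E-utilities, where GEO records live in the `gds` database. A minimal sketch that only constructs an esearch query URL (no network call); the search term and field qualifier shown are illustrative, not taken from the paper:

```python
from urllib.parse import urlencode

# NCBI E-utilities esearch endpoint.
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def geo_search_url(term: str, retmax: int = 20) -> str:
    """Build an E-utilities esearch URL against the GEO DataSets (gds) database."""
    params = {
        "db": "gds",        # GEO DataSets Entrez database
        "term": term,       # Entrez query string
        "retmax": retmax,   # maximum number of UIDs to return
        "retmode": "json",
    }
    return f"{ESEARCH}?{urlencode(params)}"

# Illustrative query: GEO records matching a free-text term.
url = geo_search_url("breast cancer")
```

    Fetching the resulting URL (e.g., with `urllib.request`) returns JSON listing the UIDs of matching GEO records, which can then be retrieved individually via esummary or efetch.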
