Sample records for archival information system

  1. Archival Information Management System.

    DTIC Science & Technology

    1995-02-01

    management system named Archival Information Management System (AIMS), designed to meet the audit trail requirement for studies completed under the...are to be archived to the extent that future reproducibility and interrogation of results will exist. This report presents a prototype information

  2. The function of the earth observing system - Data information system Distributed Active Archive Centers

    NASA Technical Reports Server (NTRS)

    Lapenta, C. C.

    1992-01-01

    The functionality of the Distributed Active Archive Centers (DAACs), which are significant elements of the Earth Observing System Data and Information System (EOSDIS), is discussed. Each DAAC encompasses the information management system, the data archival and distribution system, and the product generation system. The EOSDIS DAACs are expected to improve access to the Earth science data sets needed for global change research.

  3. Using and Distributing Spaceflight Data: The Johnson Space Center Life Sciences Data Archive

    NASA Technical Reports Server (NTRS)

    Cardenas, J. A.; Buckey, J. C.; Turner, J. N.; White, T. S.; Havelka, J. A.

    1995-01-01

    Life sciences data collected before, during and after spaceflight are valuable, often irreplaceable, and frequently hard to find. The Johnson Space Center Life Sciences Data Archive has been designed to provide researchers, engineers, managers and educators interactive access to information about and data from human spaceflight experiments. The archive system consists of a Data Acquisition System, Database Management System, CD-ROM Mastering System and Catalog Information System (CIS). The catalog information system is the heart of the archive. The CIS provides detailed experiment descriptions (both written and as QuickTime movies), hardware descriptions, hardware images, documents, and data. An initial evaluation of the archive at a scientific meeting showed that 88% of those who evaluated the catalog want to use the system when completed. The majority of the evaluators found the archive flexible, satisfying and easy to use. We conclude that the data archive effectively provides key life sciences data to interested users.

  4. Buckets: Smart Objects for Digital Libraries

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.

    2001-01-01

    Current discussion of digital libraries (DLs) is often dominated by the merits of the respective storage, search and retrieval functionality of archives, repositories, search engines, search interfaces and database systems. While these technologies are necessary for information management, the information content is more important than the systems used for its storage and retrieval. Digital information should have the same long-term survivability prospects as traditional hardcopy information and should be protected to the extent possible from evolving search engine technologies and vendor vagaries in database management systems. Information content and information retrieval systems should progress on independent paths and make limited assumptions about the status or capabilities of the other. Digital information can achieve independence from archives and DL systems through the use of buckets. Buckets are an aggregative, intelligent construct for publishing in DLs. Buckets allow the decoupling of information content from information storage and retrieval. Buckets exist within the Smart Objects and Dumb Archives model for DLs in that many of the functionalities and responsibilities traditionally associated with archives are pushed down (making the archives dumber) into the buckets (making them smarter). Some of the responsibilities imbued to buckets are the enforcement of their terms and conditions, and maintenance and display of their contents.
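    As a rough, hypothetical sketch of the bucket idea described above (the class, method names, and terms string are illustrative assumptions, not the actual bucket implementation), a bucket can be pictured as an object that aggregates its own content and enforces its own terms and conditions:

```python
# Hypothetical sketch of a "smart object" bucket: content and policy travel
# together, independent of any particular archive or digital library system.
from dataclasses import dataclass, field


@dataclass
class Bucket:
    identifier: str
    elements: dict = field(default_factory=dict)  # element name -> payload
    terms: str = "open access"                    # terms and conditions text

    def add(self, name: str, payload) -> None:
        """Aggregate a new content element into the bucket."""
        self.elements[name] = payload

    def display(self, name: str, accepted_terms: bool):
        """The bucket itself, not the archive, enforces terms and conditions."""
        if not accepted_terms:
            raise PermissionError(f"Terms not accepted: {self.terms}")
        return self.elements[name]


# A "dumb archive" only needs to store and return whole buckets.
bucket = Bucket("report-2001-17")
bucket.add("manuscript.pdf", b"...")
print(bucket.display("manuscript.pdf", accepted_terms=True))
```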

  5. Improvements in Space Geodesy Data Discovery at the CDDIS

    NASA Technical Reports Server (NTRS)

    Noll, C.; Pollack, N.; Michael, P.

    2011-01-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data products in a central data bank, to maintain information about the archival of these data, and to disseminate these data and information in a timely manner to a global scientific research community. The archive consists of GNSS, laser ranging, VLBI, and DORIS data sets and products derived from these data. The CDDIS is one of NASA's Earth Observing System Data and Information System (EOSDIS) distributed data centers; EOSDIS data centers serve a diverse user community and are tasked to provide facilities to search and access science data and products. Several activities are currently under development at the CDDIS to aid users in data discovery, both within the current community and beyond. The CDDIS is cooperating in the development of Geodetic Seamless Archive Centers (GSAC) with colleagues at UNAVCO and SIO. This activity will provide web services to facilitate data discovery within and across participating archives. In addition, the CDDIS is currently implementing modifications to the metadata extracted from incoming data and product files pushed to its archive. These enhancements will permit information about CDDIS archive holdings to be made available through other data portals such as the Earth Observing System (EOS) Clearinghouse (ECHO) and integration into the Global Geodetic Observing System (GGOS) portal.

  6. Technical Concept Document. Central Archive for Reusable Defense Software (CARDS)

    DTIC Science & Technology

    1994-02-28

    February 1994 INFORMAL TECHNICAL REPORT For The SOFTWARE TECHNOLOGY FOR ADAPTABLE, RELIABLE SYSTEMS (STARS) Technical Concept Document Central Archive for...February 1994 INFORMAL TECHNICAL REPORT For The SOFTWARE TECHNOLOGY FOR ADAPTABLE, RELIABLE SYSTEMS (STARS) Technical Concept Document Central Archive...accordance with the DFARS Special Works Clause Developed by: This document, developed under the Software Technology for Adaptable, Reliable Systems

  7. NASA CDDIS: Next Generation System

    NASA Astrophysics Data System (ADS)

    Michael, B. P.; Noll, C. E.; Woo, J. Y.; Limbacher, R. I.

    2017-12-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to make space geodesy and geodynamics related data and derived products available in a central archive, to maintain information about the archival of these data, to disseminate these data and information in a timely manner to a global scientific research community, and to provide user-based tools for the exploration and use of the archive. As the techniques and data volume have increased, the CDDIS has evolved to offer a broad range of data ingest services, including data upload, quality control, documentation, metadata extraction, and ancillary information. As a major step taken to improve services, the CDDIS has transitioned to a new hardware system and implemented incremental upgrades to a new software system to meet these goals while increasing automation. This new system increases the ability of the CDDIS to consistently track errors and issues associated with data and derived product files uploaded to the system and to perform post-ingest checks on all files received for the archive. In addition, software to process new data sets and changes to existing data sets has been implemented to handle new formats and any issues identified during the ingest process. In this poster, we will discuss the CDDIS archive in general as well as review and contrast the system structures and quality control measures employed before and after the system upgrade. We will also present information about new data sets and changes to existing data and derived products archived at the CDDIS.
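    The post-ingest checking described above can be illustrated, very loosely, with a sketch that verifies a received file against an expected checksum and logs any problem for follow-up; the specific check, file layout, and function name are assumptions for illustration, not CDDIS software:

```python
# Sketch of a post-ingest check: verify a received file against its expected
# checksum and log a tracked error if it does not match.
import hashlib
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")


def post_ingest_check(path: str, expected_sha256: str) -> bool:
    """Return True if the archived file matches the checksum supplied at upload."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    ok = digest.hexdigest() == expected_sha256
    if not ok:
        log.error("checksum mismatch for %s", path)  # issue recorded for follow-up
    return ok
```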

  8. Reference Model for an Open Archival Information System

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This document is a technical report for use in developing a consensus on what is required to operate a permanent, or indefinite long-term, archive of digital information. It may be useful as a starting point for a similar document addressing the indefinite long-term preservation of non-digital information. This report establishes a common framework of terms and concepts which comprise an Open Archival Information System (OAIS). It allows existing and future archives to be more meaningfully compared and contrasted. It provides a basis for further standardization within an archival context, and it should promote greater vendor awareness of, and support for, archival requirements. Through the process of normal evolution, it is expected that expansion, deletion, or modification to this document may occur. This report is therefore subject to CCSDS document management and change control procedures.

  9. Integration, acceptance testing, and clinical operation of the Medical Information, Communication and Archive System, phase II.

    PubMed

    Smith, E M; Wandtke, J; Robinson, A

    1999-05-01

    The Medical Information, Communication and Archive System (MICAS) is a multivendor incremental approach to picture archiving and communications system (PACS). It is a multimodality integrated image management system that is seamlessly integrated with the radiology information system (RIS). Phase II enhancements of MICAS include a permanent archive, automated workflow, study caches, and Microsoft (Redmond, WA) Windows NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. MICAS is designed as an enterprise-wide PACS to provide images and reports throughout the Strong Health healthcare network. Phase II includes the addition of a Cemax-Icon (Fremont, CA) archive, PACS broker (Mitra, Waterloo, Canada), an interface (IDX PACSlink, Burlington, VT) to the RIS (IDXrad) plus the conversion of the UNIX-based redundant array of inexpensive disks (RAID) 5 temporary archives in phase I to NT-based RAID 0 DICOM modality-specific study caches (ImageLabs, Bedford, MA). The phase I acquisition engines and workflow management software were uninstalled and the Cemax archive manager (AM) assumed these functions. The existing ImageLabs UNIX-based viewing software was enhanced and converted to an NT-based DICOM viewer. Installation of phase II hardware and software and integration with existing components began in July 1998. Phase II of MICAS demonstrates that a multivendor open-system incremental approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.
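    For readers unfamiliar with DICOM, a minimal sketch of reading basic study metadata from a DICOM file with the pydicom library is shown below; pydicom and the placeholder file name are illustrative only and are not part of MICAS:

```python
# Minimal sketch: read patient, date, and modality attributes from a DICOM file.
# "study.dcm" is a placeholder path, not a MICAS file.
import pydicom

ds = pydicom.dcmread("study.dcm")
print(ds.PatientID, ds.StudyDate, ds.Modality)  # standard DICOM data elements
```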

  10. The Role of Archives and Records Management in National Information Systems: A RAMP Study.

    ERIC Educational Resources Information Center

    Rhoads, James B.

    Produced as part of the United Nations Educational, Scientific, and Cultural Organization (UNESCO) Records and Archives Management Programme (RAMP), this publication provides information about the essential character and value of archives and about the procedures and programs that should govern the management of both archives and current records,…

  11. Strategy of Planetary Data Archives in Japanese Missions for Planetary Data System

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Murakami, S. Y.

    2017-12-01

    To preserve data acquired by Japanese planetary explorations for a long time, we need a data archiving strategy in a form suited to available resources. The Planetary Data System (PDS) developed by NASA is an excellent system for saving data over a long period. Especially with the current version 4 (PDS4), it is possible to create a highly complete data archive using information technology. Historically, the Japanese planetary missions have archived data by scientists in their own ways, but in the past decade JAXA has been aiming to conform data to PDS with long-term preservation in mind. Hayabusa and Akatsuki data are archived in PDS3. Kaguya (SELENE) data have been newly converted from the original format to PDS3. Hayabusa2, BepiColombo, and future planetary explorations will release data in PDS4. The cooperation of engineers who are familiar with information technology is indispensable for creating data archives with scientists. In addition, experience, information sharing, and a system to support them are essential; establishing such a system remains a challenge in Japan.

  12. CDDIS: NASA's Archive of Space Geodesy Data and Products Supporting GGOS

    NASA Technical Reports Server (NTRS)

    Noll, Carey; Michael, Patrick

    2016-01-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data and products in a central archive, to maintain information about the archival of these data, to disseminate these data and information in a timely manner to a global scientific research community, and to provide user-based tools for the exploration and use of the archive. The CDDIS data system and its archive is a key component in several of the geometric services within the International Association of Geodesy (IAG) and its observing system, the Global Geodetic Observing System (GGOS), including the IGS, the International DORIS Service (IDS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International Earth Rotation and Reference Systems Service (IERS). The CDDIS provides on-line access to over 17 Tbytes of data and derived products in support of the IAG services and GGOS. The system's archive continues to grow and improve as new activities are supported and enhancements are implemented. Recently, the CDDIS has established a real-time streaming capability for GNSS data and products. Furthermore, enhancements to metadata describing the contents of the archive have been developed to facilitate data discovery. This poster will provide a review of the improvements in the system infrastructure that CDDIS has made over the past year for the geodetic community and describe future plans for the system.

  13. Charting the Course: Life Cycle Management of Mars Mission Digital Information

    NASA Technical Reports Server (NTRS)

    Reiz, Julie M.

    2003-01-01

    This viewgraph presentation reviews the life cycle management of MER Project information, a process that was an essential key to the successful launch of the MER Project rovers. Incorporating digital information archive requirements early in the project life cycle resulted in: design of an information system that included archive metadata; reduced risk of information loss through in-process appraisal; easier transfer of project information to the institutional online archive; and project appreciation for preserving information for reuse by future projects.

  14. MODIS land data at the EROS data center DAAC

    USGS Publications Warehouse

    Jenkerson, Calli B.; Reed, B.C.

    2001-01-01

    The US Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center (EDC) in Sioux Falls, SD, USA, is the primary national archive for land processes data and one of the National Aeronautics and Space Administration's (NASA) Distributed Active Archive Centers (DAAC) for the Earth Observing System (EOS). One of EDC's functions as a DAAC is the archival and distribution of Moderate Resolution Imaging Spectroradiometer (MODIS) Land Data collected from the EOS satellite Terra. More than 500,000 publicly available MODIS land data granules totaling 25 terabytes (TB) are currently stored in the EDC archive. This collection is managed, archived, and distributed by the EOS Data and Information System (EOSDIS) Core System (ECS) at EDC. EDC User Services support the use of MODIS Land data, which include land surface reflectance/albedo, temperature/emissivity, vegetation characteristics, and land cover, by responding to user inquiries, constructing user information sites on the EDC web page, and presenting MODIS materials worldwide.

  15. Expert Systems Technology and Its Implication for Archives. National Archives Technical Information Paper No. 9.

    ERIC Educational Resources Information Center

    Michelson, Avra

    This report introduces archivists to the potential of expert systems for improving archives administration and alerts them to ways in which they can expect intelligent technologies to impact federal record-keeping systems and scholarly research methods. The report introduces the topic by describing expert systems used in three Fortune 500…

  16. Kellogg Library and Archive Retrieval System (KLARS) Document Capture Manual. Draft Version.

    ERIC Educational Resources Information Center

    Hugo, Jane

    This manual is designed to supply background information for Kellogg Library and Archive Retrieval System (KLARS) processors and others who might work with the system, outline detailed policies and procedures for processors who prepare and enter data into the adult education database on KLARS, and inform general readers about the system. KLARS is…

  17. EOSDIS: Archive and Distribution Systems in the Year 2000

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Lake, Alla

    2000-01-01

    Earth Science Enterprise (ESE) is a long-term NASA research mission to study the processes leading to global climate change. The Earth Observing System (EOS) is a NASA campaign of satellite observatories that are a major component of ESE. The EOS Data and Information System (EOSDIS) is another component of ESE that will provide the Earth science community with easy, affordable, and reliable access to Earth science data. EOSDIS is a distributed system, with major facilities at seven Distributed Active Archive Centers (DAACs) located throughout the United States. The EOSDIS software architecture is being designed to receive, process, and archive several terabytes of science data on a daily basis. Thousands of science users and perhaps several hundred thousand non-science users are expected to access the system. The first major set of data to be archived in the EOSDIS is from Landsat-7. Another EOS satellite, Terra, was launched on December 18, 1999. With the Terra launch, the EOSDIS will be required to support approximately one terabyte of data into and out of the archives per day. Since EOS is a multi-mission program, including the launch of more satellites and many other missions, the role of the archive systems becomes larger and more critical. In 1995, at the fourth convening of the NASA Mass Storage Systems and Technologies Conference, the development plans for the EOSDIS information system and archive were described. Five years later, many changes have occurred in the effort to field an operational system. It is interesting to reflect on some of the changes driving the archive technology and system development for EOSDIS. This paper principally describes the Data Server subsystem, including how the other subsystems access the archive, the nature of the data repository, and the mass-storage I/O management. The paper reviews the system architecture (both hardware and software) of the basic components of the archive. It discusses the operations concept, code development, and testing phase of the system. Finally, it describes the future plans for the archive.

  18. Superfund Public Information System (SPIS), January 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1999-01-01

    The Superfund Public Information System (SPIS) on CD-ROM contains Superfund data for the United States Environmental Protection Agency. The Superfund data is a collection of three databases: Records of Decision (RODS); Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS); and Archive (NFRAP). Descriptions of these databases and CD contents are listed below. Data content: The CD contains the complete text of the official ROD documents signed and issued by EPA from fiscal years 1982--1996; 147 RODs for fiscal year 1997; and seven RODs for fiscal year 1998. The CD also contains 89 Explanation of Significant Difference (ESD) documents, as well as 48 ROD Amendments. CERCLIS and Archive (NFRAP) data is through January 19, 1999. RODS is the Records Of Decision System. RODS is used to track site clean-ups under the Superfund program to justify the type of treatment chosen at each site. RODS contains information on technology justification, site history, community participation, enforcement activities, site characteristics, scope and role of response action, and remedy. Explanation of Significant Differences (ESDs) are also available on the CD. CERCLIS is the Comprehensive Environmental Response, Compensation, and Liability Information System. It is the official repository for all Superfund site and incident data. It contains comprehensive information on hazardous waste sites, site inspections, preliminary assessments, and remedial status. The system is sponsored by the EPA's Office of Emergency and Remedial Response, Information Management Center. Archive (NFRAP) consists of hazardous waste sites that have no further remedial action planned; only basic identifying information is provided for archive sites. The sites found in the Archive database were originally in the CERCLIS database, but were removed beginning in the fall of 1995.

  19. Building a DAM To Last: Archiving Digital Assets.

    ERIC Educational Resources Information Center

    Zeichick, Alan

    2003-01-01

    Discusses archiving digital information and the need for organizations to develop policies regarding digital asset management (DAM) and storage. Topics include determining the value of digital assets; formats of digital information; use of stored information; and system architecture, including hardware and asset management software. (LRW)

  20. [A new concept for integration of image databanks into a comprehensive patient documentation].

    PubMed

    Schöll, E; Holm, J; Eggli, S

    2001-05-01

    Image processing and archiving are of increasing importance in the practice of modern medicine. Particularly due to the introduction of computer-based investigation methods, physicians are dealing with a wide variety of analogue and digital picture archives. On the other hand, clinical information is stored in various text-based information systems without integration of image components. The link between such traditional medical databases and picture archives is a prerequisite for efficient data management as well as for continuous quality control and medical education. At the Department of Orthopedic Surgery, University of Berne, a software program was developed to create a complete multimedia electronic patient record. The client-server system contains all patients' data, questionnaire-based quality control, and a digital picture archive. Different interfaces guarantee the integration into the hospital's data network. This article describes our experiences in the development and introduction of a comprehensive image archiving system at a large orthopedic center.

  1. A data and information system for processing, archival, and distribution of data for global change research

    NASA Technical Reports Server (NTRS)

    Graves, Sara J.

    1994-01-01

    Work on this project was focused on information management techniques for Marshall Space Flight Center's EOSDIS Version 0 Distributed Active Archive Center (DAAC). The centerpiece of this effort has been participation in EOSDIS catalog interoperability research, the result of which is a distributed Information Management System (IMS) allowing the user to query the inventories of all the DAACs from a single user interface. UAH has provided the MSFC DAAC database server for the distributed IMS, and has contributed to the definition and development of the browse image display capabilities in the system's user interface. Another important area of research has been in generating value-based metadata through data mining. In addition, information management applications for local inventory and archive management, and for tracking data orders were provided.

  2. Intelligent Systems Technologies and Utilization of Earth Observation Data

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.; McConaughy, G. R.; Morse, H. S.

    2004-01-01

    The addition of raw data and derived geophysical parameters from several Earth observing satellites over the last decade to the data held by NASA data centers has created a data-rich environment for the Earth science research and applications communities. The data products are being distributed to a large and diverse community of users. Due to advances in computational hardware, networks and communications, information management and software technologies, significant progress has been made in the last decade in archiving and providing data to users. However, to realize the full potential of the growing data archives, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system (KBS). Potential Intelligent Archive concepts include: 1) Mining archived data holdings to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services; 3) Recognizing the value of results, indexing and formatting them for easy access; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building; and 5) Being aware of other nodes in the KBS, participating in open systems interfaces and protocols for virtualization, and achieving collaborative interoperability.

  3. Archiving Space Geodesy Data for 20+ Years at the CDDIS

    NASA Technical Reports Server (NTRS)

    Noll, Carey E.; Dube, M. P.

    2004-01-01

    Since 1982, the Crustal Dynamics Data Information System (CDDIS) has supported the archive and distribution of geodetic data products acquired by NASA programs. These data include GPS (Global Positioning System), GLONASS (GLObal NAvigation Satellite System), SLR (Satellite Laser Ranging), VLBI (Very Long Baseline Interferometry), and DORIS (Doppler Orbitography and Radiolocation Integrated by Satellite). The data archive supports NASA's space geodesy activities through the Solid Earth and Natural Hazards (SENH) program. The CDDIS data system and its archive have become increasingly important to many national and international programs, particularly several of the operational services within the International Association of Geodesy (IAG), including the International GPS Service (IGS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), the International DORIS Service (IDS), and the International Earth Rotation Service (IERS). The CDDIS provides easy and ready access to a variety of data sets, products, and information about these data. The specialized nature of the CDDIS lends itself well to enhancement and thus can accommodate diverse data sets and user requirements. All data sets and metadata extracted from these data sets are accessible to scientists through ftp and the web; general information about each data set is accessible via the web. The CDDIS, including background information about the system and its user communities, the computer architecture, archive contents, available metadata, and future plans will be discussed.

  4. International Reader in the Management of Library, Information and Archive Services.

    ERIC Educational Resources Information Center

    Vaughan, Anthony, Comp.

    Compiled for the benefit of library, archive, and information science schools for use in their information services and systems management courses, this reader is not meant to replace the already existing standard textbooks on this subject, but to provide a more international perspective than textbooks written with the information services of one…

  5. Building A Cloud Based Distributed Active Data Archive Center

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Baynes, Katie; Murphy, Kevin

    2017-01-01

    NASA's Earth Science Data System (ESDS) Program facilitates the implementation of NASA's Earth Science strategic plan, which is committed to the full and open sharing of Earth science data obtained from NASA instruments to all users. The Earth Science Data and Information System (ESDIS) project manages the Earth Observing System Data and Information System (EOSDIS). Data within EOSDIS are held at Distributed Active Archive Centers (DAACs). One of the key responsibilities of the ESDS Program is to continuously evolve the entire data and information system to maximize returns on the collected NASA data.

  6. Evolution of Information Management at the GSFC Earth Sciences (GES) Data and Information Services Center (DISC): 2006-2007

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Lynnes, Christopher; Vollmer, Bruce; Alcott, Gary; Berrick, Stephen

    2009-01-01

    Increasingly sophisticated National Aeronautics and Space Administration (NASA) Earth science missions have driven their associated data and data management systems from providing simple point-to-point archiving and retrieval to performing user-responsive distributed multisensor information extraction. To fully maximize the use of remote-sensor-generated Earth science data, NASA recognized the need for data systems that provide data access and manipulation capabilities responsive to research brought forth by advancing scientific analysis and the need to maximize the use and usability of the data. The decision by NASA to purposely evolve the Earth Observing System Data and Information System (EOSDIS) at the Goddard Space Flight Center (GSFC) Earth Sciences (GES) Data and Information Services Center (DISC) and other information management facilities was timely and appropriate. The GES DISC evolution was focused on replacing the EOSDIS Core System (ECS) by reusing the in-house developed disk-based Simple, Scalable, Script-based Science Product Archive (S4PA) data management system and migrating data to the disk archives. Transition was completed in December 2007.

  7. Archival Services and the Concept of the User: A RAMP Study.

    ERIC Educational Resources Information Center

    Taylor, Hugh A.

    Prepared under contract with the International Council on Archives (ICA), this study is intended to assist archivists and information specialists in creating, developing, and evaluating modern archival systems and services, particularly with reference to the concept and the role of the user in such systems and services. It ranges over a wide field…

  8. Medical information, communication, and archiving system (MICAS): Phase II integration and acceptance testing

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wandtke, John; Robinson, Arvin E.

    1999-07-01

    The Medical Information, Communication and Archive System (MICAS) is a multi-modality integrated image management system that is seamlessly integrated with the Radiology Information System (RIS). This project was initiated in the summer of 1995 with the first phase being installed during the first half of 1997 and the second phase installed during the summer of 1998. Phase II enhancements include a permanent archive, automated workflow including modality worklist, study caches, NT diagnostic workstations with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. This multi-vendor phased approach to PACS implementation is designed as an enterprise-wide PACS to provide images and reports throughout our healthcare network. MICAS demonstrates that a multi-vendor open-system phased approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.

  9. Redundant array of independent disks: practical on-line archiving of nuclear medicine image data.

    PubMed

    Lear, J L; Pratt, J P; Trujillo, N

    1996-02-01

    While various methods for long-term archiving of nuclear medicine image data exist, none support rapid on-line search and retrieval of information. We assembled a 90-Gbyte redundant array of independent disks (RAID) system using ten 9-Gbyte disk drives. The system was connected to a personal computer and software was used to partition the array into 4-Gbyte sections. All studies (50,000) acquired over a 7-year period were archived in the system. Based on patient name/number and study date, information could be located within 20 seconds and retrieved for display and analysis in less than 5 seconds. RAID offers a practical, redundant method for long-term archiving of nuclear medicine studies that supports rapid on-line retrieval.
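    The rapid name/date lookup described above can be pictured with a simple in-memory index keyed by patient identifier and study date; the record fields and paths below are hypothetical, not the software actually used with the RAID array:

```python
# Sketch: index archived studies by (patient identifier, study date) so a study
# can be located quickly without scanning the whole archive.
from collections import defaultdict

index = defaultdict(list)  # (patient_id, study_date) -> list of file paths


def register(patient_id: str, study_date: str, path: str) -> None:
    """Record where a study is stored on the array."""
    index[(patient_id, study_date)].append(path)


def locate(patient_id: str, study_date: str) -> list:
    """Return the stored paths for a patient/date pair (empty list if none)."""
    return index.get((patient_id, study_date), [])


register("123456", "1995-03-14", "/archive/part3/123456_19950314.img")
print(locate("123456", "1995-03-14"))
```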

  10. NASDA's earth observation satellite data archive policy for the earth observation data and information system (EOIS)

    NASA Technical Reports Server (NTRS)

    Sobue, Shin-ichi; Yoshida, Fumiyoshi; Ochiai, Osamu

    1996-01-01

    NASDA's new Advanced Earth Observing Satellite (ADEOS) is scheduled for launch in August, 1996. ADEOS carries 8 sensors to observe earth environmental phenomena and sends their data to NASDA, NASA, and other foreign ground stations around the world. The downlink data bit rate for ADEOS is 126 MB/s and the total volume of data is about 100 GB per day. To archive and manage such a large quantity of data with high reliability and easy accessibility it was necessary to develop a new mass storage system with a catalogue information database using advanced database management technology. The data will be archived and maintained in the Master Data Storage Subsystem (MDSS) which is one subsystem in NASDA's new Earth Observation data and Information System (EOIS). The MDSS is based on a SONY ID1 digital tape robotics system. This paper provides an overview of the EOIS system, with a focus on the Master Data Storage Subsystem and the NASDA Earth Observation Center (EOC) archive policy for earth observation satellite data.

  11. Contents of the NASA ocean data system archive, version 11-90

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A. (Editor); Lassanyi, Ruby A. (Editor)

    1990-01-01

    The National Aeronautics and Space Administration (NASA) Ocean Data System (NODS) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea-surface height, surface-wind vector, sea-surface temperature, atmospheric liquid water, and surface pigment concentration. NODS will become the Data Archive and Distribution Service of the JPL Distributed Active Archive Center for the Earth Observing System Data and Information System (EOSDIS) and will be the United States distribution site for Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  12. The EOSDIS Version 0 Distributed Active Archive Center for physical oceanography and air-sea interaction

    NASA Technical Reports Server (NTRS)

    Hilland, Jeffrey E.; Collins, Donald J.; Nichols, David A.

    1991-01-01

    The Distributed Active Archive Center (DAAC) at the Jet Propulsion Laboratory will support scientists specializing in physical oceanography and air-sea interaction. As part of the NASA Earth Observing System Data and Information System Version 0, the DAAC will build on existing capabilities to provide services for data product generation, archiving, distribution and management of information about data. To meet scientists' immediate needs for data, existing data sets from missions such as Seasat, Geosat, the NOAA series of satellites and the Global Positioning System will be distributed to investigators upon request. In 1992, ocean topography, wave and surface roughness data from the Topex/Poseidon radar altimeter mission will be archived and distributed. New data products will be derived from Topex/Poseidon and other sensor systems based on recommendations of the science community. In 1995, ocean wind field measurements from the NASA Scatterometer will be supported by the DAAC.

  13. Defining the Core Archive Data Standards of the International Planetary Data Alliance (IPDA)

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Dan; Beebe, Reta; Guinness, Ed; Heather, David; Zender, Joe

    2007-01-01

    A goal of the International Planetary Data Alliance (IPDA) is to develop a set of archive data standards that enable the sharing of scientific data across international agencies and missions. To help achieve this goal, the IPDA steering committee initiated a six-month project to write requirements for and draft an information model based on the Planetary Data System (PDS) archive data standards. The project had a special emphasis on data formats. A set of use case scenarios were first developed from which a set of requirements were derived for the IPDA archive data standards. The special emphasis on data formats was addressed by identifying data formats that have been used by PDS nodes and other agencies in the creation of successful data sets for the Planetary Data System (PDS). The dependency of the IPDA information model on the PDS archive standards required the compilation of a formal specification of the archive standards currently in use by the PDS. An ontology modelling tool was chosen to capture the information model from various sources including the Planetary Science Data Dictionary [1] and the PDS Standards Reference [2]. Exports of the modelling information from the tool database were used to produce the information model document using an object-oriented notation for presenting the model. The tool exports can also be used for software development and are directly accessible by semantic web applications.

  14. Interactive access to LP DAAC satellite data archives through a combination of open-source and custom middleware web services

    USGS Publications Warehouse

    Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.

    2015-01-01

    Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to computer systems of data providers, and away from systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions instead of links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and sub-setting. Only the reduced content required to complete an analysis is then transferred to the user.
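    The kind of machine-driven, web-based query such middleware enables might look like the sketch below, which uses the Python requests library against a hypothetical endpoint; the URL, parameter names, and product name are placeholders, not the actual LP DAAC service interface:

```python
# Sketch: request a spatially and temporally subsetted, quality-filtered product
# from a (hypothetical) archive web service instead of downloading whole files.
import requests

params = {
    "product": "MOD13Q1",        # example product name
    "bbox": "-104,43,-103,44",   # spatial subset (west,south,east,north)
    "start": "2014-06-01",
    "end": "2014-06-30",
    "quality": "good",           # server-side quality filtering
}
resp = requests.get("https://example.org/subset", params=params, timeout=30)
resp.raise_for_status()
print(resp.json())               # only the reduced content is transferred
```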

  15. Data and information system requirements for Global Change Research

    NASA Technical Reports Server (NTRS)

    Skole, David L.; Chomentowski, Walter H.; Ding, Binbin; Moore, Berrien, III

    1992-01-01

    Efforts to develop local information systems for supporting interdisciplinary Global Change Research are described. A prototype system, the Interdisciplinary Science Data and Information System (IDS-DIS), designed to interface with the larger archive centers of EOSDIS, is presented. Particular attention is given to a data query information management system (IMS), which has been used to tabulate information on Landsat data worldwide. The use of these data in a modeling analysis of deforestation and carbon dioxide emissions is demonstrated. The development of distributed local information systems is considered to be complementary to the development of central data archives. Global Change Research under the EOS program is likely to result in a proliferation of data centers. It is concluded that a distributed system is a feasible and natural way to manage data and information for global change research.

  16. 77 FR 32141 - Privacy Act of 1974, as Amended; System of Records Notices

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-31

    ... records titled ``Internal Collaboration Network''. SUMMARY: The National Archives and Records... 43, the Internal Collaboration Network, which contains files with information on National Archives.... SUPPLEMENTARY INFORMATION: The Internal Collaboration Network is a web- based platform that allows users to...

  17. The NAS Computational Aerosciences Archive

    NASA Technical Reports Server (NTRS)

    Miceli, Kristina D.; Globus, Al; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    In order to further the state-of-the-art in computational aerosciences (CAS) technology, researchers must be able to gather and understand existing work in the field. One aspect of this information gathering is studying published work available in scientific journals and conference proceedings. However, current scientific publications are very limited in the type and amount of information that they can disseminate. Information is typically restricted to text, a few images, and a bibliography list. Additional information that might be useful to the researcher, such as additional visual results, referenced papers, and datasets, are not available. New forms of electronic publication, such as the World Wide Web (WWW), limit publication size only by available disk space and data transmission bandwidth, both of which are improving rapidly. The Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is in the process of creating an archive of CAS information on the WWW. This archive will be based on the large amount of information produced by researchers associated with the NAS facility. The archive will contain technical summaries and reports of research performed on NAS supercomputers, visual results (images, animations, visualization system scripts), datasets, and any other supporting meta-information. This information will be available via the WWW through the NAS homepage, located at http://www.nas.nasa.gov/, fully indexed for searching. The main components of the archive are technical summaries and reports, visual results, and datasets. Technical summaries are gathered every year by researchers who have been allotted resources on NAS supercomputers. These summaries, together with supporting visual results and references, are browsable by interested researchers. Referenced papers made available by researchers can be accessed through hypertext links. Technical reports are in-depth accounts of tools and applications research projects performed by NAS staff members and collaborators. Visual results, which may be available in the form of images, animations, and/or visualization scripts, are generated by researchers with respect to a certain research project, depicting dataset features that were determined important by the investigating researcher. For example, script files for visualization systems (e.g. FAST, PLOT3D, AVS) are provided to create visualizations on the user's local workstation to elucidate the key points of the numerical study. Users can then interact with the data starting where the investigator left off. Datasets are intended to give researchers an opportunity to understand previous work, 'mine' solutions for new information (for example, have you ever read a paper thinking "I wonder what the helicity density looks like?"), compare new techniques with older results, collaborate with remote colleagues, and perform validation. Supporting meta-information associated with the research projects is also important to provide additional context for research projects. This may include information such as the software used in the simulation (e.g. grid generators, flow solvers, visualization). In addition to serving the CAS research community, the information archive will also be helpful to students, visualization system developers and researchers, and management. Students (of any age) can use the data to study fluid dynamics, compare results from different flow solvers, learn about meshing techniques, etc., leading to better informed individuals. 
For these users it is particularly important that visualization be integrated into dataset archives. Visualization researchers can use dataset archives to test algorithms and techniques, leading to better visualization systems. Management can use the data to figure out what is really going on behind the viewgraphs. All users will benefit from fast, easy, and convenient access to CFD datasets. The CAS information archive hopes to serve as a useful resource to those interested in computational sciences. At present, only information that may be distributed internationally is made available via the archive. Studies are underway to determine security requirements and solutions to make additional information available. By providing access to the archive via the WWW, the process of information gathering can be more productive and fruitful due to ease of access and ability to manage many different types of information. As the archive grows, additional resources from outside NAS will be added, providing a dynamic source of research results.

  18. An Ontology-Based Archive Information Model for the Planetary Science Community

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris

    2008-01-01

    The Planetary Data System (PDS) information model is a mature but complex model that has been used to capture over 30 years of planetary science data for the PDS archive. As the de-facto information model for the planetary science data archive, it is being adopted by the International Planetary Data Alliance (IPDA) as their archive data standard. However, after seventeen years of evolutionary change the model needs refinement. First a formal specification is needed to explicitly capture the model in a commonly accepted data engineering notation. Second, the core and essential elements of the model need to be identified to help simplify the overall archive process. A team of PDS technical staff members have captured the PDS information model in an ontology modeling tool. Using the resulting knowledge-base, work continues to identify the core elements, identify problems and issues, and then test proposed modifications to the model. The final deliverables of this work will include specifications for the next generation PDS information model and the initial set of IPDA archive data standards. Having the information model captured in an ontology modeling tool also makes the model suitable for use by Semantic Web applications.

  19. The Self-Organized Archive: SPASE, PDS and Archive Cooperatives

    NASA Astrophysics Data System (ADS)

    King, T. A.; Hughes, J. S.; Roberts, D. A.; Walker, R. J.; Joy, S. P.

    2005-05-01

    Information systems with high quality metadata enable uses and services which often go beyond the original purpose. There are two types of metadata: annotations, which are items that comment on or describe the content of a resource, and identification attributes, which describe the external properties of the resource itself. For example, annotations may indicate which columns are present in a table of data, whereas an identification attribute would indicate the source of the table, such as the observatory, instrument, organization, and data type. When the identification attributes are collected and used as the basis of a search engine, a user can constrain on an attribute, and the archive can then self-organize around the constraint, presenting the user with a particular view of the archive. In an archive cooperative where each participating data system or archive may have its own metadata standards, providing a multi-system search engine requires that individual archive metadata be mapped to a broad-based standard. To explore how cooperative archives can form a larger self-organized archive, we will show how the Space Physics Archive Search and Extract (SPASE) data model will allow different systems to create a cooperative and will use the Planetary Data System (PDS) plus existing space physics activities as a demonstration.
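    Constraining on identification attributes so the archive "self-organizes" around the constraint can be sketched as a simple attribute filter; the resource records and attribute names below are made up for illustration and are not SPASE metadata:

```python
# Sketch: filter resource descriptions by identification attributes
# (observatory, instrument, data type) to present one "view" of the archive.
resources = [
    {"observatory": "Galileo", "instrument": "MAG", "data_type": "time series"},
    {"observatory": "Galileo", "instrument": "PLS", "data_type": "spectrogram"},
    {"observatory": "Cassini", "instrument": "MAG", "data_type": "time series"},
]


def view(archive, **constraints):
    """Return only the resources that satisfy every attribute constraint."""
    return [r for r in archive
            if all(r.get(k) == v for k, v in constraints.items())]


print(view(resources, instrument="MAG"))  # the archive "organized" around MAG data
```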

  20. Content Platforms Meet Data Storage, Retrieval Needs

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Earth is under a constant barrage of information from space. Whether from satellites orbiting our planet, spacecraft circling Mars, or probes streaking toward the far reaches of the Solar System, NASA collects massive amounts of data from its spacefaring missions each day. NASA's Earth Observing System (EOS) satellites, for example, provide daily imagery and measurements of Earth's atmosphere, oceans, vegetation, and more. The Earth Observing System Data and Information System (EOSDIS) collects all of that science data and processes, archives, and distributes it to researchers around the globe; EOSDIS recently reached a total archive volume of 4.5 petabytes. Try to store that amount of information in your standard, four-drawer file cabinet, and you would need 90 million to get the job done. To manage the flood of information, NASA has explored technologies to efficiently collect, archive, and provide access to EOS data for scientists today and for years to come. One such technology is now providing similar capabilities to businesses and organizations worldwide.

  1. The American Archival Profession and Information Technology Standards.

    ERIC Educational Resources Information Center

    Cox, Richard J.

    1992-01-01

    Discussion of the use of standards by archivists highlights the U.S. MARC AMC (Archives-Manuscript Control) format for reporting archival records and manuscripts; their interest in specific standards being developed for the OSI (Open Systems Interconnection) reference model; and the management of records in electronic formats. (16 references) (LAE)

  2. Picture archiving and communication system--Part one: Filmless radiology and distance radiology.

    PubMed

    De Backer, A I; Mortelé, K J; De Keulenaer, B L

    2004-01-01

    Picture archiving and communication system (PACS) is a collection of technologies used to carry out digital medical imaging. PACS is used to digitally acquire medical images from the various modalities, such as computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, and digital projection radiography. The image data and pertinent information are transmitted to other and possibly remote locations over networks, where they may be displayed on computer workstations for soft copy viewing in multiple locations, thus permitting simultaneous consultations and almost instant reporting from radiologists at a distance. Data are secured and archived on digital media such as optical disks or tape, and may be automatically retrieved as necessary. Close integration with the hospital information system (HIS)--radiology information system (RIS) is critical for system functionality. Medical image management systems are maturing, providing access outside of the radiology department to images throughout the hospital via the Ethernet, at different hospitals, or from a home workstation if teleradiology has been implemented.

  3. Contents of the JPL Distributed Active Archive Center (DAAC) archive, version 2-91

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A. (Editor); Lassanyi, Ruby A. (Editor)

    1991-01-01

    The Distributed Active Archive Center (DAAC) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea surface height, surface wind vector, sea surface temperature, atmospheric liquid water, and surface pigment concentration. The Jet Propulsion Laboratory DAAC is an element of the Earth Observing System Data and Information System (EOSDIS) and will be the United States distribution site for the Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  4. The PDS4 Information Model and its Role in Agile Science Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D.

    2017-12-01

    PDS4 is an information model-driven service architecture supporting the capture, management, distribution and integration of massive planetary science data captured in distributed data archives world-wide. The PDS4 Information Model (IM), the core element of the architecture, was developed using lessons learned from 20 years of archiving Planetary Science Data and best practices for information model development. The foundational principles were adopted from the Open Archival Information System (OAIS) Reference Model (ISO 14721), the Metadata Registry Specification (ISO/IEC 11179), and W3C XML (Extensible Markup Language) specifications. These provided, respectively, an object-oriented model for archive information systems, a comprehensive schema for data dictionaries and hierarchical governance, and rules for encoding documents electronically. The PDS4 Information Model is unique in that it drives the PDS4 infrastructure by providing the representation of concepts and their relationships, constraints, rules, and operations; a sharable, stable, and organized set of information requirements; and machine-parsable definitions that are suitable for configuring and generating code. This presentation will provide an overview of the PDS4 Information Model and how it is being leveraged to develop and evolve the PDS4 infrastructure and enable agile curation of over 30 years of science data collected by the international Planetary Science community.
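    How machine-parsable model definitions can drive generated artifacts might be pictured, very loosely, as emitting a label template from a class description; the tiny model below is a hypothetical stand-in, not the PDS4 Information Model itself:

```python
# Sketch: emit a label template from a (hypothetical) model class definition,
# illustrating how an information model can drive generated artifacts.
model_class = {
    "name": "Product_Observational",
    "attributes": ["title", "start_date_time", "stop_date_time"],
}


def label_template(cls: dict) -> str:
    """Build an XML-like label skeleton from the class name and its attributes."""
    lines = [f"<{cls['name']}>"]
    lines += [f"  <{attr}>...</{attr}>" for attr in cls["attributes"]]
    lines.append(f"</{cls['name']}>")
    return "\n".join(lines)


print(label_template(model_class))
```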

  5. The System for Quick Search of the Astronomical Objects and Events in the Digital Plate Archives.

    NASA Astrophysics Data System (ADS)

    Sergeev, A. V.; Sergeeva, T. P.

    From the middle of the XIX century, observatories all over the world have accumulated about three million astronomical plates containing unique information about the Universe which cannot be obtained or restored with the help of even the newest facilities and technologies, but which may be useful for many modern astronomical investigations. The threat of losing astronomical plate archives for economic, technical, or other reasons has put before the world astronomical community a problem: the preservation of the unique information kept on those plates. The problem can be solved by transforming the information from the plates into digital form and keeping it on electronic data media. We have begun creating a system for quick search and analysis of astronomical events and objects in the digital plate archive of the Ukrainian Main Astronomical Observatory of NAS. Connecting the system to the Internet will allow a remote user (astronomer or observer) to access the digital plate archive and work with it. To provide high efficiency for this work, a plate database (a list of the plates with all information about them and access software) is being prepared. The modular structure of the system's basic software and a standard format for the plate image files allow future development of problem-oriented software for special astronomical research.

  6. Acquisition, use, and archiving of real-time data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leach, M.J.; Bernstein, H.J.; Tichler, J.L.

    Meteorological information is needed by scientific personnel at Brookhaven National Laboratory (BNL) for various purposes. An automated system, used to acquire, archive, and provide users with weather data, is described. Hardware, software, and some of the examples of the uses of the system are detailed.

  7. Optical Disk Technology and Information.

    ERIC Educational Resources Information Center

    Goldstein, Charles M.

    1982-01-01

    Provides basic information on videodisks and potential applications, including inexpensive online storage, random access graphics to complement online information systems, hybrid network architectures, office automation systems, and archival storage. (JN)

  8. A Waveform Archiving System for the GE Solar 8000i Bedside Monitor.

    PubMed

    Fanelli, Andrea; Jaishankar, Rohan; Filippidis, Aristotelis; Holsapple, James; Heldt, Thomas

    2018-01-01

    Our objective was to develop, deploy, and test a data-acquisition system for the reliable and robust archiving of high-resolution physiological waveform data from a variety of bedside monitoring devices, including the GE Solar 8000i patient monitor, and for the logging of ancillary clinical and demographic information. The data-acquisition system consists of a computer-based archiving unit and a GE Tram Rac 4A that connects to the GE Solar 8000i monitor. Standard physiological front-end sensors connect directly to the Tram Rac, which serves as a port replicator for the GE monitor and provides access to these waveform signals through an analog data interface. Together with the GE monitoring data streams, we simultaneously collect the cerebral blood flow velocity envelope from a transcranial Doppler ultrasound system and a non-invasive arterial blood pressure waveform along a common time axis. All waveform signals are digitized and archived through a LabView-controlled interface that also allows for the logging of relevant meta-data such as clinical and patient demographic information. The acquisition system was certified for hospital use by the clinical engineering team at Boston Medical Center, Boston, MA, USA. Over a 12-month period, we collected 57 datasets from 11 neuro-ICU patients. The system provided reliable and failure-free waveform archiving. We measured an average temporal drift between waveforms from different monitoring devices of 1 ms every 66 min of recorded data. The waveform acquisition system allows for robust real-time data acquisition, processing, and archiving of waveforms. The temporal drift between waveforms archived from different devices is entirely negligible, even for long-term recording.
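
    As an illustration of how a drift figure such as 1 ms per 66 min of recording can be obtained, the sketch below fits a line to the offset between matched timestamps from two device clocks. It is a generic least-squares approach applied to simulated inputs, not necessarily the method the authors used.

    ```python
    # Sketch of estimating clock drift between two acquisition devices from
    # pairs of timestamps recorded for the same events. This illustrates how a
    # figure such as "1 ms of drift per 66 min" could be derived; it is not
    # necessarily the method used by the authors.
    import numpy as np

    def drift_rate(t_reference, t_device):
        """Least-squares drift of the device clock, in ms per minute."""
        t_ref = np.asarray(t_reference, dtype=float)        # seconds
        offset_ms = (np.asarray(t_device, dtype=float) - t_ref) * 1000.0
        slope_ms_per_s, _ = np.polyfit(t_ref, offset_ms, 1)
        return slope_ms_per_s * 60.0                         # ms per minute

    if __name__ == "__main__":
        ref = np.arange(0, 6 * 3600, 60)                     # one sample per minute
        dev = ref * (1 + 2.5e-7)                             # simulated slow drift
        rate = drift_rate(ref, dev)
        print(f"estimated drift: {rate:.4f} ms/min "
              f"(about 1 ms every {1/rate:.0f} min)")
    ```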

  9. An Electronic Finding Aid Using Extensible Markup Language (XML) and Encoded Archival Description (EAD).

    ERIC Educational Resources Information Center

    Chang, May

    2000-01-01

    Describes the development of electronic finding aids for archives at the University of Illinois, Urbana-Champaign that used XML (extensible markup language) and EAD (encoded archival description) to enable more flexible information management and retrieval than using MARC or a relational database management system. EAD template is appended.…

  10. The Convergence of Information Technology, Data, and Management in a Library Imaging Program

    ERIC Educational Resources Information Center

    France, Fenella G.; Emery, Doug; Toth, Michael B.

    2010-01-01

    Integrating advanced imaging and processing capabilities in libraries, archives, and museums requires effective systems and information management to ensure that the large amounts of digital data about cultural artifacts can be readily acquired, stored, archived, accessed, processed, and linked to other data. The Library of Congress is developing…

  11. Automated search and retrieval of information from imaged documents using optical correlation techniques

    NASA Astrophysics Data System (ADS)

    Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.

    1999-10-01

    Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited; e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.

  12. NASA Langley Atmospheric Science Data Center (ASDC) Experience with Aircraft Data

    NASA Astrophysics Data System (ADS)

    Perez, J.; Sorlie, S.; Parker, L.; Mason, K. L.; Rinsland, P.; Kusterer, J.

    2011-12-01

    Over the past decade the NASA Langley ASDC has archived and distributed a variety of aircraft mission data sets. These datasets posed unique challenges for archiving, ranging from the rigidity of the archiving system and formats to the lack of metadata. The ASDC developed a state-of-the-art data archive and distribution system to serve the atmospheric sciences data provider and researcher communities. The system, called Archive - Next Generation (ANGe), is designed with a distributed, multi-tier, service-based, message-oriented architecture enabling new methods for searching, accessing, and customizing data. The ANGe system provides the ease and flexibility to ingest and archive aircraft data through an ad hoc workflow or to develop a new workflow to suit the provider's needs. The ASDC will describe the challenges encountered in preparing aircraft data for archiving and distribution. The ASDC is currently providing guidance to the DISCOVER-AQ (Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality) Earth Venture-1 project on developing collection, granule, and browse metadata, as well as supporting the ADAM (Airborne Data For Assessing Models) site.

  13. Constraint based scheduling for the Goddard Space Flight Center distributed Active Archive Center's data archive and distribution system

    NASA Technical Reports Server (NTRS)

    Short, Nick, Jr.; Bedet, Jean-Jacques; Bodden, Lee; Boddy, Mark; White, Jim; Beane, John

    1994-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational since October 1, 1993. Its mission is to support the Earth Observing System (EOS) by providing rapid access to EOS data and analysis products, and to test Earth Observing System Data and Information System (EOSDIS) design concepts. One of the challenges is to ensure quick and easy retrieval of any data archived within the DAAC's Data Archive and Distribution System (DADS). Over the 15-year life of the EOS project, an estimated several petabytes (10^15 bytes) of data will be permanently stored. Accessing that amount of information is a formidable task that will require innovative approaches. As a precursor of the full EOS system, the GSFC DAAC, with a few Terabits of storage, has implemented a prototype of a constraint-based task and resource scheduler to improve the performance of the DADS. This Honeywell Task and Resource Scheduler (HTRS), developed by the Honeywell Technology Center in cooperation with the Information Science and Technology Branch/935, the Code X Operations Technology Program, and the GSFC DAAC, makes better use of limited resources, prevents backlogs of data, and provides information about resource bottlenecks and performance characteristics. The prototype, which is being developed concurrently with the GSFC Version 0 (V0) DADS, models DADS activities such as ingestion and distribution with priority, precedence, resource requirements (disk and network bandwidth), and temporal constraints. HTRS supports schedule updates, insertions, and retrieval of task information via an Application Program Interface (API). The prototype has demonstrated, with a few examples, the substantial advantages of using HTRS over scheduling algorithms such as a First In First Out (FIFO) queue. The kernel scheduling engine for HTRS, called Kronos, has been successfully applied to several other domains such as space shuttle mission scheduling, demand flow manufacturing, and avionics communications scheduling.
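
    The toy Python sketch below contrasts a FIFO queue with a simple priority-driven scheduler to show why constraint- and priority-aware scheduling can reduce the wait for urgent archive tasks. It is only an illustration of the general idea; it is not the HTRS or Kronos algorithm, and the task names and durations are invented.

    ```python
    # Toy comparison of FIFO scheduling vs. a priority-aware scheduler for
    # archive tasks. Illustrative only; not the Honeywell Task and Resource
    # Scheduler (HTRS) or Kronos algorithm.
    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Task:
        priority: int                               # lower value = more urgent
        name: str = field(compare=False)
        duration: float = field(compare=False)      # hours of drive time needed

    def fifo_schedule(tasks):
        """Run tasks in arrival order; return (name, finish_time) pairs."""
        t, out = 0.0, []
        for task in tasks:
            t += task.duration
            out.append((task.name, t))
        return out

    def priority_schedule(tasks):
        """Run the most urgent ready task first."""
        heap = list(tasks)
        heapq.heapify(heap)
        t, out = 0.0, []
        while heap:
            task = heapq.heappop(heap)
            t += task.duration
            out.append((task.name, t))
        return out

    if __name__ == "__main__":
        arrivals = [
            Task(5, "bulk-reprocessing", 8.0),
            Task(1, "urgent-distribution", 0.5),
            Task(3, "routine-ingest", 2.0),
        ]
        print("FIFO:    ", fifo_schedule(arrivals))
        print("Priority:", priority_schedule(arrivals))
    ```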

  14. Intelligent Systems Technologies to Assist in Utilization of Earth Observation Data

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.; McConaughy, Gail; Lynnes, Christopher; McDonald, Kenneth; Kempler, Steven

    2003-01-01

    With the launch of several Earth observing satellites over the last decade, we are now in a data-rich environment. From NASA's Earth Observing System (EOS) satellites alone, we are accumulating more than 3 TB per day of raw data and derived geophysical parameters. The data products are being distributed to a large user community comprising scientific researchers, educators and operational government agencies. Notable progress has been made in the last decade in facilitating access to data. However, to realize the full potential of the growing archives of valuable scientific data, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system. Potential Intelligent Archive concepts include: 1) Mining archived data holdings using Intelligent Data Understanding algorithms to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services involved in a scientific enterprise; 3) Recognizing the value of results, indexing and formatting them for easy access, and delivering them to concerned individuals; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building (i.e., the transformations from data to information to knowledge) instead of just data pipelining; and 5) Being aware of other nodes in the knowledge building system, participating in open systems interfaces and protocols for virtualization, and collaborative interoperability. This paper presents some of these concepts and identifies issues to be addressed by research in future intelligent systems technology.

  15. Imaged document information location and extraction using an optical correlator

    NASA Astrophysics Data System (ADS)

    Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.

    1999-12-01

    Today, the paper document is fast becoming a thing of the past. With the rapid development of fast, inexpensive computing and storage devices, many government and private organizations are archiving their documents in electronic form (e.g., personnel records, medical records, patents, etc.). Many of these organizations are converting their paper archives to electronic images, which are then stored in a computer database. Because of this, there is a need to efficiently organize this data into comprehensive and accessible information resources and provide for rapid access to the information contained within these imaged documents. To meet this need, Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides a means for the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives and has the potential to determine the types of languages contained within a document. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited, e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.
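
    The sketch below is a purely digital analogue of that matching step: 2-D cross-correlation is used to locate a small template (standing in for an agency seal) within a page image. IDOCCS performs the correlation optically, so this is only an illustration of the underlying matching idea, run on synthetic data.

    ```python
    # Digital analogue of the optical-correlation search described above:
    # locate a small template (e.g., an agency seal) inside a page image with
    # mean-subtracted 2-D cross-correlation. IDOCCS performs the correlation
    # optically; this sketch only illustrates the underlying matching idea.
    import numpy as np
    from scipy.signal import correlate2d

    def find_template(page, template):
        """Return (row, col) of the best match of template within page."""
        page = page - page.mean()
        template = template - template.mean()
        score = correlate2d(page, template, mode="valid")
        row, col = np.unravel_index(np.argmax(score), score.shape)
        return row, col

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        page = rng.random((120, 160))
        seal = rng.random((12, 12))
        page[40:52, 90:102] = seal           # embed the "seal" in the page
        print(find_template(page, seal))     # expected: (40, 90)
    ```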

  16. NASA's Long-Term Archive (LTA) of ICESat Data at the National Snow and Ice Data Center (NSIDC)

    NASA Astrophysics Data System (ADS)

    Fowler, D. K.; Moses, J. F.; Dimarzio, J. P.; Webster, D.

    2011-12-01

    Data Stewardship, preservation, and reproducibility are becoming principal parts of a data manager's work. In an era of distributed data and information systems, where the host location ought to be transparent to the internet user, it is of vital importance that organizations make a commitment to both current and long-term goals of data management and the preservation of scientific data. NASA's EOS Data and Information System (EOSDIS) is a distributed system of discipline-specific archives and mission-specific science data processing facilities. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However there is additional information from the product generation and science teams needed to ensure the observations will be useful for long term climate studies. Examples include ancillary input datasets, product generation software, and production history as developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. Using inputs from the USGCRP Workshop on Long Term Archive Requirements (1998), discussions with EOS instrument teams, and input from the 2011 ESIPS Federation meeting, NASA is developing a set of Earth science data and information content requirements for long term preservation that will ultimately be used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission is one of the first to come to an end, NASA and NSIDC are preparing for long-term support of the ICESat mission data now. For a long-term archive, it is imperative that there is sufficient information about how products were prepared in order to convince future researchers that the scientific results are accurate, understandable, useable, and reproducible. Our experience suggests data centers know what to preserve in most cases, i.e., the processing algorithms along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products will be archived and made available to users. In other cases the data centers must seek guidance from the science team, e.g., for pre-launch, calibration/validation, and test data. All these data are an important part of product provenance, contributing to and helping establish the integrity of the scientific observations for long term climate studies. In this presentation we will describe application of information content requirements, guidance from the ICESat/GLAS Science Team and the flow of additional information from the ICESat Science team and Science Investigator-Led Processing System to the Distributed Active Archive Center.

  17. The design and implementation of the HY-1B Product Archive System

    NASA Astrophysics Data System (ADS)

    Liu, Shibin; Liu, Wei; Peng, Hailong

    2010-11-01

    The Product Archive System (PAS), a background system, is the core part of the Product Archive and Distribution System (PADS), which is the data management center of the Ground Application System of the HY-1B satellite hosted by the National Satellite Ocean Application Service of China. PAS integrates a series of up-to-date methods and technologies, such as a suitable data transmittal mode, flexible configuration files, and log information, to give the system several desirable characteristics, including ease of maintenance, stability, and minimal complexity. This paper describes the seven major components of the PAS (Network Communicator, File Collector, File Copy, Task Collector, Metadata Extractor, Product Data Archive, and Metadata Catalogue Import modules) and some of the unique features of the system, as well as the technical problems encountered and resolved.

  18. Radio data archiving system

    NASA Astrophysics Data System (ADS)

    Knapic, C.; Zanichelli, A.; Dovgan, E.; Nanni, M.; Stagni, M.; Righini, S.; Sponza, M.; Bedosti, F.; Orlati, A.; Smareglia, R.

    2016-07-01

    Radio astronomical data models are becoming very complex because of the huge range of instrumental configurations available with modern radio telescopes. What were in the past the last frontiers of data formats in terms of efficiency and flexibility are now evolving, with new strategies and methodologies enabling the persistence of very complex, hierarchical, and multi-purpose information. Such an evolution of data models and data formats requires new data archiving techniques to guarantee data preservation, following the directives of the Open Archival Information System and the International Virtual Observatory Alliance for data sharing and publication. Currently, various formats (FITS, MBFITS, VLBI's XML description files and ancillary files) of data acquired with the Medicina and Noto Radio Telescopes can be stored and handled by a common Radio Archive, which is planned to be released to the (inter)national community by the end of 2016. This state-of-the-art archiving system for radio astronomical data aims at delegating to the software, as much as possible, the decisions about how and where the descriptors (metadata) are saved, while users perform user-friendly queries that the web interface translates into complex interrogations of the database to retrieve data. In this way, the Archive is ready to be Virtual Observatory compliant and as user-friendly as possible.

  19. Development of public science archive system of Subaru Telescope. 2

    NASA Astrophysics Data System (ADS)

    Yamamoto, Naotaka; Noda, Sachiyo; Taga, Masatoshi; Ozawa, Tomohiko; Horaguchi, Toshihiro; Okumura, Shin-Ichiro; Furusho, Reiko; Baba, Hajime; Yagi, Masafumi; Yasuda, Naoki; Takata, Tadafumi; Ichikawa, Shin-Ichi

    2003-09-01

    We report various improvements in a public science archive system, SMOKA (Subaru-Mitaka-Okayama-Kiso Archive system). We have developed a new interface to search observational data of minor bodies in the solar system. In addition, other improvements are also summarized: (1) searching frames by specifying wavelength directly, (2) finding calibration data sets automatically, (3) browsing data on weather, humidity, and temperature, which provide information on image quality, (4) providing quick-look images of OHS/CISCO and IRCS, and (5) including the data from OAO HIDES (HIgh Dispersion Echelle Spectrograph).

  20. Reusing Information Management Services for Recommended Decadal Study Missions to Facilitate Aerosol and Cloud Studies

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Alcott, Gary; Lynnes, Chris; Leptoukh, Greg; Vollmer, Bruce; Berrick, Steve

    2008-01-01

    NASA Earth Sciences Division (ESD) has made great investments in the development and maintenance of data management systems and information technologies, to maximize the use of NASA generated Earth science data. With information management system infrastructure in place, mature and operational, very small delta costs are required to fully support data archival, processing, and data support services required by the recommended Decadal Study missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC) and the reusability for these future missions. The GES DISC has developed a series of modular, reusable data management components currently in use. They include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive aka S4PA), data processing (S4 Processor for Measurements aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. Information management system components are based on atmospheric scientist inputs. Large development and maintenance cost savings can be realized through their reuse in future missions.

  1. An overview on integrated data system for archiving and sharing marine geology and geophysical data in Korea Institute of Ocean Science & Technology (KIOST)

    NASA Astrophysics Data System (ADS)

    Choi, Sang-Hwa; Kim, Sung Dae; Park, Hyuk Min; Lee, SeungHa

    2016-04-01

    We established and have operated an integrated data system for managing, archiving, and sharing marine geology and geophysical data around Korea produced from various research projects and programs at the Korea Institute of Ocean Science & Technology (KIOST). First of all, to keep the data system consistent under continuous data updates, we set up standard operating procedures (SOPs) for data archiving, data processing and conversion, data quality control, data uploading, and DB maintenance. The system comprises two databases: the ARCHIVE DB stores archived data in the original forms and formats received from data providers, while the GIS DB manages all other compiled, processed, and reproduced data and information for data services and GIS application services. Oracle 11g was adopted as the relational database management system, and open-source GIS techniques were applied for GIS services: OpenLayers for the user interface, GeoServer as the application server, and PostGIS with PostgreSQL for the GIS database. For convenient use of geophysical data in SEG-Y format, a viewer program was developed and embedded in the system. Users can search data through the GIS user interface and save the results as a report.
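
    The sketch below shows the kind of bounding-box query such a PostGIS-backed GIS DB could serve behind the OpenLayers/GeoServer interface. The table and column names (survey_tracks, geom, cruise_id, survey_date) and the connection string are hypothetical.

    ```python
    # Sketch of a spatial search against a PostGIS-backed GIS database, of the
    # kind the system above could run behind its OpenLayers/GeoServer front
    # end. Table, column, and connection details are hypothetical.
    import psycopg2

    QUERY = """
        SELECT cruise_id, survey_date
        FROM survey_tracks
        WHERE ST_Intersects(
            geom,
            ST_MakeEnvelope(%s, %s, %s, %s, 4326)  -- lon/lat bounding box, WGS84
        )
        ORDER BY survey_date;
    """

    def search_tracks(conn, min_lon, min_lat, max_lon, max_lat):
        """Return (cruise_id, survey_date) rows whose track crosses the box."""
        with conn.cursor() as cur:
            cur.execute(QUERY, (min_lon, min_lat, max_lon, max_lat))
            return cur.fetchall()

    if __name__ == "__main__":
        conn = psycopg2.connect("dbname=gisdb user=reader")   # placeholder DSN
        for row in search_tracks(conn, 124.0, 33.0, 132.0, 39.0):
            print(row)
        conn.close()
    ```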

  2. The Production and Archiving of Navigation and Ancillary Data for the Galileo Mission

    NASA Technical Reports Server (NTRS)

    Miller, J.; Clarke, T.

    1994-01-01

    The Galileo Mission to Jupiter is using the SPICE formats developed by the Navigation and Ancillary Information Facility, a node of the Planetary Data System, to archive its navigation and ancillary data.

  3. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    USGS Publications Warehouse

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-11-02

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  4. The Land Processes Distributed Active Archive Center (LP DAAC)

    USGS Publications Warehouse

    Golon, Danielle K.

    2016-10-03

    The Land Processes Distributed Active Archive Center (LP DAAC) operates as a partnership with the U.S. Geological Survey and is 1 of 12 DAACs within the National Aeronautics and Space Administration (NASA) Earth Observing System Data and Information System (EOSDIS). The LP DAAC ingests, archives, processes, and distributes NASA Earth science remote sensing data. These data are provided to the public at no charge. Data distributed by the LP DAAC provide information about Earth’s surface from daily to yearly intervals and at 15 to 5,600 meter spatial resolution. Data provided by the LP DAAC can be used to study changes in agriculture, vegetation, ecosystems, elevation, and much more. The LP DAAC provides several ways to access, process, and interact with these data. In addition, the LP DAAC is actively archiving new datasets to provide users with a variety of data to study the Earth.

  5. Project MICAS: a multivendor open-system incremental approach to implementing an integrated enterprise-wide PACS: works in progress

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wright, Jeffrey; Fontaine, Marc T.; Robinson, Arvin E.

    1998-07-01

    The Medical Information, Communication and Archive System (MICAS) is a multi-vendor incremental approach to PACS. MICAS is a multi-modality integrated image management system that incorporates the radiology information system (RIS) and radiology image database (RID) with future 'hooks' to other hospital databases. Even though this approach to PACS is more risky than a single-vendor turn-key approach, it offers significant advantages. The vendors involved in the initial phase of MICAS are IDX Corp., ImageLabs, Inc. and Digital Equipment Corp (DEC). The network architecture operates at 100 MBits per sec except between the modalities and the stackable intelligent switch which is used to segment MICAS by modality. Each modality segment contains the acquisition engine for the modality, a temporary archive and one or more diagnostic workstations. All archived studies are available at all workstations, but there is no permanent archive at this time. At present, the RIS vendor is responsible for study acquisition and workflow as well as maintenance of the temporary archive. Management of study acquisition, workflow and the permanent archive will become the responsibility of the archive vendor when the archive is installed in the second quarter of 1998. The modalities currently interfaced to MICAS are MRI, CT and a Howtek film digitizer with Nuclear Medicine and computed radiography (CR) to be added when the permanent archive is installed. There are six dual-monitor diagnostic workstations which use ImageLabs Shared Vision viewer software located in MRI, CT, Nuclear Medicine, musculoskeletal reading areas and two in Radiology's main reading area. One of the major lessons learned to date is that the permanent archive should have been part of the initial MICAS installation and the archive vendor should have been responsible for image acquisition rather than the RIS vendor. Currently an archive vendor is being selected who will be responsible for the management of the archive plus the HIS/RIS interface, image acquisition, modality work list manager and interfacing to the current DICOM viewer software. The next phase of MICAS will include interfacing ultrasound, locating servers outside of the Radiology LAN to support the distribution of images and reports to the clinical floors and physician offices both within and outside of the University of Rochester Medical Center (URMC) campus and the teaching archive.

  6. NASA Records Database

    NASA Technical Reports Server (NTRS)

    Callac, Christopher; Lunsford, Michelle

    2005-01-01

    The NASA Records Database, comprising a Web-based application program and a database, is used to administer an archive of paper records at Stennis Space Center. The system begins with an electronic form, into which a user enters information about records that the user is sending to the archive. The form is smart: it provides instructions for entering information correctly and prompts the user to enter all required information. Once complete, the form is digitally signed and submitted to the database. The system determines which storage locations are not in use, assigns the user's boxes of records to some of them, and enters these assignments in the database. Thereafter, the software tracks the boxes and can be used to locate them. By use of search capabilities of the software, specific records can be sought by box storage locations, accession numbers, record dates, submitting organizations, or details of the records themselves. Boxes can be marked with such statuses as checked out, lost, transferred, and destroyed. The system can generate reports showing boxes awaiting destruction or transfer. When boxes are transferred to the National Archives and Records Administration (NARA), the system can automatically fill out NARA records-transfer forms. Currently, several other NASA Centers are considering deploying the NASA Records Database to help automate their records archives.
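
    A minimal sketch of the location-assignment step described above follows: free storage locations are found and incoming boxes are mapped onto them. The data structures are invented for illustration; the real system records these assignments in its database via the web form.

    ```python
    # Toy sketch of the storage-location assignment described above: find
    # unused shelf locations, assign incoming boxes to them, and record the
    # assignment. The data structures are illustrative only.

    def assign_boxes(locations, boxes):
        """Map each box ID to a free location; raise if the archive is full.

        locations: dict of location_id -> box_id or None (None means free)
        boxes:     list of box IDs awaiting shelving
        """
        free = [loc for loc, occupant in locations.items() if occupant is None]
        if len(free) < len(boxes):
            raise RuntimeError("not enough free storage locations")
        assignments = {}
        for box, loc in zip(boxes, free):
            locations[loc] = box          # mark the location as occupied
            assignments[box] = loc
        return assignments

    if __name__ == "__main__":
        shelves = {"A-01": None, "A-02": "BOX-0007", "A-03": None, "B-01": None}
        print(assign_boxes(shelves, ["BOX-0101", "BOX-0102"]))
    ```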

  7. Long-Term Preservation and Advanced Access Services to Archived Data: The Approach of a System Integrator

    NASA Astrophysics Data System (ADS)

    Petitjean, Gilles; de Hauteclocque, Bertrand

    2004-06-01

    EADS Defence and Security Systems (EADS DS SA) has developed expertise as an integrator of archive management systems for both commercial and defence customers (ESA, CNES, EC, EUMETSAT, French MOD, US DOD, etc.), especially in the Earth Observation and Meteorology fields. The concern of valuable data owners is not only long-term preservation but also the integration of the archive into their information system, in particular efficient access to archived data for their user community. The system integrator addresses this requirement with a methodology combining understanding of user needs, exhaustive knowledge of existing hardware and software solutions, and development and integration ability, and completes the facility development with support activities. The long-term preservation of archived data obviously involves a pertinent selection of storage media and archive library. This selection relies on a storage technology survey, but the selection criteria depend on the analysis of user needs. The system integrator will recommend the best compromise for implementing an archive management facility, thanks to its knowledge of and independence from the storage market and through analysis of the user requirements, and will provide a solution that is able to evolve to take advantage of storage technology progress. But preserving data for the long term is not only a question of storage technology. Some functions are required to secure the archive management system against contingency situations: multiple data set copies maintained through operational procedures, active quality control of the archived data, and a migration policy that optimises the cost of ownership.

  8. Autosophy: an alternative vision for satellite communication, compression, and archiving

    NASA Astrophysics Data System (ADS)

    Holtz, Klaus; Holtz, Eric; Kalienky, Diana

    2006-08-01

    Satellite communication and archiving systems are now designed according to an outdated Shannon information theory where all data is transmitted in meaningless bit streams. Video bit rates, for example, are determined by screen size, color resolution, and scanning rates. The video "content" is irrelevant, so that totally random images require the same bit rates as blank images. An alternative system design, based on the newer Autosophy information theory, is now evolving, which transmits data "content" or "meaning" in a universally compatible 64-bit format. This would allow mixing all multimedia transmissions in the Internet's packet stream. The new system design uses self-assembling data structures, which grow like data crystals or data trees in electronic memories, for both communication and archiving. The advantages for satellite communication and archiving may include: very high lossless image and video compression, unbreakable encryption, resistance to transmission errors, universally compatible data formats, self-organizing error-proof mass memories, immunity to the Internet's Quality of Service problems, and error-proof secure communication protocols. Legacy data transmission formats can be converted by simple software patches or integrated chipsets to be forwarded through any media - satellites, radio, Internet, cable - without needing to be reformatted. This may result in orders of magnitude improvements for all communication and archiving systems.

  9. Medical image archive node simulation and architecture

    NASA Astrophysics Data System (ADS)

    Chiang, Ted T.; Tang, Yau-Kuo

    1996-05-01

    It is a well known fact that managed care and new treatment technologies are revolutionizing the health care provider world. Community Health Information Network and Computer-based Patient Record projects are underway throughout the United States. More and more hospitals are installing digital, `filmless' radiology (and other imagery) systems. They generate a staggering amount of information around the clock. For example, a typical 500-bed hospital might accumulate more than 5 terabytes of image data in a period of 30 years for conventional x-ray images and digital images such as Magnetic Resonance Imaging and Computer Tomography images. With several hospitals contributing to the archive, the storage required will be in the hundreds of terabytes. Systems for reliable, secure, and inexpensive storage and retrieval of digital medical information do not exist today. In this paper, we present a Medical Image Archive and Distribution Service (MIADS) concept. MIADS is a system shared by individual and community hospitals, laboratories, and doctors' offices that need to store and retrieve medical images. Due to the large volume and complexity of the data, as well as the diversified user access requirement, implementation of the MIADS will be a complex procedure. One of the key challenges to implementing a MIADS is to select a cost-effective, scalable system architecture to meet the ingest/retrieval performance requirements. We have performed an in-depth system engineering study, and developed a sophisticated simulation model to address this key challenge. This paper describes the overall system architecture based on our system engineering study and simulation results. In particular, we will emphasize system scalability and upgradability issues. Furthermore, we will discuss our simulation results in detail. The simulations study the ingest/retrieval performance requirements based on different system configurations and architectures for variables such as workload, tape access time, number of drives, number of exams per patient, number of Central Processing Units, patient grouping, and priority impacts. The MIADS, which could be a key component of a broader data repository system, will be able to communicate with and obtain data from existing hospital information systems. We will discuss the external interfaces enabling MIADS to communicate with and obtain data from existing Radiology Information Systems such as the Picture Archiving and Communication System (PACS). Our system design encompasses the broader aspects of the archive node, which could include multimedia data such as image, audio, video, and free text data. This system is designed to be integrated with current hospital PACS through a Digital Imaging and Communications in Medicine interface. However, the system can also be accessed through the Internet using Hypertext Transport Protocol or Simple File Transport Protocol. Our design and simulation work will be key to implementing a successful, scalable medical image archive and distribution system.
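
    The toy discrete-event sketch below illustrates the kind of ingest/retrieval simulation described above: retrieval requests compete for a limited number of tape drives, and mean waiting time is measured as the drive count varies. All parameters are invented, and this is not the actual MIADS simulation model.

    ```python
    # Toy discrete-event sketch of an archive-node simulation: retrieval
    # requests compete for a limited number of tape drives, and we measure how
    # waiting time changes with the number of drives. Not the actual MIADS
    # model; all parameters are invented for illustration.
    import heapq, random

    def simulate(num_drives, num_requests, mean_interarrival, access_time, seed=0):
        """Return the mean wait (in minutes) before a request gets a drive."""
        random.seed(seed)
        drive_free_at = [0.0] * num_drives            # next free time per drive
        heapq.heapify(drive_free_at)
        t, total_wait = 0.0, 0.0
        for _ in range(num_requests):
            t += random.expovariate(1.0 / mean_interarrival)  # request arrival
            free_at = heapq.heappop(drive_free_at)            # earliest free drive
            start = max(t, free_at)
            total_wait += start - t
            heapq.heappush(drive_free_at, start + access_time)
        return total_wait / num_requests

    if __name__ == "__main__":
        for drives in (2, 4, 8):
            wait = simulate(drives, num_requests=5000,
                            mean_interarrival=1.5, access_time=4.0)
            print(f"{drives} drives -> mean wait {wait:6.2f} min")
    ```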

  10. Data catalog for JPL Physical Oceanography Distributed Active Archive Center (PO.DAAC)

    NASA Technical Reports Server (NTRS)

    Digby, Susan

    1995-01-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) archive at the Jet Propulsion Laboratory contains satellite data sets and ancillary in-situ data for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Geophysical parameters available from the archive include sea-surface height, surface-wind vector, surface-wind speed, surface-wind stress vector, sea-surface temperature, atmospheric liquid water, integrated water vapor, phytoplankton pigment concentration, heat flux, and in-situ data. PO.DAAC is an element of the Earth Observing System Data and Information System and is the United States distribution site for TOPEX/POSEIDON data and metadata.

  11. Reusing Information Management Services for Recommended Decadal Study Missions That Facilitate Aerosol and Cloud Studies

    NASA Astrophysics Data System (ADS)

    Alcott, G.; Kempler, S.; Lynnes, C.; Leptoukh, G.; Vollmer, B.; Berrick, S.

    2008-12-01

    NASA Earth Sciences Division (ESD), and its preceding Earth science organizations, has made great investments in the development and maintenance of data management systems, as well as information technologies, for the purpose of maximizing the use and usefulness of NASA-generated Earth science data. Earth science information systems, evolving with the maturation and implementation of advancing technologies, reside at NASA data centers, known as Distributed Active Archive Centers (DAACs). With information management system infrastructure in place, and system data and user services already developed and operational, only very small delta costs are required to fully support data archival, processing, and data support services required by the recommended Decadal Study missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC) (one of NASA's DAACs) and their potential reuse for these future missions. After 14 years of working with instrument teams and the broader science community, GES DISC personnel, with expertise in atmospheric, water cycle, and atmospheric modeling data and information services, as well as in Earth science missions, information system engineering, operations, and user services, have developed a series of modular, reusable data management components currently in use in several projects. The knowledge and experience gained at the GES DISC lend themselves to providing science-driven information systems in the areas of aerosols, clouds, and atmospheric chemicals to be measured by recommended Decadal Survey missions. Available reusable capabilities include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive aka S4PA), data processing (S4 Processor for Measurements aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. In addition, recent enhancements, such as Open Geospatial Consortium (OGC), Inc. interoperability implementations and data fusion prototypes, will be described. As a result of the information management systems developed by NASA's GES DISC, not only are large cost savings realized through system reuse, but maintenance costs are also minimized due to the simplicity of their implementations.

  12. Supporting users through integrated retrieval, processing, and distribution systems at the land processes distributed active archive center

    USGS Publications Warehouse

    Kalvelage, T.; Willems, Jennifer

    2003-01-01

    The design of the EOS Data and Information System (EOSDIS) to acquire, archive, manage, and distribute Earth observation data to the broadest possible user community is discussed. Several integrated retrieval, processing, and distribution capabilities are explained, the value of these functions to users is described, and potential future improvements are laid out. Users want the retrieval, processing, and archiving systems to be integrated so that they can get the data they want in the format and through the delivery mechanism of their choice.

  13. The Apache OODT Project: An Introduction

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.; Crichton, D. J.; Hughes, J. S.; Ramirez, P.; Goodale, C. E.; Hart, A. F.

    2012-12-01

    Apache OODT is a science data system framework, developed over the past decade with hundreds of FTEs of investment, tens of sponsoring agencies (NASA, NIH/NCI, DoD, NSF, universities, etc.), and hundreds of projects and science missions that it powers every day. At its core, Apache OODT provides two fundamental classes of software services and components. The first deals with information integration from existing science data repositories and archives that already have in-use business processes and models for populating those archives; information integration allows search, retrieval, and dissemination across these heterogeneous systems, and ultimately rapid, interactive data access and retrieval. The other suite of services and components within Apache OODT handles population and processing of those data repositories and archives: workflows, resource management, crawling, remote data retrieval, curation and ingestion, and science data algorithm integration are all part of these Apache OODT software elements. In this talk, I will provide an overview of the use of Apache OODT to unlock and populate information from science data repositories and archives. We'll cover the basics, along with some advanced use cases and success stories.

  14. An Update on the CDDIS

    NASA Technical Reports Server (NTRS)

    Noll, Carey; Michael, Patrick; Dube, Maurice P.; Pollack, N.

    2012-01-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data products in a central data bank, to maintain information about the archival of these data, and to disseminate these data and information in a timely manner to a global scientific research community. The archive consists of GNSS, laser ranging, VLBI, and DORIS data sets and products derived from these data. The CDDIS is one of NASA's Earth Observing System Data and Information System (EOSDIS) distributed data centers; EOSDIS data centers serve a diverse user community and are tasked to provide facilities to search and access science data and products. The CDDIS data system and its archive have become increasingly important to many national and international science communities, in particular several of the operational services within the International Association of Geodesy (IAG) and its project the Global Geodetic Observing System (GGOS), including the International DORIS Service (IDS), the International GNSS Service (IGS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International Earth Rotation Service (IERS). The CDDIS has recently expanded its archive to support the IGS Multi-GNSS Experiment (MGEX). The archive now contains daily and hourly 30-second and subhourly 1-second data from an additional 35+ stations in RINEX V3 format. The CDDIS will soon install an Ntrip broadcast relay to support the activities of the IGS Real-Time Pilot Project (RTPP) and the future Real-Time IGS Service. The CDDIS has also developed a new web-based application to aid users in data discovery, both within the current community and beyond. To enable this data discovery application, the CDDIS is currently implementing modifications to the metadata extracted from incoming data and product files pushed to its archive. This poster will include background information about the system and its user communities, archive contents and updates, enhancements for data discovery, new system architecture, and future plans.

  15. The Road to Independently Understandable Information

    NASA Astrophysics Data System (ADS)

    Habermann, T.; Robinson, E.

    2017-12-01

    The turn of the 21st century was a pivotal time in the Earth and Space Science information ecosystem. The Content Standard for Digital Geospatial Metadata (CSDGM) had existed for nearly a decade and ambitious new standards were just emerging. The U.S. Federal Geospatial Data Committee (FGDC) had extended many of the concepts from CSDGM into the International community with ISO 19115:2003 and the Consultative Committee for Space Data Systems (CCSDS) had migrated their Open Archival Information System (OAIS) Reference Model into an international standard (ISO 14721:2003). The OAIS model outlined the roles and responsibilities of archives with the principle role being preserving information and making it available to users, a "designated community", as a service to the data producer. It was mandatory for the archive to ensure that information is "independently understandable" to the designated community and to maintain that understanding through on-going partnerships between archives and designated communities. Standards can play a role in supporting these partnerships as designated communities expand across disciplinary and geographic boundaries. The ISO metadata standards include many capabilities that might make critical contributions to this goal. These include connections to resources outside of the metadata record (i.e. documentation) and mechanisms for ongoing incorporation of user feedback into the metadata stream. We will demonstrate these capabilities with examples of how they can increase understanding.

  16. The ESIS query environment pilot project

    NASA Technical Reports Server (NTRS)

    Fuchs, Jens J.; Ciarlo, Alessandro; Benso, Stefano

    1993-01-01

    The European Space Information System (ESIS) was originally conceived to provide the European space science community with simple and efficient access to space data archives, facilities with which to examine and analyze the retrieved data, and general information services. To achieve this, ESIS will provide scientists with a discipline-specific environment for querying, in a uniform and transparent manner, data stored in geographically dispersed archives. Furthermore, it will provide discipline-specific tools for displaying and analyzing the retrieved data. The central concept of ESIS is to achieve a more efficient and wider usage of space scientific data while maintaining the physical archives at the institutions which created them and which have the best background for ensuring and maintaining the scientific validity and interest of the data. In addition to coping with the physical distribution of data, ESIS must also manage the heterogeneity of the individual archives' data models, formats, and database management systems. Thus the ESIS system shall appear to the user as a single database, while it does in fact consist of a collection of dispersed and locally managed databases and data archives. The work reported in this paper is one of the results of the ESIS Pilot Project, which is to be completed in 1993. More specifically, it presents the pilot ESIS Query Environment (ESIS QE) system, which forms the data retrieval and data dissemination axis of the ESIS system; the others are formed by the ESIS Correlation Environment (ESIS CE) and the ESIS Information Services. The ESIS QE Pilot Project is carried out for the European Space Agency's Research and Information Center, ESRIN, by a consortium consisting of Computer Resources International, Denmark, CISET S.p.a, Italy, the University of Strasbourg, France, and the Rutherford Appleton Laboratories in the U.K. Furthermore, numerous scientists both within ESA and in the European space science community have been involved in defining the core concepts of the ESIS system.

  17. BIOME: A scientific data archive search-and-order system using browser-aware, dynamic pages.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jennings, S.V.; Yow, T.G.; Ng, V.W.

    1997-08-01

    The Oak Ridge National Laboratory's (ORNL) Distributed Active Archive Center (DAAC) is a data archive and distribution center for the National Aeronautics and Space Administration's (NASA) Earth Observing System Data and Information System (EOSDIS). Both the Earth Observing System (EOS) and EOSDIS are components of NASA's contribution to the US Global Change Research Program through its Mission to Planet Earth Program. The ORNL DAAC provides access to data used in ecological and environmental research such as global change, global warming, and terrestrial ecology. Because of its large and diverse data holdings, the challenge for the ORNL DAAC is to help users find data of interest from the hundreds of thousands of files available at the DAAC without overwhelming them. Therefore, the ORNL DAAC has developed the Biogeochemical Information Ordering Management Environment (BIOME), a customized search and order system for the World Wide Web (WWW). BIOME is a public system located at http://www-eosdis.ornl.gov/BIOME/biome.html.

  18. BIOME: A scientific data archive search-and-order system using browser-aware, dynamic pages

    NASA Technical Reports Server (NTRS)

    Jennings, S. V.; Yow, T. G.; Ng, V. W.

    1997-01-01

    The Oak Ridge National Laboratory's (ORNL) Distributed Active Archive Center (DAAC) is a data archive and distribution center for the National Aeronautics and Space Administration's (NASA) Earth Observing System Data and Information System (EOSDIS). Both the Earth Observing System (EOS) and EOSDIS are components of NASA's contribution to the US Global Change Research Program through its Mission to Planet Earth Program. The ORNL DAAC provides access to data used in ecological and environmental research such as global change, global warming, and terrestrial ecology. Because of its large and diverse data holdings, the challenge for the ORNL DAAC is to help users find data of interest from the hundreds of thousands of files available at the DAAC without overwhelming them. Therefore, the ORNL DAAC has developed the Biogeochemical Information Ordering Management Environment (BIOME), a customized search and order system for the World Wide Web (WWW). BIOME is a public system located at http://www-eosdis.ornl.gov/BIOME/biome.html.

  19. GENESIS: GPS Environmental and Earth Science Information System

    NASA Technical Reports Server (NTRS)

    Hajj, George

    1999-01-01

    This presentation reviews the GPS ENvironmental and Earth Science Information System (GENESIS). The objectives of GENESIS are outlined: (1) data archiving, searching, and distribution for science data products derived from space-borne TurboRogue Space Receivers for GPS science and other ground-based GPS receivers; (2) data browsing using integrated visualization tools; (3) interactive web/Java-based data search and retrieval; (4) a data subscription service; (5) data migration from existing GPS archived data; (6) on-line help and documentation; and (7) participation in the WP-ESIP federation. The presentation reviews the products and services of GENESIS, and the technology behind the system.

  20. Land processes distributed active archive center product lifecycle plan

    USGS Publications Warehouse

    Daucsavage, John C.; Bennett, Stacie D.

    2014-01-01

    The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center and the National Aeronautics and Space Administration (NASA) Earth Science Data System Program worked together to establish, develop, and operate the Land Processes (LP) Distributed Active Archive Center (DAAC) to provide stewardship for NASA’s land processes science data. These data are critical science assets that serve the land processes science community with potential value beyond any immediate research use, and therefore need to be accounted for and properly managed throughout their lifecycle. A fundamental LP DAAC objective is to enable permanent preservation of these data and information products. The LP DAAC accomplishes this by bridging data producers and permanent archival resources while providing intermediate archive services for data and information products.

  1. Satellite and earth science data management activities at the U.S. geological survey's EROS data center

    USGS Publications Warehouse

    Carneggie, David M.; Metz, Gary G.; Draeger, William C.; Thompson, Ralph J.

    1991-01-01

    The U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center, the national archive for Landsat data, has 20 years of experience in acquiring, archiving, processing, and distributing Landsat and earth science data. The Center is expanding its satellite and earth science data management activities to support the U.S. Global Change Research Program and the National Aeronautics and Space Administration (NASA) Earth Observing System Program. The Center's current and future data management activities focus on land data and include: satellite and earth science data set acquisition, development and archiving; data set preservation, maintenance and conversion to more durable and accessible archive medium; development of an advanced Land Data Information System; development of enhanced data packaging and distribution mechanisms; and data processing, reprocessing, and product generation systems.

  2. JPL Physical Oceanography Distributed Active Archive Center (PO.DAAC) data availability, version 1-94

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea-surface height, surface-wind vector, sea-surface temperature, atmospheric liquid water, and integrated water vapor. The JPL PO.DAAC is an element of the Earth Observing System Data and Information System (EOSDIS) and is the United States distribution site for Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  3. To Preserve the Sense of Earth from Space. A Report of the Panel on the Information Policy Implications of Archiving Satellite Data, regarding the Archiving Requirements of the Proposed Transfer to the Private Sector of the U. S. Civil Space Remote-Sensing Satellite Systems.

    ERIC Educational Resources Information Center

    National Commission on Libraries and Information Science, Washington, DC.

    This report presents the results of a 3-month effort to assess the archiving requirements that should be imposed in the event of a transfer of the United States land remote-sensing satellite systems to the private sector. The emphasis is not on judging the desirability of the proposed transfer, but on recommending the requirements that should be…

  4. The Role of Computers in Archives.

    ERIC Educational Resources Information Center

    Cook, Michael

    1989-01-01

    Discusses developments in information technologies, their present state of application, and their general significance for the future of archives and records management systems. The likely impact of future technological developments is considered and the need for infrastructural standards, professional cooperation, and training is emphasized.…

  5. Superfund Public Information System (SPIS), June 1998 (on CD-ROM). Data file

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-06-01

    The Superfund Public Information System (SPIS) on CD-ROM contains Superfund data for the United States Environmental Protection Agency. The Superfund data is a collection of four databases, CERCLIS, Archive (NFRAP), RODS, and NPL Sites. Descriptions of these databases and CD contents are listed below. The FolioViews browse and retrieval engine is used as a graphical interface to the data. Users can access simple queries and can do complex searching on key words or fields. In addition, context sensitive help, a Superfund process overview, and an integrated data dictionary are available. RODS is the Records Of Decision System. RODS is used to track site clean-ups under the Superfund program to justify the type of treatment chosen at each site. RODS contains information on technology justification, site history, community participation, enforcement activities, site characteristics, scope and role of response action, and remedy. Explanation of Significant Differences (ESDs) are also available on the CD. CERCLIS is the Comprehensive Environmental Response, Compensation, and Liability Information System. It is the official repository for all Superfund site and incident data. It contains comprehensive information on hazardous waste sites, site inspections, preliminary assessments, and remedial status. The system is sponsored by the EPA's Office of Emergency and Remedial Response, Information Management Center. Archive (NFRAP) consists of hazardous waste sites that have no further remedial action planned; only basic identifying information is provided for archive sites. The sites found in the Archive database were originally in the CERCLIS database, but were removed beginning in the fall of 1995. NPL sites (available online) are fact sheets that describe the location and history of Superfund sites. Included are descriptions of the most recent activities and past actions at the sites that have contributed to the contamination. Population estimates, land usages, and nearby resources give background on the local setting surrounding a site.

  6. Bridging the gap: linking a legacy hospital information system with a filmless radiology picture archiving and communications system within a nonhomogeneous environment.

    PubMed

    Rubin, R K; Henri, C J; Cox, R D

    1999-05-01

    A health level 7 (HL7)-conformant data link to exchange information between the mainframe hospital information system (HIS) of our hospital and our home-grown picture archiving and communications system (PACS) is a result of a collaborative effort between the HIS department and the PACS development team. Based on the ability to link examination requisitions and image studies, applications have been generated to optimise workflow and to improve the reliability and distribution of radiology information. Now, images can be routed to individual radiologists and clinicians; worklists facilitate radiology reporting; applications exist to create, edit, and view reports and images via the internet; and automated quality control now limits the incidence of "lost" cases and errors in image routing. By following the HL7 standard to develop the gateway to the legacy system, the development of a radiology information system for booking, reading, reporting, and billing remains universal and does not preclude the option to integrate off-the-shelf commercial products.
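
    The abstract does not include implementation details; as a rough illustration of the kind of HL7-to-PACS mapping such a gateway performs, the following minimal sketch parses a pipe-delimited HL7 v2-style order message into the identifiers a PACS could use to link a requisition to an image study. The sample message and the field positions used are assumptions for illustration, not the hospital interface described above.

```python
# Minimal sketch: split an HL7 v2-style order message into segments and
# fields, then pull out the identifiers a PACS gateway might use to match an
# examination requisition to an image study. The sample message and the field
# positions are illustrative assumptions, not the actual HIS interface.

SAMPLE_ORM = (
    "MSH|^~\\&|HIS|HOSP|PACS|RAD|199905011200||ORM^O01|123456|P|2.3\r"
    "PID|1||0012345^^^HOSP||DOE^JANE\r"
    "OBR|1|REQ-7890||CT^CHEST CT||199905011215\r"
)

def parse_segments(message: str) -> dict:
    """Index HL7 segments by their three-letter segment ID."""
    segments = {}
    for raw in filter(None, message.split("\r")):
        fields = raw.split("|")
        segments[fields[0]] = fields
    return segments

def requisition_keys(message: str) -> dict:
    """Extract patient ID, requisition number, and procedure from the sample."""
    seg = parse_segments(message)
    return {
        "patient_id": seg["PID"][3].split("^")[0],  # first component of the ID field
        "accession": seg["OBR"][2],                 # order number in this sample layout
        "procedure": seg["OBR"][4].split("^")[1],   # human-readable procedure name
    }

if __name__ == "__main__":
    print(requisition_keys(SAMPLE_ORM))
```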

  7. An analysis of student privacy rights in the use of plagiarism detection systems.

    PubMed

    Brinkman, Bo

    2013-09-01

    Plagiarism detection services are a powerful tool to help encourage academic integrity. Adoption of these services has proven to be controversial due to ethical concerns about students' rights. Central to these concerns is the fact that most such systems make permanent archives of student work to be re-used in plagiarism detection. This computerization and automation of plagiarism detection is changing the relationships of trust and responsibility between students, educators, educational institutions, and private corporations. Educators must respect student privacy rights when implementing such systems. Student work is personal information, not the property of the educator or institution. The student has the right to be fully informed about how plagiarism detection works, and the fact that their work will be permanently archived as a result. Furthermore, plagiarism detection should not be used if the permanent archiving of a student's work may expose him or her to future harm.

  8. NOAA's Big Data Partnership at the National Centers for Environmental Information

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2015-12-01

    In April of 2015, the U.S. Department of Commerce announced NOAA's Big Data Partnership (BDP) with Amazon Web Services, Google Cloud Platform, IBM, Microsoft Corp., and the Open Cloud Consortium through Cooperative Research and Development Agreements. Recent progress on the activities with these Partners at the National Centers for Environmental Information (NCEI) will be presented. These activities include the transfer of over 350 TB of NOAA's archived data from NCEI's tape-based archive system to BDP cloud providers; new opportunities for data mining and investigation; application of NOAA's data maturity and stewardship concepts to the BDP; and integration of both archived and near-realtime data streams into a synchronized, distributed data system. Both lessons learned and future opportunities for the environmental data community will be presented.

  9. Archive data base and handling system for the Orbiter flying qualities experiment program

    NASA Technical Reports Server (NTRS)

    Myers, T. T.; Dimarco, R.; Magdaleno, R. E.; Aponso, B. L.

    1986-01-01

    The OFQ archives data base and handling system assembled as part of the Orbiter Flying Qualities (OFQ) research of the Orbiter Experiments Program (EOX) are described. The purpose of the OFQ archives is to preserve and document shuttle flight data relevant to vehicle dynamics, flight control, and flying qualities in a form that permits maximum use for qualified users. In their complete form, the OFQ archives contain descriptive text (general information about the flight, signal descriptions and units) as well as numerical time history data. Since the shuttle program is so complex, the official data base contains thousands of signals and very complex entries are required to obtain data. The OFQ archives are intended to provide flight phase oriented data subsets with relevant signals which are easily identified for flying qualities research.

  10. Service-Based Extensions to an OAIS Archive for Science Data Management

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

    With new data management mandates from major funding sources such as the National Institutes of Health and the National Science Foundation, the architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS), in 2002, released its first version of a Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is the interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.
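
    To make the service-oriented idea concrete, here is a minimal sketch, under stated assumptions, of an OAIS Ingest function exposed as a network service: a Submission Information Package (SIP) description is assembled locally and posted to an ingest endpoint. The endpoint URL, payload layout, and checksum policy are illustrative assumptions, not part of the OAIS standard or of the extensions proposed in the paper.

```python
# Minimal sketch of wrapping the OAIS "Ingest" function as a network service
# call. The endpoint, payload layout, and fixity policy are illustrative
# assumptions for this sketch only.

import hashlib
import json
from pathlib import Path

import requests  # third-party; pip install requests

INGEST_ENDPOINT = "https://archive.example.edu/oais/ingest"  # hypothetical service

def build_sip(data_files, descriptive_metadata):
    """Assemble a SIP description: content files plus fixity and metadata."""
    return {
        "metadata": descriptive_metadata,
        "files": [
            {
                "name": Path(f).name,
                "size": Path(f).stat().st_size,
                "sha256": hashlib.sha256(Path(f).read_bytes()).hexdigest(),
            }
            for f in data_files
        ],
    }

def submit_sip(sip):
    """POST the SIP description to the (hypothetical) ingest service."""
    response = requests.post(INGEST_ENDPOINT, json=sip, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Create a placeholder file so the sketch runs end to end locally.
    Path("example_granule.txt").write_text("placeholder science data\n")
    sip = build_sip(["example_granule.txt"],
                    {"title": "Example granule", "creator": "Example Research Group"})
    print(json.dumps(sip, indent=2))  # submit_sip(sip) would transmit it
```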

  11. An archiving system for Planetary Mapping Data - Availability of derived information and knowledge in Planetary Science!

    NASA Astrophysics Data System (ADS)

    Nass, A.

    2017-12-01

    Since the late 1950s, a large number of planetary missions have set out to explore our solar system. The data resulting from this robotic exploration and remote sensing vary in data type, resolution, and target. After preprocessing and referencing, the released data are made available to the community on different portals and archiving systems, e.g., PDS or PSA. One major use of these data is mapping, i.e., the extraction and filtering of information by combining and visualizing different kinds of base data. Mapping itself is conducted either for mission planning (e.g., identification of landing sites) or for fundamental research (e.g., reconstruction of surfaces). The mapping results for mission planning are managed directly within the mission teams. The derived data for fundamental research - also describable as maps, diagrams, or analysis results - are mainly project-based and exclusively available in scientific papers. Within the last year, first steps have been taken to ensure a sustainable use of these derived data by finding an archiving system comparable to the data portals, i.e., reusable, well-documented, and sustainable. For the implementation, three tasks are essential. Two of them have been addressed in the past: (1) comparability and interoperability have been made possible by standardized recommendations for the visual, textual, and structural description of mapping data; (2) interoperability between users, information systems, and graphics systems has been made possible by templates and guidelines for digital GIS-based mapping. These two steps have been applied, for example, within recent mapping projects for the Dawn mission. The third task hasn't been implemented thus far: establishing an easily discoverable and accessible platform that holds already acquired information and published mapping results for future investigations or mapping projects. An archive like this would significantly support the scientific community through a steady growth of knowledge and understanding, building on recent discussions within information science, information management, and data warehousing. This contribution describes the necessary map archive components that have to be considered for an efficient establishment and user-oriented accessibility. It will be described how already existing developments could be used, and which components still have to be developed.

  12. Status of worldwide Landsat archive

    USGS Publications Warehouse

    Warriner, Howard W.

    1987-01-01

    In cooperation with the international Landsat community, and through the Landsat Technical Working Group (LTWG), NOAA is assembling information about the status of the worldwide Landsat archive. During LTWG 9, member nations agreed to participate in a survey of international Landsat data holdings and of their archive experiences with Landsat data. The goal of the effort was twofold: first, to document the Landsat archive to date, and second, to ensure that individual nations' experience with long-term Landsat archival problems was available to others. The survey requested details such as the amount of data held; the format of the archive holdings by spacecraft/sensor and acquisition years; the estimated costs to accumulate, process, and replace the data (if necessary); the storage space required; and any member nation's plans for ensuring continuing quality. As a group, the LTWG nations are concerned about the characteristics and reliability of long-term magnetic media storage. Each nation's experience with retrieval of older data is solicited in the survey. This information will allow nations to anticipate and plan for required changes to their archival holdings. Also solicited were reports of any currently planned upgrades to a nation's archival system and all results of attempts to reduce archive holdings, including methodology, current status, and the access rates and product support anticipated for future archival usage.

  13. System and Method for Providing a Climate Data Persistence Service

    NASA Technical Reports Server (NTRS)

    Schnase, John L. (Inventor); Ripley, III, William David (Inventor); Duffy, Daniel Q. (Inventor); Thompson, John H. (Inventor); Strong, Savannah L. (Inventor); McInerney, Mark (Inventor); Sinno, Scott (Inventor); Tamkin, Glenn S. (Inventor); Nadeau, Denis (Inventor)

    2018-01-01

    A system, method and computer-readable storage devices for providing a climate data persistence service. A system configured to provide the service can include a climate data server that performs data and metadata storage and management functions for climate data objects, a compute-storage platform that provides the resources needed to support a climate data server, provisioning software that allows climate data server instances to be deployed as virtual climate data servers in a cloud computing environment, and a service interface, wherein persistence service capabilities are invoked by software applications running on a client device. The climate data objects can be in various formats, such as International Organization for Standardization (ISO) Open Archival Information System (OAIS) Reference Model Submission Information Packages, Archival Information Packages, and Dissemination Information Packages. The climate data server can enable scalable, federated storage, management, discovery, and access, and can be tailored for particular use cases.
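
    As a hedged illustration of the three OAIS package types named in the abstract, the sketch below models SIP, AIP, and DIP as plain data structures with simple ingest and dissemination transformations. The field names, the example file name, and the transformation steps are assumptions for illustration, not the patented system's design.

```python
# Minimal sketch of the three OAIS package types (SIP, AIP, DIP) as plain
# data structures. Field names and transformation steps are illustrative
# assumptions only.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List

@dataclass
class SIP:                        # what a data producer submits
    content_files: List[str]
    descriptive_metadata: Dict[str, str]

@dataclass
class AIP:                        # what the archive preserves
    content_files: List[str]
    descriptive_metadata: Dict[str, str]
    preservation_metadata: Dict[str, str] = field(default_factory=dict)

@dataclass
class DIP:                        # what a consumer receives
    content_files: List[str]
    descriptive_metadata: Dict[str, str]

def ingest(sip: SIP) -> AIP:
    """Turn a submission into an archival package by adding provenance info."""
    return AIP(
        content_files=list(sip.content_files),
        descriptive_metadata=dict(sip.descriptive_metadata),
        preservation_metadata={"ingest_time": datetime.now(timezone.utc).isoformat()},
    )

def disseminate(aip: AIP, requested: List[str]) -> DIP:
    """Package only the requested files for delivery to a user."""
    return DIP(
        content_files=[f for f in aip.content_files if f in requested],
        descriptive_metadata=dict(aip.descriptive_metadata),
    )

aip = ingest(SIP(["example_climate_granule.nc"], {"collection": "example collection"}))
print(disseminate(aip, ["example_climate_granule.nc"]))
```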

  14. Visual information mining in remote sensing image archives

    NASA Astrophysics Data System (ADS)

    Pelizzari, Andrea; Descargues, Vincent; Datcu, Mihai P.

    2002-01-01

    The present article focuses on the development of interactive exploratory tools for visually mining the image content in large remote sensing archives. Two aspects are treated: the iconic visualization of the global information in the archive and the progressive visualization of the image details. The proposed methods are integrated in the Image Information Mining (I2M) system. The images and image structure in the I2M system are indexed based on a probabilistic approach. The resulting links are managed by a relational database. Both the intrinsic complexity of the observed images and the diversity of user requests result in a great number of associations in the database. Thus new tools have been designed to visualize, in iconic representation, the relationships created during a query or information-mining operation: visualization of the query results positioned on the geographical map, a quick-look gallery, visualization of a measure of goodness of the query, and visualization of the image space for statistical evaluation purposes. Additionally, the I2M system is enhanced with progressive detail visualization in order to allow better access for operator inspection. I2M is a three-tier Java architecture and is optimized for the Internet.
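
    As a hedged illustration of the relational indexing the abstract describes, the sketch below stores image-to-label associations with a probability score in a small relational database and runs the kind of query that could back a quick-look gallery. The table layout and sample rows are assumptions, not the actual I2M schema.

```python
# Minimal sketch: image-to-label links with a probability ("measure of
# goodness") kept in a relational database and queried for a quick-look
# gallery. Schema and sample rows are illustrative assumptions.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE links (image_id TEXT, label TEXT, probability REAL)")
conn.executemany(
    "INSERT INTO links VALUES (?, ?, ?)",
    [("scene_001", "water", 0.92), ("scene_001", "urban", 0.11),
     ("scene_002", "water", 0.35), ("scene_003", "water", 0.88)],
)

# Query behind a hypothetical "show me likely water scenes" gallery:
rows = conn.execute(
    "SELECT image_id, probability FROM links "
    "WHERE label = ? AND probability >= ? ORDER BY probability DESC",
    ("water", 0.5),
).fetchall()
print(rows)  # [('scene_001', 0.92), ('scene_003', 0.88)]
```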

  15. Image dissemination and archiving.

    PubMed

    Robertson, Ian

    2007-08-01

    Images generated as part of the sonographic examination are an integral part of the medical record and must be retained according to local regulations. The standard medical image format, known as DICOM (Digital Imaging and Communications in Medicine), makes it possible for images from many different imaging modalities, including ultrasound, to be distributed via a standard internet network to distant viewing workstations and a central archive in an almost seamless fashion. The DICOM standard is a truly universal standard for the dissemination of medical images. When purchasing an ultrasound unit, the consumer should research the unit's capacity to generate images in a DICOM format, especially if one wishes interconnectivity with viewing workstations and an image archive that stores other medical images. PACS, an acronym for Picture Archiving and Communication System, refers to the infrastructure that links modalities, workstations, the image archive, and the medical record information system into an integrated system, allowing for efficient electronic distribution and storage of medical images and access to medical record data.
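
    As a small illustration of working with DICOM headers programmatically, the sketch below reads a file's metadata with the pydicom library before it would be routed to a PACS archive. The file path is a placeholder; the attributes shown are standard DICOM data elements.

```python
# Minimal sketch: inspect a DICOM file's header before routing it to a PACS
# archive, using the pydicom library (pip install pydicom).

from pydicom import dcmread

def describe(path):
    ds = dcmread(path, stop_before_pixels=True)  # read header only, skip pixel data
    return {
        "patient_id": ds.get("PatientID", "<missing>"),
        "modality": ds.get("Modality", "<missing>"),
        "study_uid": ds.get("StudyInstanceUID", "<missing>"),
        "sop_class": ds.get("SOPClassUID", "<missing>"),
    }

# Example (requires an actual DICOM file on disk; the name is a placeholder):
# print(describe("US_abdomen_001.dcm"))
```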

  16. (abstract) Satellite Physical Oceanography Data Available From an EOSDIS Archive

    NASA Technical Reports Server (NTRS)

    Digby, Susan A.; Collins, Donald J.

    1996-01-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) at the Jet Propulsion Laboratory archives and distributes data as part of the Earth Observing System Data and Information System (EOSDIS). Products available from JPL are largely satellite derived and include sea-surface height, surface-wind speed and vectors, integrated water vapor, atmospheric liquid water, sea-surface temperature, heat flux, and in-situ data as it pertains to satellite data. Much of the data is global and spans fourteen years. There is email access, a WWW site, product catalogs, and FTP capabilities. Data is free of charge.

  17. The Crustal Dynamics Data Information System: A Resource to Support Scientific Analysis Using Space Geodesy

    NASA Technical Reports Server (NTRS)

    Noll, Carey E.

    2010-01-01

    Since 1982, the Crustal Dynamics Data Information System (CDDIS) has supported the archive and distribution of geodetic data products acquired by the National Aeronautics and Space Administration (NASA) as well as national and international programs. The CDDIS provides easy, timely, and reliable access to a variety of data sets, products, and information about these data. These measurements, obtained from a global network of nearly 650 instruments at more than 400 distinct sites, include DORIS (Doppler Orbitography and Radiopositioning Integrated by Satellite), GNSS (Global Navigation Satellite System), SLR and LLR (Satellite and Lunar Laser Ranging), and VLBI (Very Long Baseline Interferometry). The CDDIS data system and its archive have become increasingly important to many national and international science communities, particularly several of the operational services within the International Association of Geodesy (IAG) and its observing system, the Global Geodetic Observing System (GGOS), including the International DORIS Service (IDS), the International GNSS Service (IGS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International Earth Rotation and Reference Systems Service (IERS). Investigations resulting from the data and products available through the CDDIS support research in many aspects of Earth system science and global change. Each month, the CDDIS archives more than one million data and derived product files totaling over 90 Gbytes in volume. In turn, the global user community downloads nearly 1.2 TBytes (over 10.5 million files) of data and products from the CDDIS each month. The requirements of analysts have evolved since the start of the CDDIS; the specialized nature of the system accommodates the enhancements required to support diverse data sets and user needs. This paper discusses the CDDIS, including background information about the system and its user communities, archive contents, available metadata, and future plans.

  18. 78 FR 65011 - Privacy Act of 1974: New System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ... coverage on the individual and small group health insurance markets on the Affordable Insurance Exchanges... disclosed if a subpoena has been signed by a judge. e. For the National Archives and Records Administration or the General Services Administration--To disclose information to the National Archives and Records...

  19. Meta Data Mining in Earth Remote Sensing Data Archives

    NASA Astrophysics Data System (ADS)

    Davis, B.; Steinwand, D.

    2014-12-01

    Modern search and discovery tools for satellite-based remote sensing data are often catalog based and rely on query systems which use scene- (or granule-) based meta data for those queries. While these traditional catalog systems are often robust, very little has been done in the way of meta data mining to aid in the search and discovery process. The recently coined term "Big Data" can be applied to the remote sensing world's efforts to derive information from the vast holdings of satellite-based land remote sensing data. Large catalog-based search and discovery systems such as the United States Geological Survey's Earth Explorer system and the NASA Earth Observing System Data and Information System's Reverb-ECHO system provide comprehensive access to these data holdings, but do little to expose the underlying scene-based meta data. These catalog-based systems are extremely flexible, but are manually intensive and often require a high level of user expertise. Exposing scene-based meta data to external, web-based services can enable machine-driven queries to aid in the search and discovery process. Furthermore, services which expose additional scene-based content data (such as product quality information) are now available and can provide a "deeper look" into remote sensing data archives too large for efficient manual search methods. This presentation shows examples of the mining of Landsat and ASTER scene-based meta data, and an experimental service using OPeNDAP to extract information from the quality bands of multiple granules in the MODIS archive.
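
    As a hedged sketch of the machine-driven metadata query the abstract argues for, the example below asks a hypothetical web service for scenes over an area and date range and then filters on scene-level quality metadata such as cloud cover. The endpoint, parameter names, and response fields are assumptions for illustration, not any agency's actual API.

```python
# Minimal sketch of a machine-driven scene-metadata query: search a
# (hypothetical) web service by bounding box and date range, then filter the
# returned scenes on a quality attribute. Endpoint and fields are assumptions.

import requests  # third-party; pip install requests

SEARCH_URL = "https://metadata.example.gov/scenes/search"  # hypothetical service

def low_cloud_scenes(bbox, start, end, max_cloud=20.0):
    """Return scene IDs within bbox/date range whose cloud cover is acceptable."""
    params = {"bbox": ",".join(map(str, bbox)), "start": start, "end": end}
    scenes = requests.get(SEARCH_URL, params=params, timeout=30).json()["scenes"]
    return [s["scene_id"] for s in scenes if s.get("cloud_cover", 100.0) <= max_cloud]

# Example call (would require the hypothetical service to exist):
# ids = low_cloud_scenes((-103.0, 43.0, -102.0, 44.0), "2014-06-01", "2014-06-30")
```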

  20. 77 FR 8901 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-15

    ... request information from or copies of Selective Service System (SSS) records. The public is invited to... Archives and Records Administration (NARA) administers the Selective Service System (SSS) records. The SSS..., 1960. When registrants or other authorized individuals request information from or copies of SSS...

  1. SysML model of exoplanet archive functionality and activities

    NASA Astrophysics Data System (ADS)

    Ramirez, Solange

    2016-08-01

    The NASA Exoplanet Archive is an online service that serves data and information on exoplanets and their host stars to support astronomical research related to the search for and characterization of extra-solar planetary systems. In order to provide the most up-to-date data sets to its users, the exoplanet archive performs weekly updates that include additions to the database and updates to the services as needed. These weekly updates are complex due to interfaces within the archive. I will be presenting a SysML model that helps us perform these update activities on a weekly basis.

  2. Supporting users through integrated retrieval, processing, and distribution systems at the Land Processes Distributed Active Archive Center

    USGS Publications Warehouse

    Kalvelage, Thomas A.; Willems, Jennifer

    2005-01-01

    The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), which is a series of polar-orbiting and low inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information Systems (EOSDIS) was designed to acquire, archive, manage and distribute Earth observation data to the broadest possible user community. The LP DAAC is one of four DAACs that utilize the EOSDIS Core System (ECS) to manage and archive their data. Since the ECS was originally designed, significant changes have taken place in technology, user expectations, and user requirements. Therefore the LP DAAC has implemented additional systems to meet the evolving needs of scientific users, tailored to an integrated working environment. These systems provide a wide variety of services to improve data access and to enhance data usability through subsampling, reformatting, and reprojection. These systems also support the wide breadth of products that are handled by the LP DAAC. The LP DAAC is the primary archive for the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data; it is the only facility in the United States that archives, processes, and distributes data from the Advanced Spaceborne Thermal Emission/Reflection Radiometer (ASTER) on NASA's Terra spacecraft; and it is responsible for the archive and distribution of “land products” generated from data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra and Aqua satellites.

  3. Strategic Plan for Information Systems and Technology, Fiscal Years 1994-1998.

    ERIC Educational Resources Information Center

    National Archives and Records Administration, Washington, DC.

    The information systems and technology management program of the National Archives and Records Administration (NARA) establishes broad policy guidance and technical standards for information management to ensure that appropriate resource sharing can occur, while providing cost-effective support for mission requirements of program offices. The NARA…

  4. Image acquisition unit for the Mayo/IBM PACS project

    NASA Astrophysics Data System (ADS)

    Reardon, Frank J.; Salutz, James R.

    1991-07-01

    The Mayo Clinic and IBM Rochester, Minnesota, have jointly developed a picture archiving, distribution and viewing system for use with Mayo's CT and MRI imaging modalities. Images are retrieved from the modalities and sent over the Mayo city-wide token ring network to optical storage subsystems for archiving, and to server subsystems for viewing on image review stations. Images may also be retrieved from archive and transmitted back to the modalities. The subsystems that interface to the modalities and communicate to the other components of the system are termed Image Acquisition Units (IAUs). The IAUs are IBM Personal System/2 (PS/2) computers with specially developed software. They operate independently in a network of cooperative subsystems and communicate with the modalities, archive subsystems, image review server subsystems, and a central subsystem that maintains information about the content and location of images. This paper provides a detailed description of the function and design of the Image Acquisition Units.

  5. Obstacles to the Access, Use and Transfer of Information from Archives: A RAMP Study.

    ERIC Educational Resources Information Center

    Duchein, Michel

    This publication reviews means of access to information contained in the public archives (current administrative documents and archival records) and private archives (manuscripts of personal or family origin) of many countries and makes recommendations for improving access to archival information. Sections describe: (1) the origin and development…

  6. Aircraft scanner data availability via the version 0 Information Management System

    NASA Technical Reports Server (NTRS)

    Mah, G. R.

    1995-01-01

    As part of the Earth Observing System Data and Information System (EOSDIS) development, NASA and other government agencies have developed an operational prototype of the Information Management System (IMS). The IMS provides access to the data archived at the Distributed Active Archive Centers (DAACs) and allows users to search through metadata describing the (image) data. Criteria based on sensor name or type, date and time, and geographic location are used to search the archive. Graphical representations of coverage and browse images are available to further refine a user's selection. Previously, the EROS Data Center (EDC) DAAC had identified the Advanced Solid-state Array Spectrometer (ASAS), Airborne Visible and Infrared Imaging Spectrometer (AVIRIS), NS-001, and Thermal Infrared Multispectral Scanner (TIMS) as precursor data sets similar to those the DAAC will handle in the Earth Observing System era. Currently, the EDC DAAC staff, in cooperation with NASA, has transcribed TIMS, NS-001, and Thematic Mapper Simulation (TMS) data from Ames Research Center and also TIMS data from Stennis Space Center. During the transcription process, the IMS metadata and browse images were created to populate the inventory at the EDC DAAC. These data sets are now available in the IMS and may be requested from any of the DAACs via the IMS.

  7. Education Policy Analysis Archives, 2001: Numbers 46-51.

    ERIC Educational Resources Information Center

    Glass, Gene V., Ed.

    2001-01-01

    This document consists of articles 46 through 51 published in the electronic journal Education Policy Analysis Archives for the year 2001: (46) Second Year Analysis of a Hybrid Schedule High School (James B. Shreiber, William R. Veal, David J. Flinders, and Sherry Churchill); (47) Knowledge Management for Educational Information Systems: What Is…

  8. Archival storage solutions for PACS

    NASA Astrophysics Data System (ADS)

    Chunn, Timothy

    1997-05-01

    While there are many inhibitors to the widespread diffusion of PACS systems, one has been the lack of robust, cost-effective digital archive storage solutions. Moreover, an automated Nearline solution is key to a central, sharable data repository, enabling many applications such as PACS, telemedicine and teleradiology, and information warehousing and data mining for research such as patient outcome analysis. Selecting the right solution depends on a number of factors: capacity requirements, write and retrieval performance requirements, scalability in capacity and performance, configuration architecture and flexibility, subsystem availability and reliability, security requirements, system cost, achievable benefits and cost savings, investment protection, strategic fit, and more. This paper addresses many of these issues. It compares and positions optical disk and magnetic tape technologies, which are the predominant archive mediums today. Price and performance comparisons will be made at different archive capacities, plus the effect of file size on storage system throughput will be analyzed. The concept of automated migration of images from high-performance, high-cost storage devices to high-capacity, low-cost storage devices will be introduced as a viable way to minimize overall storage costs for an archive. The concept of access density will also be introduced and applied to the selection of the most cost-effective archive solution.
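
    The access-density idea can be made concrete with a short sketch: retrieval activity per unit of stored capacity decides whether data belongs on the fast tier or on tape. The threshold and the sample figures below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of "access density": retrievals per unit of stored capacity,
# used to decide whether a body of images belongs on fast storage or in a
# tape library. Threshold and sample figures are illustrative assumptions.

def access_density(retrievals_per_day: float, stored_gb: float) -> float:
    """Retrievals per GB per day for a body of archived images."""
    return retrievals_per_day / stored_gb

def placement(retrievals_per_day: float, stored_gb: float,
              threshold: float = 0.05) -> str:
    """Route frequently touched data to the fast tier, the rest to tape."""
    if access_density(retrievals_per_day, stored_gb) >= threshold:
        return "fast tier"
    return "tape library"

# Recent exams are consulted often; multi-year priors almost never.
print(placement(retrievals_per_day=400, stored_gb=2_000))   # fast tier
print(placement(retrievals_per_day=50, stored_gb=80_000))   # tape library
```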

  9. Rendering an archive in three dimensions

    NASA Astrophysics Data System (ADS)

    Leiman, David A.; Twose, Claire; Lee, Teresa Y. H.; Fletcher, Alex; Yoo, Terry S.

    2003-05-01

    We examine the requirements for a publicly accessible, online collection of three-dimensional biomedical image data, including those yielded by radiological processes such as MRI, ultrasound, and others. Intended as a repository and distribution mechanism for such medical data, the National Online Volumetric Archive (NOVA) was created as a case study aimed at identifying the multiple issues involved in realizing a large-scale digital archive. In the paper we discuss such factors as the current legal and health information privacy policies affecting the collection of human medical images, the retrieval and management of information, and technical implementation. This project culminated in the launching of a website that includes downloadable datasets and a prototype data submission system.

  10. Lessons Learned while Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems

    NASA Astrophysics Data System (ADS)

    Pilone, D.

    2016-12-01

    As new, high data rate missions begin collecting data, NASA's Earth Observing System Data and Information System (EOSDIS) archive is projected to grow roughly 20x to over 300 PBs by 2025. To prepare for the dramatic increase in data and enable broad scientific inquiry into larger time series and datasets, NASA has been exploring the impact of applying cloud technologies throughout EOSDIS. In this talk we will provide an overview of NASA's prototyping and lessons learned in applying cloud architectures to: highly scalable and extensible ingest and archive of EOSDIS data; going "all-in" on cloud-based application architectures, including "serverless" data processing pipelines, and evaluating approaches to vendor lock-in; rethinking data distribution and approaches to analysis in a cloud environment; and incorporating and enforcing security controls while minimizing the barrier for research efforts to deploy to NASA-compliant, operational environments. NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. EOSDIS is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services from EOS instrument data collection to science data processing to full access to EOS and other earth science data. On a daily basis, the EOSDIS ingests, processes, archives and distributes over 3 terabytes of data from NASA's Earth Science missions, representing over 6000 data products spanning various science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of high-impact NASA data spanning the multi-petabyte-scale archive of Earth science data products.

  11. 47 CFR 80.1133 - Transmission of safety communications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICES STATIONS IN THE MARITIME SERVICES Global Maritime Distress and Safety System (GMDSS) Operating... (Reference Information Center) or at the National Archives and Records Administration (NARA). For information...

  12. Digital information management: a progress report on the National Digital Mammography Archive

    NASA Astrophysics Data System (ADS)

    Beckerman, Barbara G.; Schnall, Mitchell D.

    2002-05-01

    Digital mammography creates very large images, which require new approaches to storage, retrieval, management, and security. The National Digital Mammography Archive (NDMA) project, funded by the National Library of Medicine (NLM), is developing a limited testbed that demonstrates the feasibility of a national breast imaging archive, with access to prior exams; patient information; computer aids for image processing, teaching, and testing tools; and security components to ensure confidentiality of patient information. There will be significant benefits to patients and clinicians in terms of accessible data with which to make a diagnosis and to researchers performing studies on breast cancer. Mammography was chosen for the project, because standards were already available for digital images, report formats, and structures. New standards have been created for communications protocols between devices, front-end portal and archive. NDMA is a distributed computing concept that provides for sharing and access across corporate entities. Privacy, auditing, and patient consent are all integrated into the system. Five sites, Universities of Pennsylvania, Chicago, North Carolina and Toronto, and BWXT Y12, are connected through high-speed networks to demonstrate functionality. We will review progress, including technical challenges, innovative research and development activities, standards and protocols being implemented, and potential benefits to healthcare systems.

  13. Optimal extraction of quasar Lyman limit absorption systems from the IUE archive

    NASA Technical Reports Server (NTRS)

    Tytler, David

    1992-01-01

    The IUE archive contains a wealth of information on Lyman limit absorption systems (LLS) in quasar spectra. QSO spectra from the IUE data base were optimally extracted, coadded, and analyzed to yield a homogeneous sample of LLS at low redshifts. This sample comprises 36 LLS, twice the number in previously analyzed low-z samples. These systems are ideal for determining the origin, redshift evolution, ionization, velocity dispersions, and metal abundances of absorption systems. Two of them are also excellent targets for measuring the primordial deuterium-to-hydrogen ratio.
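
    As a generic illustration of coaddition (not the paper's actual IUE extraction pipeline), the sketch below combines repeat spectra of the same object by inverse-variance weighting, the standard way to raise the signal-to-noise ratio of multiple low-S/N exposures.

```python
# Textbook sketch of coadding repeat spectra by inverse-variance weighting.
# The synthetic input is an assumption; this is not the paper's pipeline.

import numpy as np

def coadd(fluxes: np.ndarray, variances: np.ndarray):
    """Combine N spectra (N x n_pix) sampled on a common wavelength grid.

    Returns the weighted-mean flux and its variance per pixel."""
    weights = 1.0 / variances
    flux = np.sum(weights * fluxes, axis=0) / np.sum(weights, axis=0)
    var = 1.0 / np.sum(weights, axis=0)
    return flux, var

# Three noisy realizations of the same (flat) spectrum:
rng = np.random.default_rng(0)
truth = np.ones(100)
sigma = np.array([0.5, 0.8, 1.2])[:, None]          # per-exposure noise levels
spectra = truth + rng.normal(0.0, sigma, (3, 100))
flux, var = coadd(spectra, np.broadcast_to(sigma**2, (3, 100)))
print(f"coadded S/N per pixel ~ {1.0 / np.sqrt(var.mean()):.1f}")
```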

  14. Atmospheric Composition Data and Information Services Center (ACDISC)

    NASA Technical Reports Server (NTRS)

    Kempler, S.

    2005-01-01

    NASA's GSFC Earth Sciences (GES) Data and Information Services Center (DISC) manages the archive, distribution, and data access for atmospheric composition data from Aura's OMI, MLS, and, hopefully one day, HIRDLS instruments, as well as heritage datasets from TOMS, UARS, MODIS, and AIRS. These data are currently archived in the GES Distributed Active Archive Center (DAAC). The GES DISC has begun the development of a community-driven data management system whose sole purpose is to manage and provide value-added services for NASA's Atmospheric Composition (AC) data. This system, called the Atmospheric Composition Data and Information Services Center (ACDISC), will provide access to all AC datasets from the above-mentioned instruments, as well as AC datasets residing at remote archive sites (e.g., the LaRC DAAC). The goals of the ACDISC are to: 1) provide a data center for atmospheric scientists, guided by atmospheric scientists; 2) be absolutely responsive to the data and data service needs of the Atmospheric Composition (AC) community; 3) provide services (i.e., expertise) that will facilitate effortless access to and usage of AC data; 4) collaborate with AC scientists to facilitate the use of data from multiple sensors for long-term atmospheric research. The ACDISC is an AC-specific, user-driven, multi-sensor, on-line, easy-access archive and distribution system employing data analysis and visualization, data mining, and other user-requested techniques that facilitate science data usage. The purpose of this presentation is to describe the evolution path the GES DISC is taking in order to better serve AC data, and also to receive continued community feedback and further foster collaboration with AC data users and providers.

  15. Selection and implementation of a distributed phased archive for a multivendor incremental approach to PACS

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wandtke, John; Robinson, Arvin E.

    1999-07-01

    The selection criteria for the archive were based on the objectives of the Medical Information, Communication and Archive System (MICAS), a multi-vendor incremental approach to PACS. These objectives include interoperability between all components; seamless integration of the Radiology Information System (RIS) with MICAS and, eventually, other hospital databases; demonstration of DICOM compliance by all components prior to acceptance; and automated workflow that can be programmed to meet changes in the healthcare environment. The long-term multi-modality archive is being implemented in three or more phases, with the first phase designed to provide a 12- to 18-month storage solution. This decision was made because the cost per GB of storage is rapidly decreasing and the speed at which data can be retrieved is increasing with time. The open solution selected allows incorporation of leading-edge, 'best of breed' hardware and software and provides maximum flexibility of workflow both within and outside of radiology. The selected solution is media independent, supports multiple jukeboxes, provides expandable storage capacity, and will provide redundancy and fault tolerance at minimal cost. Some of the required attributes of the archive include a scalable archive strategy, a virtual image database with global query, and an object-oriented database. The selection process took approximately 10 months, with Cemax-Icon being the vendor selected. Prior to signing a purchase order, Cemax-Icon performed a site survey, agreed upon the acceptance test protocol, and provided a written guarantee of connectivity between their archive and the imaging modalities and other MICAS components.

  16. Managing an Archive of Images

    NASA Technical Reports Server (NTRS)

    Andres, Vince; Walter, David; Hallal, Charles; Jones, Helene; Callac, Chris

    2004-01-01

    The SSC Multimedia Archive is an automated electronic system to manage images, acquired both by film and digital cameras, for the Public Affairs Office (PAO) at Stennis Space Center (SSC). Previously, the image archive was based on film photography and utilized a manual system that, by today's standards, had become inefficient and expensive. Now, the SSC Multimedia Archive, based on a server at SSC, contains both catalogs and images for pictures taken both digitally and with a traditional, film-based camera, along with metadata about each image. After a "shoot," a photographer downloads the images into the database. Members of the PAO can use a Web-based application to search, view and retrieve images, approve images for publication, and view and edit metadata associated with the images. Approved images are archived and cross-referenced with appropriate descriptions and information. Security is provided by allowing administrators to explicitly grant personnel access privileges only to the components of the system that they need (e.g., only photographers may upload images, and only PAO-designated employees may approve images).

  17. Satellite Remote Sensing is Key to Water Cycle Integrator

    NASA Astrophysics Data System (ADS)

    Koike, T.

    2016-12-01

    To promote effective multi-sectoral, interdisciplinary collaboration based on coordinated and integrated efforts, the Global Earth Observation System of Systems (GEOSS) is now developing a "GEOSS Water Cycle Integrator (WCI)", which integrates "Earth observations", "modeling", "data and information", "management systems" and "education systems". GEOSS/WCI sets up "work benches" by which partners can share data, information and applications in an interoperable way, exchange knowledge and experiences, deepen mutual understanding, and work together effectively to ultimately respond to issues of both mitigation and adaptation. (A work bench is a virtual geographical or phenomenological space where experts and managers collaborate to use information to address a problem within that space.) GEOSS/WCI enhances the coordination of efforts to strengthen individual, institutional and infrastructure capacities, especially for effective interdisciplinary coordination and integration. GEOSS/WCI archives various satellite data to provide hydrological information such as cloud, rainfall, soil moisture, and land-surface snow. These satellite products have been validated using in-situ land observation data. Water cycle models can be developed by coupling in-situ and satellite data. River flows and other hydrological parameters can be simulated and validated against in-situ data. Model outputs from weather-prediction, seasonal-prediction, and climate-prediction models are archived. Some of these model outputs are archived on an online basis, but others, e.g., climate-prediction outputs, are archived offline. After the models are evaluated and biases corrected, the outputs can be used as inputs to the hydrological models for predicting the hydrological parameters. Additionally, we have already developed a data-assimilation system by combining satellite data and the models. This system can improve our capability to predict hydrological phenomena. The WCI can provide better predictions of the hydrological parameters for integrated water resources management (IWRM) and also assess the impact of climate change and calculate adaptation needs.

  18. Earth Science Data Archive and Access at the NASA/Goddard Space Flight Center Distributed Active Archive Center (DAAC)

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory

    1999-01-01

    The Goddard Distributed Active Archive Center (DAAC), as an integral part of the Earth Observing System Data and Information System (EOSDIS), is the official source of data for several important earth remote sensing missions. These include the Sea-viewing Wide-Field-of-view Sensor (SeaWiFS) launched in August 1997, the Tropical Rainfall Measuring Mission (TRMM) launched in November 1997, and the Moderate Resolution Imaging Spectroradiometer (MODIS) scheduled for launch in mid 1999 as part of the EOS AM-1 instrumentation package. The data generated from these missions supports a host of users in the hydrological, land biosphere and oceanographic research and applications communities. The volume and nature of the data present unique challenges to an Earth science data archive and distribution system such as the DAAC. The DAAC system receives, archives and distributes a large number of standard data products on a daily basis, including data files that have been reprocessed with updated calibration data or improved analytical algorithms. A World Wide Web interface is provided allowing interactive data selection and automatic data subscriptions as distribution options. The DAAC also creates customized and value-added data products, which allow additional user flexibility and reduced data volume. Another significant part of our overall mission is to provide ancillary data support services and archive support for worldwide field campaigns designed to validate the results from the various satellite-derived measurements. In addition to direct data services, accompanying documentation, WWW links to related resources, support for EOSDIS data formats, and informed response to inquiries are routinely provided to users. The current GDAAC WWW search and order system is being restructured to provide users with a simplified, hierarchical access to data. Data Browsers have been developed for several data sets to aid users in ordering data. These Browsers allow users to specify spatial, temporal, and other parameter criteria in searching for and previewing data.

  19. Imaged Document Optical Correlation and Conversion System (IDOCCS)

    NASA Astrophysics Data System (ADS)

    Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.

    1999-03-01

    Today, the paper document is fast becoming a thing of the past. With the rapid development of fast, inexpensive computing and storage devices, many government and private organizations are archiving their documents in electronic form (e.g., personnel records, medical records, patents, etc.). In addition, many organizations are converting their paper archives to electronic images, which are stored in a computer database. Because of this, there is a need to efficiently organize this data into comprehensive and accessible information resources. The Imaged Document Optical Correlation and Conversion System (IDOCCS) provides a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval capability of document images. The IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives and can even determine the types of languages contained within a document. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited, e.g., imaged documents containing an agency's seal or logo, or documents with a particular individual's signature block, can be singled out. With this dual capability, IDOCCS outperforms systems that rely on optical character recognition as a basis for indexing and storing only the textual content of documents for later retrieval.

  20. GLAS Long-Term Archive: Preservation and Stewardship for a Vital Earth Observing Mission

    NASA Astrophysics Data System (ADS)

    Fowler, D. K.; Moses, J. F.; Zwally, J.; Schutz, B. E.; Hancock, D.; McAllister, M.; Webster, D.; Bond, C.

    2012-12-01

    Data Stewardship, preservation, and reproducibility are fast becoming principal parts of a data manager's work. In an era of distributed data and information systems, it is of vital importance that organizations make a commitment to both current and long-term goals of data management and the preservation of scientific data. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However there is additional information from the product generation and science teams needed to ensure the observations will be useful for long term climate studies. Examples include ancillary input datasets, product generation software, and production history as developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. NASA has developed a set of Earth science data and information content requirements for long term preservation that is being used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission was one of the first to end, NASA and NSIDC, in collaboration with the science team, are collecting data, software, and documentation, preparing for long-term support of the ICESat mission. For a long-term archive, it is imperative to preserve sufficient information about how products were prepared in order to ensure future researchers that the scientific results are accurate, understandable, and useable. Our experience suggests data centers know what to preserve in most cases. That is, the processing algorithms along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products will be archived and made available to users. In other cases, such as pre-launch, calibration/validation, and test data, the data centers must seek guidance from the science team. All these data are essential for product provenance, contributing to and helping establish the integrity of the scientific observations for long term climate studies. In this presentation we will describe application of information gathering with guidance from the ICESat/GLAS Science Team, and the flow of additional information from the ICESat Science team and Science Investigator-Led Processing System to the NSIDC Distributed Active Archive Center. This presentation will also cover how we envision user support through the years of the Long-Term Archive.

  1. Operating tool for a distributed data and information management system

    NASA Astrophysics Data System (ADS)

    Reck, C.; Mikusch, E.; Kiemle, S.; Wolfmüller, M.; Böttcher, M.

    2002-07-01

    The German Remote Sensing Data Center has developed the Data Information and Management System (DIMS), which provides multi-mission ground system services for earth observation product processing, archiving, ordering and delivery. DIMS successfully uses the newest technologies within its services. This paper presents the solution taken to simplify operation tasks for this large and distributed system.

  2. Recovery and archiving key Arctic Alaska vegetation map and plot data for the Arctic-Boreal Vulnerability Field Experiment (ABoVE)

    NASA Astrophysics Data System (ADS)

    Walker, D. A.; Breen, A. L.; Broderson, D.; Epstein, H. E.; Fisher, W.; Grunblatt, J.; Heinrichs, T.; Raynolds, M. K.; Walker, M. D.; Wirth, L.

    2013-12-01

    Abundant ground-based information will be needed to inform remote-sensing and modeling studies of NASA's Arctic-Boreal Vulnerability Experiment (ABoVE). A large body of plot and map data collected by the Alaska Geobotany Center (AGC) and collaborators from the Arctic regions of Alaska and the circumpolar Arctic over the past several decades is being archived and made accessible to scientists and the public via the Geographic Information Network of Alaska's (GINA's) 'Catalog' display and portal system. We are building two main types of data archives: Vegetation Plot Archive: For the plot information we use a Turboveg database to construct the Alaska portion of the international Arctic Vegetation Archive (AVA) http://www.geobotany.uaf.edu/ava/. High quality plot data and non-digital legacy datasets in danger of being lost have highest priority for entry into the archive. A key aspect of the database is the PanArctic Species List (PASL-1), developed specifically for the AVA to provide a standard of species nomenclature for the entire Arctic biome. A wide variety of reports, documents, and ancillary data are linked to each plot's geographic location. Geoecological Map Archive: This database includes maps and remote sensing products and links to other relevant data associated with the maps, mainly those produced by the Alaska Geobotany Center. Map data include GIS shape files of vegetation, land-cover, soils, landforms and other categorical variables and digital raster data of elevation, multispectral satellite-derived data, and data products and metadata associated with these. The map archive will contain all the information that is currently in the hierarchical Toolik-Arctic Geobotanical Atlas (T-AGA) in Alaska http://www.arcticatlas.org, plus several additions that are in the process of development and will be combined with GINA's already substantial holdings of spatial data from northern Alaska. The Geoecological Atlas Portal uses GINA's Catalog tool to develop a web interface to view and access the plot and map data. The mapping portal allows visualization of GIS data, sample-point locations and imagery and access to the map data. Catalog facilitates the discovery and dissemination of science-based information products in support of analysis and decision-making concerned with development and climate change and is currently used by GINA in several similar archive/distribution portals.

  3. Use of MCIDAS as an earth science information systems tool

    NASA Technical Reports Server (NTRS)

    Goodman, H. Michael; Karitani, Shogo; Parker, Karen G.; Stooksbury, Laura M.; Wilson, Gregory S.

    1988-01-01

    The application of the man computer interactive data access system (MCIDAS) to information processing is examined. The computer systems that interface with the MCIDAS are discussed. Consideration is given to the computer networking of MCIDAS, data base archival, and the collection and distribution of real-time special sensor microwave/imager data.

  4. The Challenges Facing Science Data Archiving on Current Mass Storage Systems

    NASA Technical Reports Server (NTRS)

    Peavey, Bernard; Behnke, Jeanne (Editor)

    1996-01-01

    This paper discusses the desired characteristics of a tape-based petabyte science data archive and retrieval system required to store and distribute several terabytes (TB) of data per day over an extended period of time, probably more than 15 years, in support of programs such as the Earth Observing System Data and Information System (EOSDIS). These characteristics take into consideration not only cost-effective and affordable storage capacity, but also rapid access to selected files and reading rates that are needed to satisfy thousands of retrieval transactions per day. It seems that where rapid random access to files is not crucial, the tape medium, magnetic or optical, continues to offer cost-effective data storage and retrieval solutions, and is likely to do so for many years to come. However, in environments like EOS these tape-based archive solutions provide less than full user satisfaction. Therefore, the objective of this paper is to describe the performance and operational enhancements that need to be made to the current tape-based archival systems in order to achieve greater acceptance by the EOS and similar user communities.
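
    A back-of-the-envelope calculation, using an assumed ingest rate and archive lifetime rather than figures from the paper, shows why such an archive reaches petabyte scale:

```python
# Back-of-the-envelope check (not from the paper): a sustained ingest of
# roughly 3 TB/day over an assumed 15-year lifetime, ignoring reprocessing
# and redundant copies.

TB_PER_DAY = 3   # assumed sustained ingest rate
YEARS = 15       # assumed archive lifetime

total_tb = TB_PER_DAY * 365 * YEARS
print(f"~{total_tb / 1000:.0f} PB accumulated")  # roughly 16 PB
```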

  5. NASA'S Earth Science Data Stewardship Activities

    NASA Technical Reports Server (NTRS)

    Lowe, Dawn R.; Murphy, Kevin J.; Ramapriyan, Hampapuram

    2015-01-01

    NASA has been collecting Earth observation data for over 50 years using instruments on board satellites, aircraft and ground-based systems. With the inception of the Earth Observing System (EOS) Program in 1990, NASA established the Earth Science Data and Information System (ESDIS) Project and initiated development of the Earth Observing System Data and Information System (EOSDIS). A set of Distributed Active Archive Centers (DAACs) was established at locations based on science discipline expertise. Today, EOSDIS consists of 12 DAACs and 12 Science Investigator-led Processing Systems (SIPS), processing data from the EOS missions, as well as the Suomi National Polar Orbiting Partnership mission, and other satellite and airborne missions. The DAACs archive and distribute the vast majority of data from NASA’s Earth science missions, with data holdings exceeding 12 petabytes. The data held by EOSDIS are available to all users consistent with NASA’s free and open data policy, which has been in effect since 1990. The EOSDIS archives consist of raw instrument data counts (level 0 data), as well as higher level standard products (e.g., geophysical parameters, products mapped to standard spatio-temporal grids, results of Earth system models using multi-instrument observations, and long time series of Earth System Data Records resulting from multiple satellite observations of a given type of phenomenon). EOSDIS data stewardship responsibilities include ensuring that the data and information content are reliable, of high quality, easily accessible, and usable for as long as they are considered to be of value.

  6. Geoinformation web-system for processing and visualization of large archives of geo-referenced data

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.

    2010-12-01

    A working model of an information-computational system aimed at scientific research in the area of climate change is presented. The system will allow processing and analysis of large archives of geophysical data obtained both from observations and from modeling. Accumulated experience in developing information-computational web-systems providing computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al., 2007; Okladnikov et al., 2008; Titov et al., 2009). Functional capabilities of the system comprise a set of procedures for mathematical and statistical analysis, processing, and visualization of data. At present, five archives of data are available for processing: the 1st and 2nd editions of the NCEP/NCAR Reanalysis, the ECMWF ERA-40 Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, and the NOAA-CIRES 20th Century Global Reanalysis Version I. To provide data processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. Currently, a set of computational modules for climate change indices approved by the WMO is available. A special module providing visualization of results and writing to Encapsulated PostScript, GeoTIFF, and ESRI shape files was also developed. As a technological basis for representation of cartographical information on the Internet, the GeoServer software conforming to OpenGIS standards is used. GIS functionality has been integrated with web-portal software to provide a basis for the web portal's development as part of the geoinformation web-system. Such a geoinformation web-system is the next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities to perform reliable analysis of heterogeneous geophysical data using approved computational algorithms. It will allow a wide range of researchers to work with geophysical data without specific programming knowledge and to concentrate on solving their specific tasks. The system would be of special importance for education in the climate change domain. This work is partially supported by RFBR grant #10-07-00547, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7, and SB RAS Integration Projects 4 and 9.
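
    As an illustration of the kind of computational module described here, the sketch below computes two widely used ETCCDI/WMO climate indices from daily temperature series with NumPy; the synthetic input series is an assumption for demonstration.

```python
# Minimal sketch of two standard ETCCDI/WMO climate indices computed from
# daily temperature series. The synthetic input is illustrative only.

import numpy as np

def frost_days(tmin_daily: np.ndarray) -> int:
    """FD: number of days with daily minimum temperature < 0 degC."""
    return int(np.sum(tmin_daily < 0.0))

def summer_days(tmax_daily: np.ndarray) -> int:
    """SU: number of days with daily maximum temperature > 25 degC."""
    return int(np.sum(tmax_daily > 25.0))

# Synthetic one-year daily series for a single grid cell (degrees Celsius):
days = np.arange(365)
tmin = 5.0 + 15.0 * np.sin(2 * np.pi * (days - 105) / 365)  # seasonal cycle
tmax = tmin + 10.0
print("FD =", frost_days(tmin), " SU =", summer_days(tmax))
```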

  7. 75 FR 1566 - National Industrial Security Program Directive No. 1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-12

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Information Security Oversight Office 32 CFR Part...: Information Security Oversight Office, NARA. ACTION: Proposed rule; correction. SUMMARY: This document... Management System (FDMS) number to the proposed rule for Information Security Oversight Office (ISOO...

  8. 32 CFR 2001.50 - Telecommunications automated information systems and network security.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED... network security. Each agency head shall ensure that classified information electronically accessed...

  9. 32 CFR 2001.50 - Telecommunications automated information systems and network security.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED... network security. Each agency head shall ensure that classified information electronically accessed...

  10. 32 CFR 2001.50 - Telecommunications automated information systems and network security.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED... network security. Each agency head shall ensure that classified information electronically accessed...

  11. How NASA is Building a Petabyte Scale Geospatial Archive in the Cloud

    NASA Technical Reports Server (NTRS)

    Pilone, Dan; Quinn, Patrick; Jazayeri, Alireza; Baynes, Kathleen; Murphy, Kevin J.

    2018-01-01

    NASA's Earth Observing System Data and Information System (EOSDIS) is working towards a vision of a cloud-based, highly flexible ingest, archive, management, and distribution system for its ever-growing and evolving data holdings. This free and open source system, Cumulus, is emerging from its prototyping stages and is poised to make a huge impact on how NASA manages and disseminates its Earth science data. This talk outlines the motivation for this work, presents the achievements and hurdles of the past 18 months, and charts a course for the future expansion of Cumulus. We explore not just the technical, but also the socio-technical challenges that we face in evolving a system of this magnitude into the cloud. The NASA EOSDIS archive is currently at nearly 30 PB and will grow to over 300 PB in the coming years. We've presented progress on this effort at AWS re:Invent and the American Geophysical Union (AGU) Fall Meeting in 2017 and hope to have the opportunity to share with FOSS4G attendees information on the availability of the open sourced software and how NASA intends to make its Earth Observing geospatial data available for free to the public in the cloud.

  12. Lessons Learned While Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems

    NASA Technical Reports Server (NTRS)

    Pilone, Dan; Mclaughlin, Brett; Plofchan, Peter

    2017-01-01

    NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. NASA's Earth Observing System Data and Information System (EOSDIS) is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services, from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. On a daily basis, EOSDIS ingests, processes, archives and distributes over 3 terabytes of data from NASA's Earth science missions, representing over 6000 data products across a range of science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of high-impact NASA data spanning the multi-petabyte-scale archive of Earth science data products. Reviewed and approved by Chris Lynnes.

  13. Picture archiving and computing systems: the key to enterprise digital imaging.

    PubMed

    Krohn, Richard

    2002-09-01

    The utopian view of the electronic medical record includes the digital transformation of all aspects of patient information. Historically, imagery from the radiology, cardiology, ophthalmology, and pathology departments, as well as the emergency room, has been a morass of paper, film, and other media, isolated within each department's system architecture. In answer to this dilemma, picture archiving and computing systems have become the focal point of efforts to create a single platform for the collection, storage, and distribution of clinical imagery throughout the health care enterprise.

  14. Implementation of a filmless mini picture archiving and communication system in ultrasonography: experience after one year of use.

    PubMed

    Henri, C J; Cox, R D; Bret, P M

    1997-08-01

    This article details our experience in developing and operating an ultrasound mini-picture archiving and communication system (PACS). Using software developed in-house, low-end Macintosh computers (Apple Computer Co. Cupertino, CA) equipped with framegrabbers coordinate the entry of patient demographic information, image acquisition, and viewing on each ultrasound scanner. After each exam, the data are transmitted to a central archive server where they can be accessed from anywhere on the network. The archive server also provides web-based access to the data and manages pre-fetch and other requests for data that may no longer be on-line. Archival is fully automatic and is performed on recordable compact disk (CD) without compression. The system has been filmless now for over 18 months. In the meantime, one film processor has been eliminated and the position of one film clerk has been reallocated. Previously, nine ultrasound machines produced approximately 150 sheets of laser film per day (at 14 images per sheet). The same quantity of data are now archived without compression onto a single CD. Start-up costs were recovered within six months, and the project has been extended to include computed tomography (CT) and magnetic resonance imaging (MRI).
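
    A back-of-the-envelope check of the storage figures quoted above, assuming a standard 650 MB CD-R; the per-image size is derived from the abstract's numbers rather than stated in it.

        # Figures quoted in the abstract.
        sheets_per_day = 150        # laser film sheets per day (all nine machines)
        images_per_sheet = 14
        cd_capacity_mb = 650.0      # assumed capacity of a standard CD-R

        images_per_day = sheets_per_day * images_per_sheet    # ~2100 images/day

        # Implied average uncompressed size per archived ultrasound image,
        # if one day's output fits on a single CD as the abstract states.
        avg_image_size_mb = cd_capacity_mb / images_per_day
        print(images_per_day, round(avg_image_size_mb, 2), "MB per image")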

  15. Development of the Subaru-Mitaka-Okayama-Kiso Archive System

    NASA Astrophysics Data System (ADS)

    Baba, Hajime; Yasuda, Naoki; Ichikawa, Shin-Ichi; Yagi, Masafumi; Iwamoto, Nobuyuki; Takata, Tadafumi; Horaguchi, Toshihiro; Taga, Masatoshi; Watanabe, Masaru; Ozawa, Tomohiko; Hamabe, Masaru

    We have developed the Subaru-Mitaka-Okayama-Kiso-Archive (SMOKA) public science archive system, which provides access to the data of the Subaru Telescope, the 188 cm telescope at Okayama Astrophysical Observatory, and the 105 cm Schmidt telescope at Kiso Observatory/University of Tokyo. SMOKA is the successor of the MOKA3 system. The user can browse the Quick-Look Images, Header Information (HDI) and the ASCII Table Extension (ATE) of each frame from the search result table. A request for data can be submitted in a simple manner. The system is developed with Java Servlets for the back-end and Java Server Pages (JSP) for content display. The advantage of JSPs is the separation of the front-end presentation from the middle- and back-end tiers, which allowed efficient development of the system. The SMOKA homepage is available at SMOKA

  16. Stewardship of NASA's Earth Science Data and Ensuring Long-Term Active Archives

    NASA Astrophysics Data System (ADS)

    Ramapriyan, H.; Behnke, J.

    2016-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been in operation since 1994. EOSDIS manages data from pre-EOS missions dating back to the 1960s, EOS missions that started in 1997, and missions from the post-EOS era. Its data holdings come from many different sources - satellite and airborne instruments, in situ measurements, field experiments, science investigations, etc. Since the beginning of the EOS Program, NASA has followed an open data policy, with non-discriminatory access to data and no period of exclusive access. NASA has well-established processes for assigning and/or accepting datasets into one of the 12 Distributed Active Archive Centers (DAACs) that are part of EOSDIS. EOSDIS has been evolving through several information technology cycles, adapting to hardware and software changes in the commercial sector. NASA is responsible for maintaining Earth science data as long as users are interested in using them for research and applications, which is well beyond the life of the data-gathering missions. For science data to remain useful over long periods of time, steps must be taken to preserve: 1. Data bits with no corruption, 2. Discoverability and access, 3. Readability, 4. Understandability, 5. Usability and 6. Reproducibility of results. NASA's Earth Science Data and Information System (ESDIS) Project, along with the 12 EOSDIS DAACs, has made significant progress in each of these areas over the last decade, and continues to evolve its active archive capabilities. Particular attention has been paid in recent years to ensuring that the datasets are "published" in an easily accessible and citable manner through a unified metadata model, a common metadata repository (CMR), a coherent view through the earthdata.gov website, and assignment of Digital Object Identifiers (DOIs) with well-designed landing/product information pages.
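
    The common metadata repository (CMR) mentioned above exposes a public search API at cmr.earthdata.nasa.gov; the sketch below is a minimal collection lookup, with the exact response fields (including 'doi') treated as assumptions that may differ from the live schema.

        import requests

        # Query the Common Metadata Repository (CMR) for collections matching a keyword.
        resp = requests.get(
            "https://cmr.earthdata.nasa.gov/search/collections.json",
            params={"keyword": "sea surface temperature", "page_size": 5},
            timeout=30,
        )
        resp.raise_for_status()

        # The JSON feed lists collection entries; the 'doi' field is assumed to be
        # present for collections with a registered Digital Object Identifier.
        for entry in resp.json()["feed"]["entry"]:
            print(entry.get("title"), "->", entry.get("doi", "no DOI listed"))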

  17. Applying the Technology Acceptance Model in a Study of the Factors Affecting Usage of the Taiwan Digital Archives System

    ERIC Educational Resources Information Center

    Hong, Jon-Chao; Hwang, Ming-Yueh; Hsu, Hsuan-Fang; Wong, Wan-Tzu; Chen, Mei-Yung

    2011-01-01

    The rapid development of information and communication technology and the popularization of the Internet have given a boost to digitization technologies. Since 2001, The National Science Council (NSC) of Taiwan has invested a large amount of funding in the National Digital Archives Program (NDAP) to develop digital content. Some studies have…

  18. Abstracts of SIG Sessions.

    ERIC Educational Resources Information Center

    Proceedings of the ASIS Annual Meeting, 1996

    1996-01-01

    Includes abstracts of special interest group (SIG) sessions. Highlights include digital imagery; text summarization; browsing; digital libraries; icons and the Web; information management; curricula planning; interfaces; information systems; theories; scholarly and scientific communication; global development; archives; document delivery;…

  19. Information Retrieval Using ADABAS-NATURAL (with Applications for Television and Radio).

    ERIC Educational Resources Information Center

    Silbergeld, I.; Kutok, P.

    1984-01-01

    Describes use of the software ADABAS (general purpose database management system) and NATURAL (interactive programing language) in development and implementation of an information retrieval system for the National Television and Radio Network of Israel. General design considerations, files contained in each archive, search strategies, and keywords…

  20. Measuring the Performance of Document Supply Systems.

    ERIC Educational Resources Information Center

    Line, Maurice B.

    Produced by Unesco as part of its program designed to help member states develop national information systems, including libraries, information services, and archives, this manual is a guide to document supply measurement techniques that are applicable to a wide range of countries. The first of seven chapters considers the objectives, nature, and…

  1. 32 CFR 2001.1 - Purpose and scope.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Telecommunications, automated information systems, and network security 4.1, 4.2 2001.51 Technical security 4.1 2001... National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Scope of Part § 2001...

  2. 32 CFR 2001.1 - Purpose and scope.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Telecommunications, automated information systems, and network security 4.1, 4.2 2001.51 Technical security 4.1 2001... National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Scope of Part § 2001...

  3. 32 CFR 2001.1 - Purpose and scope.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Telecommunications, automated information systems, and network security 4.1, 4.2 2001.51 Technical security 4.1 2001... National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Scope of Part § 2001...

  4. 32 CFR 2001.1 - Purpose and scope.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Telecommunications, automated information systems, and network security 4.1, 4.2 2001.51 Technical security 4.1 2001... National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Scope of Part § 2001...

  5. 32 CFR 2001.1 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Telecommunications, automated information systems, and network security 4.1, 4.2 2001.51 Technical security 4.1 2001... National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Scope of Part § 2001...

  6. Science information systems: Archive, access, and retrieval

    NASA Technical Reports Server (NTRS)

    Campbell, William J.

    1991-01-01

    The objective of this research is to develop technology for the automated characterization and interactive retrieval and visualization of very large, complex scientific data sets. Technologies will be developed for the following specific areas: (1) rapidly archiving data sets; (2) automatically characterizing and labeling data in near real-time; (3) providing users with the ability to browse contents of databases efficiently and effectively; (4) providing users with the ability to access and retrieve system independent data sets electronically; and (5) automatically alerting scientists to anomalies detected in data.

  7. infoRAD: computers for clinical practice and education in radiology. Teleradiology, information transfer, and PACS: implications for diagnostic imaging in the 1990s.

    PubMed

    Schilling, R B

    1993-05-01

    Picture archiving and communication systems (PACS) provide image viewing at diagnostic, reporting, consultation, and remote workstations; archival on magnetic or optical media by means of short- or long-term storage devices; communications by means of local or wide area networks or public communication services; and integrated systems with modality interfaces and gateways to health care facilities and departmental information systems. Research indicates three basic needs for image and report management: (a) improved communication and turnaround time between radiologists and other imaging specialists and referring physicians, (b) fast reliable access to both current and previously obtained images and reports, and (c) space-efficient archival support. Although PACS considerations are much more complex than those associated with single modalities, the same basic purchase criteria apply. These criteria include technical leadership, image quality, throughput, life cost (eg, initial cost, maintenance, upgrades, and depreciation), and total service. Because a PACS takes much longer to implement than a single modality, the customer and manufacturer must develop a closer working relationship than has been necessary in the past.

  8. Global Change Data Center: Mission, Organization, Major Activities, and 2001 Highlights

    NASA Technical Reports Server (NTRS)

    Wharton, Stephen W. (Technical Monitor)

    2002-01-01

    Rapid, efficient access to Earth sciences data is fundamental to the Nation's efforts to understand the effects of global environmental changes and their implications for public policy. The challenge will grow in the future as data volumes increase further and missions with constellations of satellites begin to appear. Demands on data storage, data access, network throughput, processing power, and database and information management are increased by orders of magnitude, while budgets remain constant or even shrink. The Global Change Data Center's (GCDC) mission is to provide systems, data products, and information management services to maximize the availability and utility of NASA's Earth science data. The specific objectives are (1) to support Earth science missions by developing and operating systems to generate, archive, and distribute data products and information; (2) to develop innovative information systems for processing, archiving, accessing, visualizing, and communicating Earth science data; and (3) to develop value-added products and services to promote broader utilization of NASA Earth Sciences Enterprise (ESE) data and information. The ultimate product of GCDC activities is access to data and information to support research, education, and public policy.

  9. Initial Experience With A Prototype Storage System At The University Of North Carolina

    NASA Astrophysics Data System (ADS)

    Creasy, J. L.; Loendorf, D. D.; Hemminger, B. M.

    1986-06-01

    A prototype archiving system manufactured by the 3M Corporation has been in place at the University of North Carolina for approximately 12 months. The system was installed as a result of a collaboration between 3M and UNC, with 3M seeking testing of their system, and UNC realizing the need for an archiving system as an essential part of their PACS test-bed facilities. System hardware includes appropriate network and disk interface devices as well as media for both short- and long-term storage of images and their associated information. The system software includes those procedures necessary to communicate with the network interface elements (NIEs) as well as those procedures necessary to interpret the ACR-NEMA header blocks and to store the images. A subset of the total ACR-NEMA header is parsed and stored in a relational database system. The entire header is stored on disk with the completed study. Interactive programs have been developed that allow radiologists to easily retrieve information about the archived images and to send the full images to a viewing console. Initial experience with the system has consisted primarily of hardware and software debugging. Although the system is ACR-NEMA compatible, further objective and subjective assessments of system performance are awaiting the connection of compatible consoles and acquisition devices to the network.
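
    A minimal sketch of the "parse a header subset into a relational database" step described above, using SQLite and invented field names purely for illustration; the actual ACR-NEMA elements and schema used at UNC are not given in the abstract.

        import sqlite3

        # Hypothetical subset of an ACR-NEMA header, already parsed into a dict.
        header_subset = {
            "patient_id": "000123",
            "study_date": "1986-03-14",
            "modality": "CR",
            "image_file": "/archive/1986/000123/img_0001.raw",
        }

        conn = sqlite3.connect("archive_index.db")
        conn.execute(
            "CREATE TABLE IF NOT EXISTS images "
            "(patient_id TEXT, study_date TEXT, modality TEXT, image_file TEXT)"
        )
        conn.execute(
            "INSERT INTO images VALUES (:patient_id, :study_date, :modality, :image_file)",
            header_subset,
        )
        conn.commit()

        # A radiologist-facing query: everything archived for one patient.
        for row in conn.execute("SELECT * FROM images WHERE patient_id = ?", ("000123",)):
            print(row)
        conn.close()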

  10. Get It Right First Time: A Beginner's Guide to Document Management.

    ERIC Educational Resources Information Center

    Hayes, Mike

    1997-01-01

    Document management (DM) systems capture, store, index, retrieve, route, distribute, and archive information in organizations. Discusses "passive" electronic libraries and "active" systems; characteristics of effective systems; implementing a system; fitting a new system to an existing infrastructure; budgets; system…

  11. Clinical applications of an ATM/Ethernet network in departments of neuroradiology and radiotherapy.

    PubMed

    Cimino, C; Pizzi, R; Fusca, M; Bruzzone, M G; Casolino, D; Sicurello, F

    1997-01-01

    An integrated system for the multimedia management of images and clinical information has been developed at the Istituto Nazionale Neurologico C. Besta in Milan. The Institute's physicians have a daily need to consult images coming from various modalities. The high volume of archived material and the need to retrieve and display new and past images and clinical information have motivated the development of a Picture Archiving and Communication System (PACS) for the automatic management of images and clinical data, related not only to the Radiology Department, but also to the Radiotherapy Department for 3D virtual simulation, to remote teleconsulting, and subsequently to all the wards, outpatient clinics and labs.

  12. The Palladiolibrary Geo-Models AN Open 3d Archive to Manage and Visualize Information-Communication Resources about Palladio

    NASA Astrophysics Data System (ADS)

    Apollonio, F. I.; Baldissini, S.; Clini, P.; Gaiani, M.; Palestini, C.; Trevisan, C.

    2013-07-01

    The paper describes the objectives, methods, procedures and outcomes of the development of the digital archive of Palladio's works and documentation: the PALLADIOLibrary of the Centro Internazionale di Studi di Architettura Andrea Palladio di Vicenza (CISAAP). The core of the application consists of fifty-one reality-based 3D models usable and navigable within a system grounded on Google Earth. This information system, a collaboration of four universities each contributing specific skills, returns a comprehensive, structured and coherent semantic interpretation of the Palladian landscape through shapes realistically reconstructed from historical sources and surveys and processed for Google Earth with Ambient Occlusion techniques, overcoming the traditional display mode.

  13. Medical workstation design: enhancing graphical interface with 3D anatomical atlas

    NASA Astrophysics Data System (ADS)

    Hoo, Kent S., Jr.; Wong, Stephen T. C.; Grant, Ellen

    1997-05-01

    The huge data archive of the UCSF Hospital Integrated Picture Archiving and Communication System gives healthcare providers access to diverse kinds of images and text for diagnosis and patient management. Given the mass of information accessible, however, conventional graphical user interface (GUI) approach overwhelms the user with forms, menus, fields, lists, and other widgets and causes 'information overloading.' This article describes a new approach that complements the conventional GUI with 3D anatomical atlases and presents the usefulness of this approach with a clinical neuroimaging application.

  14. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

    PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural: the PDS4 information architecture is developed and maintained independently of the infrastructure's process, application and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries, and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source of information used to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance is also allowed for the effective management of the informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.

  15. 76 FR 46855 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-03

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... original archival records in a National Archives and Records Administration facility. The public is invited...

  16. [From planning to realization of an electronic patient record].

    PubMed

    Krämer, T; Rapp, R; Krämer, K L

    1999-03-01

    The highly complex requirements on information and information flow in today's hospitals can only be met by the use of modern Information Systems (IS). In order to achieve this, the Stiftung Orthopädische Universitätsklinik first carried out the project "Strategic Information System Planning" in 1993, then realized the necessary infrastructure (network; client-server) from 1993 to 1997, and finally started the introduction of modern IS (SAP R/3 and IXOS-Archive) in the clinical area. One of the approved goals was the replacement of the paper medical record by an up-to-date electronic medical record. In this article the following three topics will be discussed: the difference between the up-to-date electronic medical record and electronically archived finished cases, the steps performed by our clinic to realize the up-to-date electronic medical record, and the problems that occurred during this process.

  17. From planning to realisation of an electronic patient record.

    PubMed

    Krämer, T; Rapp, R; Krämer, K-L

    1999-03-01

    The highly complex requirements on information and information flow in today's hospitals can only be met by the use of modern Information Systems (IS). In order to achieve this, the Stiftung Orthopädische Universitätsklinik first carried out the project "Strategic Information System Planning" in 1993, then realized the necessary infrastructure (network; client-server) from 1993 to 1997, and finally started the introduction of modern IS (SAP R/3 and IXOS-Archive) in the clinical area. One of the approved goals was the replacement of the paper medical record by an up-to-date electronic medical record. In this article the following three topics will be discussed: the difference between the up-to-date electronic medical record and electronically archived finished cases, the steps performed by our clinic to realize the up-to-date electronic medical record, and the problems that occurred during this process.

  18. Collaborative Preservation of At-Risk Data at NOAA's National Centers for Environmental Information

    NASA Astrophysics Data System (ADS)

    Casey, K. S.; Collins, D.; Cooper, J. M.; Ritchey, N. A.

    2017-12-01

    The National Centers for Environmental Information (NCEI) serves as the official long term archive of NOAA's environmental data. Adhering to the principles and responsibilities of the Open Archival Information System (OAIS, ISO 14721), and backed by both agency policies and formal legislation, NCEI ensures that these irreplaceable environmental data are preserved and made available for current users and future generations. These goals are achieved through regional, national, and international collaborative efforts like the ICSU World Data System, the Intergovernmental Oceanographic Commission's International Oceanographic Data and Information Exchange (IODE) program, NSF's DataOne, and through specific data preservation projects with partners such as the NOAA Cooperative Institutes, ESIP, and even retired federal employees. Through efforts like these, at-risk data with poor documentation, on aging media, and of unknown format and content are being rescued and made available to the public for widespread reuse.

  19. Integrated information systems for translational medicine.

    PubMed

    Winter, A; Funkat, G; Haeber, A; Mauz-Koerholz, C; Pommerening, K; Smers, S; Stausberg, J

    2007-01-01

    Translational medicine research needs a two-way information highway between 'bedside' and 'bench'. Unfortunately there are still weak links between successfully integrated information roads for the bench, i.e. research networks, and the bedside, i.e. regional or national health information systems. The question arises of what measures have to be taken to overcome the deficiencies. It is examined how patient care-related costs of clinical research can be separated and shared by health insurances, whether the quality of patient care data is sufficient for research, how patient identity can be maintained without conflicting with privacy, how care and research records can be archived, and how information systems for care and research can be integrated. Since clinical trials improve quality of care, insurers share parts of the costs. Quality of care data has to be improved by introducing minimum basic data sets. Pseudonymization resolves the conflict between the needs for patient identity and privacy. Archiving patient care records and research records is similar, and XML and CDISC can be used. Principles of networking infrastructures for care and research still differ. They have to be bridged first and harmonized later. Linking information systems for care (bed) and for research (bench) requires technical infrastructures as well as economic and organizational regulations.

  20. Outcome of the First wwPDB Hybrid/Integrative Methods Task Force Workshop

    PubMed Central

    Sali, Andrej; Berman, Helen M.; Schwede, Torsten; Trewhella, Jill; Kleywegt, Gerard; Burley, Stephen K.; Markley, John; Nakamura, Haruki; Adams, Paul; Bonvin, Alexandre M.J.J.; Chiu, Wah; Dal Peraro, Matteo; Di Maio, Frank; Ferrin, Thomas E.; Grünewald, Kay; Gutmanas, Aleksandras; Henderson, Richard; Hummer, Gerhard; Iwasaki, Kenji; Johnson, Graham; Lawson, Catherine L.; Meiler, Jens; Marti-Renom, Marc A.; Montelione, Gaetano T.; Nilges, Michael; Nussinov, Ruth; Patwardhan, Ardan; Rappsilber, Juri; Read, Randy J.; Saibil, Helen; Schröder, Gunnar F.; Schwieters, Charles D.; Seidel, Claus A. M.; Svergun, Dmitri; Topf, Maya; Ulrich, Eldon L.; Velankar, Sameer; Westbrook, John D.

    2016-01-01

    Structures of biomolecular systems are increasingly computed by integrative modeling that relies on varied types of experimental data and theoretical information. We describe here the proceedings and conclusions from the first wwPDB Hybrid/Integrative Methods Task Force Workshop held at the European Bioinformatics Institute in Hinxton, UK, October 6 and 7, 2014. At the workshop, experts in various experimental fields of structural biology, experts in integrative modeling and visualization, and experts in data archiving addressed a series of questions central to the future of structural biology. How should integrative models be represented? How should the data and integrative models be validated? What data should be archived? How should the data and models be archived? What information should accompany the publication of integrative models? PMID:26095030

  1. NASA's SPICE System Models the Solar System

    NASA Technical Reports Server (NTRS)

    Acton, Charles

    1996-01-01

    SPICE is NASA's multimission, multidiscipline information system for assembling, distributing, archiving, and accessing space science geometry and related data used by scientists and engineers for mission design and mission evaluation, detailed observation planning, mission operations, and science data analysis.
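
    SPICE itself is a toolkit of kernels and routines; as an illustration of the kind of geometry computation the abstract refers to, the sketch below uses the community spiceypy Python wrapper, with a hypothetical meta-kernel filename and an arbitrary epoch.

        import spiceypy as spice

        # Load a meta-kernel listing the ephemeris, leapsecond, and frame kernels
        # needed for the computation; the filename is hypothetical.
        spice.furnsh("mission_metakernel.tm")

        # Convert a UTC string to ephemeris time (TDB seconds past J2000).
        et = spice.str2et("2020-07-30T12:00:00")

        # Position of Mars relative to Earth in the J2000 frame, corrected for
        # light time and stellar aberration.
        position, light_time = spice.spkpos("MARS", et, "J2000", "LT+S", "EARTH")
        print(position, light_time)

        spice.kclear()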

  2. 7 CFR 1755.901 - Incorporation by Reference.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Digital Systems and Networks, Transmission media characteristics—Optical fibre cables, Characteristics of... Systems and Media, Digital Systems and Networks, Transmission media characteristics—Optical fibre cables... National Archives and Records Administration (NARA). For information on the availability of these materials...

  3. Government Information Quarterly. Volume 7, no. 2: National Aeronautics and Space Administration Scientific and Technical Information Programs. Special issue

    NASA Technical Reports Server (NTRS)

    Hernon, Peter (Editor); Mcclure, Charles R. (Editor); Pinelli, Thomas E. (Editor)

    1990-01-01

    NASA scientific and technical information (STI) programs are discussed. Topics include management of information in a research and development agency, the new space and Earth science information systems at NASA's archive, scientific and technical information management, and technology transfer of NASA aerospace technology to other industries.

  4. 78 FR 78401 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-26

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION [NARA-2014-012] Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA... biomedical statistical research in archival records containing highly personal information. The second is...

  5. Document Management in Local Government.

    ERIC Educational Resources Information Center

    Williams, Bernard J. S.

    1998-01-01

    The latest in electronic document management in British local government is discussed. Finance, revenues, and benefits systems of leading vendors to local authorities are highlighted. A planning decisions archive management system and other information services are discussed. (AEF)

  6. 75 FR 69474 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-12

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... research in archival records containing highly personal information. The second is an application that is...

  7. 76 FR 4737 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-26

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... concerning the following information collections: 1. Title: Statistical Research in Archival Records...

  8. 78 FR 50451 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION [NARA-2013-041] Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration... following information collection: Title: National Archives and Records Administration Training and Event...

  9. STScI Archive Manual, Version 7.0

    NASA Astrophysics Data System (ADS)

    Padovani, Paolo

    1999-06-01

    The STScI Archive Manual provides the information a user needs to know to access the HST archive via its two user interfaces: StarView and a World Wide Web (WWW) interface. It provides descriptions of the StarView screens used to access information in the database and the format of that information, and introduces the user to the WWW interface. Using the two interfaces, users can search for observations, preview public data, and retrieve data from the archive. Using StarView one can also find calibration reference files and perform detailed association searches. With the WWW interface archive users can access, and obtain information on, all Multimission Archive at Space Telescope (MAST) data, a collection of mainly optical and ultraviolet datasets which include, amongst others, the International Ultraviolet Explorer (IUE) Final Archive. Both interfaces feature a name resolver which simplifies searches based on target name.

  10. The National Institutes of Health Clinical Center Digital Imaging Network, Picture Archival and Communication System, and Radiology Information System.

    PubMed

    Goldszal, A F; Brown, G K; McDonald, H J; Vucich, J J; Staab, E V

    2001-06-01

    In this work, we describe the digital imaging network (DIN), picture archival and communication system (PACS), and radiology information system (RIS) currently being implemented at the Clinical Center, National Institutes of Health (NIH). These systems are presently in clinical operation. The DIN is a redundant meshed network designed to address gigabit density and expected high bandwidth requirements for image transfer and server aggregation. The PACS projected workload is 5.0 TB of new imaging data per year. Its architecture consists of a central, high-throughput Digital Imaging and Communications in Medicine (DICOM) data repository and distributed redundant array of inexpensive disks (RAID) servers employing fiber-channel technology for immediate delivery of imaging data. On-demand distribution of images and reports to clinicians and researchers is accomplished via a clustered web server. The RIS follows a client-server model and provides tools to order exams, schedule resources, retrieve and review results, and generate management reports. The RIS-hospital information system (HIS) interfaces include admissions, discharges, and transfers (ADT)/demographics, orders, appointment notifications, doctor updates, and results.
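
    The RIS-HIS interfaces listed above (ADT/demographics, orders, results) are typically exchanged as HL7 v2 messages; the sketch below uses the python-hl7 library on a minimal, made-up ADT message to show how the demographics segment can be pulled out. The message content and field positions are illustrative only.

        import hl7

        # A minimal, made-up HL7 v2 ADT message (segments separated by carriage returns).
        message = "\r".join([
            "MSH|^~\\&|HIS|NIHCC|RIS|NIHCC|200106010900||ADT^A01|MSG0001|P|2.3",
            "PID|1||000123^^^NIHCC||DOE^JANE||19600101|F",
        ])

        parsed = hl7.parse(message)
        pid = parsed.segment("PID")

        # Field positions follow the HL7 v2 PID segment layout.
        print("Patient ID:", pid[3])
        print("Name:", pid[5])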

  11. TRMM Data Mining Service at the Goddard Earth Sciences (GES) DISC DAAC Tropical Rainfall Measuring Mission (TRMM)

    NASA Technical Reports Server (NTRS)

    2002-01-01

    TRMM has acquired more than four years of data since its launch in November 1997. All TRMM standard products are processed by the TRMM Science Data and Information System (TSDIS) and archived and distributed to general users by the GES DAAC. Table 1 shows the total archive and distribution as of February 28, 2002. The Utilization Ratio (UR), defined as the ratio of the number of distributed files to the number of archived files, of the TRMM standard products has been steadily increasing since 1998 and is currently at 6.98.
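
    Written out explicitly, the Utilization Ratio defined above is

        \mathrm{UR} \;=\; \frac{N_{\text{distributed files}}}{N_{\text{archived files}}} \;\approx\; 6.98 \quad \text{(as of 2002-02-28)}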

  12. National Media Laboratory media testing results

    NASA Technical Reports Server (NTRS)

    Mularie, William

    1993-01-01

    The government faces a crisis in data storage, analysis, archiving, and communication. The sheer quantity of data being poured into government systems on a daily basis is overwhelming their ability to capture, analyze, disseminate, and store critical information. Future system requirements are even more formidable, with single government platforms having data rates of over 1 Gbit/sec, storage requirements greater than a terabyte per day, and expected data archive lifetimes of over 10 years. The charter of the National Media Laboratory (NML) is to focus the resources of industry, government, and academia on government needs in the evaluation, development, and field support of advanced recording systems.

  13. Archiving InSight Lander Science Data Using PDS4 Standards

    NASA Astrophysics Data System (ADS)

    Stein, T.; Guinness, E. A.; Slavney, S.

    2017-12-01

    The InSight Mars Lander is scheduled for launch in 2018, and science data from the mission will be archived in the NASA Planetary Data System (PDS) using the new PDS4 standards. InSight is a geophysical lander with a science payload that includes a seismometer, a probe to measure subsurface temperatures and heat flow, a suite of meteorology instruments, a magnetometer, an experiment using radio tracking, and a robotic arm that will provide soil physical property information based on interactions with the surface. InSight is not the first science mission to archive its data using PDS4. However, PDS4 archives do not currently contain examples of the kinds of data that several of the InSight instruments will produce. Whereas the existing common PDS4 standards were sufficient for most of InSight's archiving requirements, the data generated by a few instruments required the development of several extensions to the PDS4 information model. For example, the seismometer will deliver a version of its data in SEED format, which is standard for the terrestrial seismology community. This format required the design of a new product type in the PDS4 information model. A local data dictionary has also been developed for InSight that contains attributes that are not part of the common PDS4 dictionary. The local dictionary provides metadata relevant to all InSight data sets, and attributes specific to several of the instruments. Additional classes and attributes were designed for the existing PDS4 geometry dictionary to capture metadata for the lander position and orientation, along with camera models for stereo image processing. Much of the InSight archive planning and design work has been done by a Data Archiving Working Group (DAWG), which has members from the InSight project and the PDS. The group coordinates archive design, schedules, and peer review of the archive documentation and test products. The InSight DAWG archiving effort for PDS is being led by the PDS Geosciences Node, with several other nodes working one-on-one with the instruments relevant to their disciplines. Once the InSight mission begins operations, the DAWG will continue to provide oversight of the release of InSight data to PDS. Lessons learned from the InSight archive work will also feed forward to planning the archives for the Mars 2020 rover.
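
    The SEED-format seismometer data mentioned above follows the terrestrial seismology standard, which is commonly read in Python with ObsPy; the filename below is hypothetical and the loop is a generic reading pattern, not part of the InSight PDS4 pipeline.

        from obspy import read

        # Read a hypothetical miniSEED file delivered in the seismology-standard format.
        stream = read("insight_seis_example.mseed")

        for trace in stream:
            # Each trace carries its own metadata (station, channel, sampling rate).
            print(trace.stats.station, trace.stats.channel,
                  trace.stats.sampling_rate, len(trace.data))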

  14. Astro-WISE: Chaining to the Universe

    NASA Astrophysics Data System (ADS)

    Valentijn, E. A.; McFarland, J. P.; Snigula, J.; Begeman, K. G.; Boxhoorn, D. R.; Rengelink, R.; Helmich, E.; Heraudeau, P.; Verdoes Kleijn, G.; Vermeij, R.; Vriend, W.-J.; Tempelaar, M. J.; Deul, E.; Kuijken, K.; Capaccioli, M.; Silvotti, R.; Bender, R.; Neeser, M.; Saglia, R.; Bertin, E.; Mellier, Y.

    2007-10-01

    The recent explosion of recorded digital data and its processed derivatives threatens to overwhelm researchers when analysing their experimental data or looking up data items in archives and file systems. While current hardware developments allow the acquisition, processing and storage of hundreds of terabytes of data at the cost of a modern sports car, the software systems to handle these data are lagging behind. This problem is very general and is well recognized by various scientific communities; several large projects have been initiated, e.g., DATAGRID/EGEE {http://www.eu-egee.org/} federates compute and storage power over the high-energy physics community, while the international astronomical community is building an Internet-geared Virtual Observatory {http://www.euro-vo.org/pub/} (Padovani 2006) connecting archival data. These large projects either focus on a specific distribution aspect or aim to connect many sub-communities and have a relatively long trajectory for setting standards and a common layer. Here, we report first light of a very different solution (Valentijn & Kuijken 2004) to the problem, initiated by a smaller astronomical IT community. It provides an abstract scientific information layer which integrates distributed scientific analysis with distributed processing and federated archiving and publishing. By designing new abstractions and mixing in old ones, a Science Information System with fully scalable cornerstones has been achieved, transforming data systems into knowledge systems. This breakthrough is facilitated by the full end-to-end linking of all dependent data items, which allows full backward chaining from the observer/researcher to the experiment. Key is the notion that information is intrinsic in nature, and thus so is the data acquired by a scientific experiment. The new abstraction is that software systems guide the user to that intrinsic information by forcing full backward and forward chaining in the data modelling.
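
    As a conceptual illustration (not the Astro-WISE API) of the backward chaining described above, the sketch below models data items that keep explicit links to the items they were derived from, so a derived catalog can be traced back to the raw exposure.

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class DataItem:
            """A data product that remembers the items it was derived from."""
            name: str
            parents: List["DataItem"] = field(default_factory=list)

            def lineage(self, depth: int = 0) -> None:
                # Walk backwards from a derived product to the original observations.
                print("  " * depth + self.name)
                for parent in self.parents:
                    parent.lineage(depth + 1)

        raw = DataItem("raw_exposure_001")
        flat = DataItem("master_flatfield")
        calibrated = DataItem("calibrated_frame", parents=[raw, flat])
        catalog = DataItem("source_catalog", parents=[calibrated])

        catalog.lineage()   # prints the full chain back to the raw exposure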

  15. GeoCrystal: graphic-interactive access to geodata archives

    NASA Astrophysics Data System (ADS)

    Goebel, Stefan; Haist, Joerg; Jasnoch, Uwe

    2002-03-01

    Recently a lot of effort has been spent establishing information systems and global infrastructures enabling both data suppliers and users to describe (-> eCommerce, metadata) as well as to find appropriate data. Examples of this are metadata information systems, online shops, and portals for geodata. The main disadvantage of existing approaches is the insufficiency of the methods and mechanisms that lead users to (e.g. spatial) data archives. This affects aspects concerning usability and personalization in general, as well as visual feedback techniques in the different steps of the information retrieval process. Several approaches aim at the improvement of graphical user interfaces by using intuitive metaphors, but only some of them offer 3D interfaces in the form of information landscapes or geographic result scenes in the context of information systems for geodata. This paper presents GeoCrystal, whose basic idea is to adopt Venn diagrams to compose complex queries and to visualize search results in a 3D information and navigation space for geodata. These concepts are enhanced with spatial metaphors and 3D information landscapes (a library for geodata) wherein users can specify searches for appropriate geodata and graphically interact with search results (book metaphor).
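
    The Venn-diagram query composition described above reduces, conceptually, to set operations over result sets; the sketch below illustrates the idea with plain Python sets and invented geodata identifiers, independent of GeoCrystal's actual implementation.

        # Invented result sets from three metadata searches over a geodata archive.
        by_region = {"dem_tile_17", "landuse_2001", "soil_map_A", "river_net_B"}
        by_theme  = {"landuse_2001", "soil_map_A", "population_grid"}
        by_date   = {"soil_map_A", "landuse_2001", "dem_tile_17"}

        # Venn-style composition of the three criteria.
        all_three   = by_region & by_theme & by_date      # satisfy every criterion
        region_only = by_region - by_theme - by_date      # matched only the region search
        any_match   = by_region | by_theme | by_date      # union of everything found

        print(sorted(all_three))
        print(sorted(region_only))
        print(sorted(any_match))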

  16. Toward public volume database management: a case study of NOVA, the National Online Volumetric Archive

    NASA Astrophysics Data System (ADS)

    Fletcher, Alex; Yoo, Terry S.

    2004-04-01

    Public databases today can be constructed with a wide variety of authoring and management structures. The widespread appeal of Internet search engines suggests that public information be made open and available to common search strategies, making accessible information that would otherwise be hidden by the infrastructure and software interfaces of a traditional database management system. We present the construction and organizational details for managing NOVA, the National Online Volumetric Archive. As an archival effort of the Visible Human Project for supporting medical visualization research, archiving 3D multimodal radiological teaching files, and enhancing medical education with volumetric data, our overall database structure is simplified; archives grow by accruing information, but seldom have to modify, delete, or overwrite stored records. NOVA is being constructed and populated so that it is transparent to the Internet; that is, much of its internal structure is mirrored in HTML allowing internet search engines to investigate, catalog, and link directly to the deep relational structure of the collection index. The key organizational concept for NOVA is the Image Content Group (ICG), an indexing strategy for cataloging incoming data as a set structure rather than by keyword management. These groups are managed through a series of XML files and authoring scripts. We cover the motivation for Image Content Groups, their overall construction, authorship, and management in XML, and the pilot results for creating public data repositories using this strategy.
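
    A minimal sketch of the kind of XML index entry and search-engine-visible HTML mirror the abstract describes for an Image Content Group; all element names, fields, and the dataset itself are invented for illustration.

        import xml.etree.ElementTree as ET

        # Invented Image Content Group (ICG) entry for an incoming volumetric dataset.
        icg = ET.Element("ImageContentGroup", attrib={"id": "icg-0042"})
        ET.SubElement(icg, "title").text = "Thoracic CT teaching case"
        ET.SubElement(icg, "modality").text = "CT"
        ET.SubElement(icg, "volume").text = "thorax_case_0042.raw"
        ET.ElementTree(icg).write("icg-0042.xml")

        # Mirror the same record as plain HTML so web search engines can index it.
        html = (
            "<html><body><h1>{}</h1><p>Modality: {}</p><p>File: {}</p></body></html>"
            .format(icg.find("title").text, icg.find("modality").text,
                    icg.find("volume").text)
        )
        with open("icg-0042.html", "w") as f:
            f.write(html)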

  17. Information systems requirements for the Microgravity Science and Applications Program

    NASA Technical Reports Server (NTRS)

    Kicza, M. E.; Kreer, J. R.

    1991-01-01

    NASA's Microgravity Science and Applications (MSAD) Program is presented. Additionally, the types of information produced within the program and the anticipated growth in information system requirements as the program transitions to Space Station Freedom utilization are discussed. Plans for payload operations support in the Freedom era are addressed, as well as current activities to define research community requirements for data and sample archives.

  18. Information systems requirements for the microgravity science and applications program

    NASA Technical Reports Server (NTRS)

    Kicza, M. E.; Kreer, J. R.

    1990-01-01

    NASA's Microgravity Science and Applications (MSAD) Program is presented. Additionally, the types of information produced within the program and the anticipated growth in information system requirements as the program transitions to Space Station Freedom utilization are discussed. Plans for payload operations support in the Freedom era are addressed, as well as current activities to define research community requirements for data and sample archives.

  19. Electronic signatures for long-lasting storage purposes in electronic archives.

    PubMed

    Pharow, Peter; Blobel, Bernd

    2005-03-01

    Communication and co-operation in healthcare and welfare require a certain set of trusted third party (TTP) services describing both status and relation of communicating principals as well as their corresponding keys and attributes. Additional TTP services are needed to provide trustworthy information about dynamic issues of communication and co-operation such as time and location of processes, workflow relations, and system behaviour. Legal and ethical requirements demand securely stored patient information and well-defined access rights. Among others, electronic signatures based on asymmetric cryptography are important means for securing the integrity of a message or file as well as for accountability purposes including non-repudiation of both origin and receipt. Electronic signatures along with certified time stamps or time signatures are especially important for electronic archives in general, electronic health records (EHR) in particular, and especially for typical purposes of long-lasting storage. Apart from technical storage problems (e.g. lifetime of the storage devices, interoperability of retrieval and presentation software), this paper identifies mechanisms of e.g. re-signing and re-stamping of data items, files, messages, sets of archived items or documents, archive structures, and even whole archives.
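
    The re-signing and re-stamping mechanisms identified above can be sketched with the Python cryptography package: the archive signs a record once and, years later, wraps the record, the old signature, and a timestamp into a new signature made with a current key. Key sizes, hash choices, and the absence of a real time-stamping authority are simplifications, not the paper's protocol.

        from datetime import datetime, timezone
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        def sign(key, payload: bytes) -> bytes:
            return key.sign(
                payload,
                padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                            salt_length=padding.PSS.MAX_LENGTH),
                hashes.SHA256(),
            )

        record = b"archived EHR document"

        # Original archival signature with the key in use at storage time.
        old_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        old_sig = sign(old_key, record)

        # Years later: re-sign record + old signature + a timestamp with a new key,
        # so integrity protection outlives the original key and algorithm choices.
        new_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
        timestamp = datetime.now(timezone.utc).isoformat().encode()
        new_sig = sign(new_key, record + old_sig + timestamp)

        print(len(old_sig), len(new_sig))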

  20. Records & Information Management Services | Alaska State Archives

    Science.gov Websites

    The Alaska State Archives' Records & Information Management Services (RIMS) provides records and information management for the State of Alaska.

  1. Restoration of Apollo Data by the Lunar Data Project/PDS Lunar Data Node: An Update

    NASA Technical Reports Server (NTRS)

    Williams, David R.; Hills, H. Kent; Taylor, Patrick T.; Grayzeck, Edwin J.; Guinness, Edward A.

    2016-01-01

    The Apollo 11, 12, and 14 through 17 missions orbited and landed on the Moon, carrying scientific instruments that returned data from all phases of the missions, including the long-lived Apollo Lunar Surface Experiments Packages (ALSEPs) deployed by the astronauts on the lunar surface. Much of these data were never archived, and some of the archived data were on media and in formats that are outmoded, or were deposited with little or no useful documentation to aid outside users. This is particularly true of the ALSEP data returned autonomously for many years after the Apollo missions ended. The purpose of the Lunar Data Project and the Planetary Data System (PDS) Lunar Data Node is to take data collections already archived at the NASA Space Science Data Coordinated Archive (NSSDCA) and prepare them for archiving through PDS, and to locate lunar data that were never archived, bring them into NSSDCA, and then archive them through PDS. Preparing these data for archiving involves reading the data from the original media, be it magnetic tape, microfilm, microfiche, or hard-copy document; converting the outmoded, often binary, formats when necessary; putting them into a standard digital form accepted by PDS; collecting the necessary ancillary data and documentation (metadata) to ensure that the data are usable and well-described; summarizing the metadata in documentation to be included in the data set; adding other information such as references, mission and instrument descriptions, contact information, and related documentation; and packaging the results in a PDS-compliant data set. The data set is then validated and reviewed by a group of external scientists as part of the PDS final archive process. We present a status report on some of the data sets that we are processing.

  2. 32 CFR 2001.48 - Loss, possible compromise or unauthorized disclosure.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... governments normally will not be advised of any security system vulnerabilities that contributed to the... INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Safeguarding § 2001.48 Loss, possible compromise or unauthorized disclosure. (a...

  3. GEOSPATICAL INFORMATION TECHNOLOGY AND INFORMATION MANAGEMENT QUALITY ASSURANCE

    EPA Science Inventory

    Most of the geospatial data in use are originated electronically. As a result, these data are acquired, stored, transformed, processed, presented, and archived electronically. The organized system of computer hardware and software used in these processes is called an Informatio...

  4. The MSFC Systems Engineering Guide: An Overview and Plan

    NASA Technical Reports Server (NTRS)

    Shelby, Jerry; Thomas, L. Dale

    2007-01-01

    This paper describes the guiding vision, progress to date, and the plan forward for the development of the Marshall Space Flight Center (MSFC) Systems Engineering Guide (SEG), a virtual systems engineering handbook and archive that describes the systems engineering processes used by MSFC in the development of ongoing complex space systems, such as the Ares launch vehicle, and forthcoming ones as well. It is the intent of this website to be a "One Stop Shop" for MSFC systems engineers that provides tutorial information, an overview of processes and procedures, links to assist systems engineering with guidance and references, and an archive of relevant systems engineering artifacts produced by the many NASA projects developed and managed by MSFC over the years.

  5. JNDMS Task Authorization 2 Report

    DTIC Science & Technology

    2013-10-01

    uses Barnyard to store alarms from all DREnet Snort sensors in a MySQL database. Barnyard is an open source tool designed to work with Snort to take...Technology ITI Information Technology Infrastructure J2EE Java 2 Enterprise Edition JAR Java Archive. This is an archive file format defined by Java ...standards. JDBC Java Database Connectivity JDW JNDMS Data Warehouse JNDMS Joint Network and Defence Management System JNDMS Joint Network Defence and

  6. An OAIS-Based Hospital Information System on the Cloud: Analysis of a NoSQL Column-Oriented Approach.

    PubMed

    Celesti, Antonio; Fazio, Maria; Romano, Agata; Bramanti, Alessia; Bramanti, Placido; Villari, Massimo

    2018-05-01

    The Open Archival Information System (OAIS) is a reference model for organizing people and resources in a system, and it is already adopted in care centers and medical systems to efficiently manage clinical data, medical personnel, and patients. Archival storage systems are typically implemented using traditional relational database systems, but relation-oriented technology strongly limits the efficiency of managing huge amounts of patients' clinical data, especially in emerging cloud-based environments, which are distributed. In this paper, we present an OAIS healthcare architecture useful for managing a huge amount of HL7 clinical documents in a scalable way. Specifically, it is based on a NoSQL column-oriented Database Management System deployed in the cloud, thus benefiting from big tables and wide rows available over a virtual distributed infrastructure. We developed a prototype of the proposed architecture at the IRCCS, and we evaluated its efficiency in a real case study.
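
    The paper does not name the specific column-oriented DBMS, so the sketch below uses Apache Cassandra (via the cassandra-driver package) purely as a representative example of storing HL7 documents in wide rows; the keyspace, table schema, and payload are invented.

        from cassandra.cluster import Cluster

        # Connect to a (hypothetical) Cassandra cluster backing the archival storage.
        cluster = Cluster(["127.0.0.1"])
        session = cluster.connect()

        session.execute(
            "CREATE KEYSPACE IF NOT EXISTS oais_archive WITH replication = "
            "{'class': 'SimpleStrategy', 'replication_factor': 1}"
        )
        session.execute(
            "CREATE TABLE IF NOT EXISTS oais_archive.hl7_documents ("
            "patient_id text, created timestamp, document text, "
            "PRIMARY KEY (patient_id, created))"
        )

        # One wide row per patient: all of a patient's documents live under one key.
        session.execute(
            "INSERT INTO oais_archive.hl7_documents (patient_id, created, document) "
            "VALUES (%s, toTimestamp(now()), %s)",
            ("000123", "MSH|^~\\&|...truncated HL7 payload..."),
        )
        cluster.shutdown()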

  7. Outcome of the First wwPDB Hybrid/Integrative Methods Task Force Workshop.

    PubMed

    Sali, Andrej; Berman, Helen M; Schwede, Torsten; Trewhella, Jill; Kleywegt, Gerard; Burley, Stephen K; Markley, John; Nakamura, Haruki; Adams, Paul; Bonvin, Alexandre M J J; Chiu, Wah; Peraro, Matteo Dal; Di Maio, Frank; Ferrin, Thomas E; Grünewald, Kay; Gutmanas, Aleksandras; Henderson, Richard; Hummer, Gerhard; Iwasaki, Kenji; Johnson, Graham; Lawson, Catherine L; Meiler, Jens; Marti-Renom, Marc A; Montelione, Gaetano T; Nilges, Michael; Nussinov, Ruth; Patwardhan, Ardan; Rappsilber, Juri; Read, Randy J; Saibil, Helen; Schröder, Gunnar F; Schwieters, Charles D; Seidel, Claus A M; Svergun, Dmitri; Topf, Maya; Ulrich, Eldon L; Velankar, Sameer; Westbrook, John D

    2015-07-07

    Structures of biomolecular systems are increasingly computed by integrative modeling that relies on varied types of experimental data and theoretical information. We describe here the proceedings and conclusions from the first wwPDB Hybrid/Integrative Methods Task Force Workshop held at the European Bioinformatics Institute in Hinxton, UK, on October 6 and 7, 2014. At the workshop, experts in various experimental fields of structural biology, experts in integrative modeling and visualization, and experts in data archiving addressed a series of questions central to the future of structural biology. How should integrative models be represented? How should the data and integrative models be validated? What data should be archived? How should the data and models be archived? What information should accompany the publication of integrative models? Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. NASA's Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.

    2015-01-01

    NASA's Earth Science Data Systems (ESDS) Program has evolved over the last two decades, and currently has several core and community components. Core components provide the basic operational capabilities to process, archive, manage and distribute data from NASA missions. Community components provide a path for peer-reviewed research in Earth Science Informatics to feed into the evolution of the core components. The Earth Observing System Data and Information System (EOSDIS) is a core component consisting of twelve Distributed Active Archive Centers (DAACs) and eight Science Investigator-led Processing Systems spread across the U.S. The presentation covers how the ESDS Program continues to evolve and benefits from as well as contributes to advances in Earth Science Informatics.

  9. 2016 update of the PRIDE database and its related tools

    PubMed Central

    Vizcaíno, Juan Antonio; Csordas, Attila; del-Toro, Noemi; Dianes, José A.; Griss, Johannes; Lavidas, Ilias; Mayer, Gerhard; Perez-Riverol, Yasset; Reisinger, Florian; Ternent, Tobias; Xu, Qing-Wei; Wang, Rui; Hermjakob, Henning

    2016-01-01

    The PRoteomics IDEntifications (PRIDE) database is one of the world-leading data repositories of mass spectrometry (MS)-based proteomics data. Since the beginning of 2014, PRIDE Archive (http://www.ebi.ac.uk/pride/archive/) has been the new PRIDE archival system, replacing the original PRIDE database. Here we summarize the developments in PRIDE resources and related tools since the previous update manuscript in the Database Issue in 2013. PRIDE Archive constitutes a complete redevelopment of the original PRIDE, comprising a new storage backend, data submission system and web interface, among other components. PRIDE Archive supports the most widely used PSI (Proteomics Standards Initiative) data standard formats (mzML and mzIdentML) and implements the data requirements and guidelines of the ProteomeXchange Consortium. The wide adoption of ProteomeXchange within the community has triggered an unprecedented increase in the number of submitted data sets (around 150 data sets per month). We outline some statistics on the current PRIDE Archive data contents. We also report on the status of the PRIDE-related stand-alone tools: PRIDE Inspector, PRIDE Converter 2 and the ProteomeXchange submission tool. Finally, we give a brief update on the resources under development, ‘PRIDE Cluster’ and ‘PRIDE Proteomes’, which provide a complementary view and quality-scored information of the peptide and protein identification data available in PRIDE Archive. PMID:26527722

  10. St. Petersburg Coastal and Marine Science Center's Core Archive Portal

    USGS Publications Warehouse

    Reich, Chris; Streubert, Matt; Dwyer, Brendan; Godbout, Meg; Muslic, Adis; Umberger, Dan

    2012-01-01

    This Web site contains information on rock cores archived at the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center (SPCMSC). Archived cores consist of 3- to 4-inch-diameter coral cores, 1- to 2-inch-diameter rock cores, and a few unlabeled loose coral and rock samples. This document - and specifically the archive Web site portal - is intended to be a 'living' document that will be updated continually as additional cores are collected and archived. This document may also contain future references and links to a catalog of sediment cores. Sediment cores will include vibracores, pushcores, and other loose sediment samples collected for research purposes. This document will: (1) serve as a database for locating core material currently archived at the USGS SPCMSC facility; (2) provide a protocol for entry of new core material into the archive system; and, (3) set the procedures necessary for checking out core material for scientific purposes. Core material may be loaned to other governmental agencies, academia, or non-governmental organizations at the discretion of the USGS SPCMSC curator.

  11. CDDIS Data Center Summary for the IVS 2012 Annual Report

    NASA Technical Reports Server (NTRS)

    Noll, Carey

    2013-01-01

    This report summarizes activities during 2012 and future plans of the Crustal Dynamics Data Information System (CDDIS) with respect to the International VLBI Service for Geodesy and Astrometry (IVS). Included in this report are background information about the CDDIS, the computer architecture, staff supporting the system, archive contents, and future plans for the CDDIS within the IVS.

  12. 75 FR 63141 - Information Collection; Research Data Archive Use Tracking

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ..., filing of petitions and applications and agency statements of organization and functions are examples... Information Collection; Research Data Archive Use Tracking AGENCY: Forest Service, USDA. ACTION: Notice... information collection, Research Data Archive Use Tracking. DATES: Comments must be received in writing on or...

  13. OASIS: A Data Fusion System Optimized for Access to Distributed Archives

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Kong, M.; Good, J. C.

    2002-05-01

    The On-Line Archive Science Information Services (OASIS) is accessible as a Java applet through the NASA/IPAC Infrared Science Archive home page. It uses Geographical Information System (GIS) technology to provide data fusion and interaction services for astronomers. These services include the ability to process and display arbitrarily large image files, and user-controlled contouring, overlay regeneration and multi-table/image interactions. OASIS has been optimized for access to distributed archives and data sets. Its second release (June 2002) provides a mechanism that enables access to OASIS from "third-party" services and data providers. That is, any data provider who creates a query form to an archive containing a collection of data (images, catalogs, spectra) can direct the result files from the query into OASIS. Similarly, data providers who serve links to datasets or remote services on a web page can access all of these data with one instance of OASIS. In this way any data or service provider is given access to the full suite of capabilities of OASIS. We illustrate the "third-party" access feature with two examples: queries to the high-energy image datasets accessible from GSFC SkyView, and links to data that are returned from a target-based query to the NASA Extragalactic Database (NED). The second release of OASIS also includes a file-transfer manager that reports the status of multiple data downloads from remote sources to the client machine. It is a prototype for a request management system that will ultimately control and manage compute-intensive jobs submitted through OASIS to computing grids, such as requests for large-scale image mosaics and bulk statistical analysis.

  14. The NASA Distributed Active Archive Center Experience in Providing Trustworthy Digital Repositories

    NASA Astrophysics Data System (ADS)

    de Sherbinin, A. M.; Downs, R. R.; Chen, R. S.

    2017-12-01

    Since the early 1990s, NASA's Earth Observing System Data and Information System (EOSDIS) has supported between 10 and 12 discipline-specific Distributed Active Archive Centers (DAACs) that have provided long-term preservation of Earth Science data records, particularly from satellite and airborne remote sensing. The focus of this presentation is on two of the DAACs - the Socioeconomic Data and Applications Center (SEDAC) and the Oak Ridge National Laboratory (ORNL) DAAC - that provide archiving and dissemination of third-party data sets. The presentation describes the community of interest for these two DAACs, their data management practices, and the benefits of certification to the DAACs and their user communities. It also describes the organizational, technical, financial, and legal challenges to providing trustworthy long-term data stewardship.

  15. [Electronic poison information management system].

    PubMed

    Kabata, Piotr; Waldman, Wojciech; Kaletha, Krystian; Sein Anand, Jacek

    2013-01-01

    We describe the deployment of an electronic toxicological information database in the poison control center of the Pomeranian Center of Toxicology. The system was based on Google Apps technology by Google Inc., using electronic, web-based forms and data tables. During the first 6 months after deployment, we used it to archive 1471 poisoning cases, prepare monthly poisoning reports, and facilitate statistical analysis of the data. Use of the electronic database made the Poison Center's work much easier.

  16. 28 CFR 74.5 - Identification of eligible persons.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... establish an information system with names and other identifying information of potentially eligible individuals from the following sources: (1) Official sources: (i) The National Archives; (ii) The Department... libraries; (vi) State and local libraries; (vii) State and local historical societies; (viii) State and...

  17. Development of geotechnical analysis and design modules for the Virginia Department of Transportation's geotechnical database.

    DOT National Transportation Integrated Search

    2005-01-01

    In 2003, an Internet-based Geotechnical Database Management System (GDBMS) was developed for the Virginia Department of Transportation (VDOT) using distributed Geographic Information System (GIS) methodology for data management, archival, retrieval, ...

  18. AVIRIS and TIMS data processing and distribution at the land processes distributed active archive center

    NASA Technical Reports Server (NTRS)

    Mah, G. R.; Myers, J.

    1993-01-01

    The U.S. Government has initiated the Global Change Research Program, a systematic study of the Earth as a complete system. NASA's contribution to the Global Change Research Program is the Earth Observing System (EOS), a series of orbital sensor platforms and an associated data processing and distribution system. The EOS Data and Information System (EOSDIS) is the archiving, production, and distribution system for data collected by the EOS space segment and uses a multilayer architecture for processing, archiving, and distributing EOS data. The first layer consists of the spacecraft ground stations and processing facilities that receive the raw data from the orbiting platforms and then separate the data by individual sensors. The second layer consists of Distributed Active Archive Centers (DAAC) that process, distribute, and archive the sensor data. The third layer consists of a user science processing network. The EOSDIS is being developed in a phased implementation. The initial phase, Version 0, is a prototype of the operational system. Version 0 activities are based upon existing systems and are designed to provide an EOSDIS-like capability for information management and distribution. An important science support task is the creation of simulated data sets for EOS instruments from precursor aircraft or satellite data. The Land Processes DAAC, at the EROS Data Center (EDC), is responsible for archiving and processing EOS precursor data from airborne instruments such as the Thermal Infrared Multispectral Scanner (TIMS), the Thematic Mapper Simulator (TMS), and the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS). AVIRIS, TIMS, and TMS are flown by the NASA-Ames Research Center (ARC) on an ER-2. The ER-2 flies at 65,000 feet and can carry up to three sensors simultaneously. Most jointly collected data sets are somewhat boresighted and roughly registered. The instrument data are being used to construct data sets that simulate the spectral and spatial characteristics of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument scheduled to be flown on the first EOS-AM spacecraft. The ASTER is designed to acquire 14 channels of land science data in the visible and near-IR (VNIR), shortwave-IR (SWIR), and thermal-IR (TIR) regions from 0.52 micron to 11.65 micron at high spatial resolutions of 15 m to 90 m. Stereo data will also be acquired in the VNIR region in a single band. The AVIRIS and TMS cover the ASTER VNIR and SWIR bands, and the TIMS covers the TIR bands. Simulated ASTER data sets have been generated over Death Valley, California; Cuprite, Nevada; and the Drum Mountains, Utah, using a combination of AVIRIS, TIMS, and TMS data, and existing digital elevation models (DEM) for the topographic information.

  19. Archive of chirp seismic reflection data collected during USGS cruises 00SCC02 and 00SCC04, Barataria Basin, Louisiana, May 12-31 and June 17-July 2, 2000

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, S.V.; Kindinger, J.L.; Flocks, J.G.; Wiese, D.S.; Kulp, Mark; Penland, Shea; Britsch, L.D.; Brooks, G.R.

    2003-01-01

    This archive consists of two-dimensional marine seismic reflection profile data collected in the Barataria Basin of southern Louisiana. These data were acquired in May, June, and July of 2000 aboard the R/V G.K. Gilbert. Included here are data in a variety of formats including binary, American Standard Code for Information Interchange (ASCII), Hyper-Text Markup Language (HTML), shapefiles, and Graphics Interchange Format (GIF) and Joint Photographic Experts Group (JPEG) images. Binary data are in Society of Exploration Geophysicists (SEG) SEG-Y format and may be downloaded for further processing or display. Reference maps and GIF images of the profiles may be viewed with a web browser. The Geographic Information Systems (GIS) information provided here is compatible with Environmental Systems Research Institute (ESRI) GIS software.
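
    For readers unfamiliar with the SEG-Y format mentioned above, the sketch below shows one way to inspect a downloaded profile using only the Python standard library; the file name is a placeholder, and it assumes the conventional 3200-byte EBCDIC textual header followed by a 400-byte binary file header at the start of the file.

        # Sketch: peek at the headers of a SEG-Y file from the archive. Classic
        # SEG-Y starts with a 3200-byte EBCDIC textual header (40 "card images"
        # of 80 characters) followed by a 400-byte binary file header. The file
        # name below is a placeholder, not an actual archive file.
        import struct

        def read_segy_headers(path):
            with open(path, 'rb') as f:
                textual = f.read(3200).decode('cp500')   # EBCDIC -> str
                binary = f.read(400)
            cards = [textual[i:i + 80] for i in range(0, 3200, 80)]
            # Bytes 3217-3218 and 3225-3226 (1-based) of the file hold the sample
            # interval and the data sample format code, big-endian 16-bit integers.
            sample_interval_us, = struct.unpack('>h', binary[16:18])
            format_code, = struct.unpack('>h', binary[24:26])
            return cards, sample_interval_us, format_code

        cards, dt_us, fmt = read_segy_headers('example_profile.sgy')
        print('\n'.join(cards[:5]))
        print(f'sample interval: {dt_us} microseconds, format code: {fmt}')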

  20. Recognition techniques for extracting information from semistructured documents

    NASA Astrophysics Data System (ADS)

    Della Ventura, Anna; Gagliardi, Isabella; Zonta, Bruna

    2000-12-01

    Archives of optical documents are increasingly widely employed, a demand driven also by new norms sanctioning the legal value of digital documents, provided they are stored on physically unalterable media. On the supply side there is now a vast and technologically advanced market, where optical memories have solved the problem of the duration and permanence of data at costs comparable to those for magnetic memories. The remaining bottleneck in these systems is the indexing. The indexing of documents with a variable structure, while still not completely automated, can be machine-supported to a large degree, with evident advantages both in the organization of the work and in extracting information, providing data that are much more detailed and potentially significant for the user. We present here a system for the automatic registration of correspondence to and from a public office. The system is based on a general methodology for the extraction, indexing, archiving, and retrieval of significant information from semi-structured documents. This information, in our prototype application, is distributed among the database fields of sender, addressee, subject, date, and body of the document.

  1. SpaceOps 1992: Proceedings of the Second International Symposium on Ground Data Systems for Space Mission Operations

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Second International Symposium featured 135 oral presentations in these 12 categories: Future Missions and Operations; System-Level Architectures; Mission-Specific Systems; Mission and Science Planning and Sequencing; Mission Control; Operations Automation and Emerging Technologies; Data Acquisition; Navigation; Operations Support Services; Engineering Data Analysis of Space Vehicle and Ground Systems; Telemetry Processing, Mission Data Management, and Data Archiving; and Operations Management. Topics focused on improvements in the productivity, effectiveness, efficiency, and quality of mission operations, ground systems, and data acquisition. Also emphasized were accomplishments in management of human factors; use of information systems to improve data retrieval, reporting, and archiving; design and implementation of logistics support for mission operations; and the use of telescience and teleoperations.

  2. Problem of data quality and the limitations of the infrastructure approach

    NASA Astrophysics Data System (ADS)

    Behlen, Fred M.; Sayre, Richard E.; Rackus, Edward; Ye, Dingzhong

    1998-07-01

    The 'Infrastructure Approach' is a PACS implementation methodology wherein the archive, network and information systems interfaces are acquired first, and workstations are installed later. The approach allows building a history of archived image data, so that most prior examinations are available in digital form when workstations are deployed. A limitation of the Infrastructure Approach is that the deferred use of digital image data defeats many data quality management functions that are provided automatically by human mechanisms when data is immediately used for the completion of clinical tasks. If the digital data is used solely for archiving while reports are interpreted from film, the radiologist serves only as a check against lost films, and another person must be designated as responsible for the quality of the digital data. Data from the Radiology Information System and the PACS were analyzed to assess the nature and frequency of system and data quality errors. The error level was found to be acceptable if supported by auditing and error resolution procedures requiring additional staff time, and in any case was better than the loss rate of a hardcopy film archive. It is concluded that the problem of data quality compromises but does not negate the value of the Infrastructure Approach. The Infrastructure Approach should best be employed only to a limited extent, and that any phased PACS implementation should have a substantial complement of workstations dedicated to softcopy interpretation for at least some applications, and with full deployment following not long thereafter.

  3. Directory of Unesco Information Services: Library, Archives, and Documentation Centres = Repertoire des Services d'information de l'Unesco: bibliotheque, archives et centres de documentation.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific and Cultural Organization, Paris (France). Div. of Unesco Information Services.

    Although primarily a directory of Unesco documentation centers and information units, this guide also provides information on the Main Library and the Unesco Archives. The listing for each of the nine centers includes information on any subdivisions of the center: (1) Bureau for Co-ordination of Operational Activities (BAO); (2) Culture and…

  4. Facilitating Navigation Through Large Archives

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Smith, Stephanie L.; Troung, Dat; Hodgson, Terry R.

    2005-01-01

    Automated Visual Access (AVA) is a computer program that effectively makes a large collection of information visible in a manner that enables a user to quickly and efficiently locate information resources, with minimal need for conventional keyword searches and perusal of complex hierarchical directory systems. AVA includes three key components: (1) a taxonomy that comprises a collection of words and phrases, clustered according to meaning, that are used to classify information resources; (2) a statistical indexing and scoring engine; and (3) a component that generates a graphical user interface that uses the scoring data to generate a visual map of resources and topics. The top level of an AVA display is a pictorial representation of an information archive. The user enters the depicted archive by either clicking on a depiction of a subject-area cluster, selecting a topic from a list, or entering a query into a text box. The resulting display enables the user to view candidate information entities at various levels of detail. Resources are grouped spatially by topic with greatest generality at the top layer and increasing detail with depth. The user can zoom in or out of specific sites or into greater or lesser content detail.
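
    The statistical indexing and scoring engine is described only at a high level; as a rough illustration of the kind of term scoring such an engine might perform, here is a minimal TF-IDF sketch in Python. The scoring scheme and example documents are assumptions, not taken from AVA.

        # Rough sketch of term scoring for resources in an archive, assuming a
        # simple TF-IDF scheme; AVA's actual scoring engine is not documented here.
        import math
        from collections import Counter

        def tf_idf_scores(documents):
            """documents: dict mapping resource id -> list of lower-cased tokens."""
            n_docs = len(documents)
            doc_freq = Counter()
            for tokens in documents.values():
                doc_freq.update(set(tokens))
            scores = {}
            for doc_id, tokens in documents.items():
                counts = Counter(tokens)
                scores[doc_id] = {
                    term: (count / len(tokens)) * math.log(n_docs / doc_freq[term])
                    for term, count in counts.items()
                }
            return scores

        docs = {
            'report-1': 'archive data archive access'.split(),
            'report-2': 'mission planning data'.split(),
        }
        print(tf_idf_scores(docs)['report-1']['archive'])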

  5. Improving Access to NASA Earth Science Data through Collaborative Metadata Curation

    NASA Astrophysics Data System (ADS)

    Sisco, A. W.; Bugbee, K.; Shum, D.; Baynes, K.; Dixon, V.; Ramachandran, R.

    2017-12-01

    The NASA-developed Common Metadata Repository (CMR) is a high-performance metadata system that currently catalogs over 375 million Earth science metadata records. It serves as the authoritative metadata management system of NASA's Earth Observing System Data and Information System (EOSDIS), enabling NASA Earth science data to be discovered and accessed by a worldwide user community. The size of the EOSDIS data archive is steadily increasing, and the ability to manage and query this archive depends on the input of high quality metadata to the CMR. Metadata that does not provide adequate descriptive information diminishes the CMR's ability to effectively find and serve data to users. To address this issue, an innovative and collaborative review process is underway to systematically improve the completeness, consistency, and accuracy of metadata for approximately 7,000 data sets archived by NASA's twelve EOSDIS data centers, or Distributed Active Archive Centers (DAACs). The process involves automated and manual metadata assessment of both collection and granule records by a team of Earth science data specialists at NASA Marshall Space Flight Center. The team communicates results to DAAC personnel, who then make revisions and reingest improved metadata into the CMR. Implementation of this process relies on a network of interdisciplinary collaborators leveraging a variety of communication platforms and long-range planning strategies. Curating metadata at this scale and resolving metadata issues through community consensus improves the CMR's ability to serve current and future users and also introduces best practices for stewarding the next generation of Earth Observing System data. This presentation will detail the metadata curation process, its outcomes thus far, and also share the status of ongoing curation activities.

  6. Scientific and Technological Information in Transactional Files in Government Records and Archives: A RAMP Study.

    ERIC Educational Resources Information Center

    Wimalaratne, K. D. G.

    This long-term Records and Archives Administration Programme (RAMP) study is designed to assist archivists, records managers, and information specialists in identifying for current use and possible archival selection those transactional or case files that contain scientific and technical information (STI), particularly in those instances where…

  7. THE NEW ONLINE METADATA EDITOR FOR GENERATING STRUCTURED METADATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devarakonda, Ranjeet; Shrestha, Biva; Palanisamy, Giri

    Nobody is better suited to describe data than the scientist who created it. This description of the data is called metadata. In general terms, metadata represents the who, what, when, where, why and how of the dataset [1]. eXtensible Markup Language (XML) is the preferred output format for metadata, as it makes it portable and, more importantly, suitable for system discoverability. The newly developed ORNL Metadata Editor (OME) is a Web-based tool that allows users to create and maintain XML files containing key information, or metadata, about the research. Metadata include information about the specific projects, parameters, time periods, and locations associated with the data. Such information helps put the research findings in context. In addition, the metadata produced using OME will allow other researchers to find these data via metadata clearinghouses like Mercury [2][4]. OME is part of ORNL's Mercury software fleet [2][3]. It was jointly developed to support projects funded by the United States Geological Survey (USGS), U.S. Department of Energy (DOE), National Aeronautics and Space Administration (NASA) and National Oceanic and Atmospheric Administration (NOAA). OME's architecture provides a customizable interface to support project-specific requirements. Using this new architecture, the ORNL team developed OME instances for USGS's Core Science Analytics, Synthesis, and Libraries (CSAS&L), DOE's Next Generation Ecosystem Experiments (NGEE) and Atmospheric Radiation Measurement (ARM) Program, and the international Surface Ocean Carbon Dioxide ATlas (SOCAT). Researchers simply use the ORNL Metadata Editor to enter relevant metadata into a Web-based form. From the information on the form, the Metadata Editor can create an XML file on the server where the editor is installed or on the user's personal computer. Researchers can also use the ORNL Metadata Editor to modify existing XML metadata files. As an example, NGEE Arctic scientists use OME to register their datasets with the NGEE data archive, which allows the NGEE archive to publish these datasets via a data search portal (http://ngee.ornl.gov/data). These highly descriptive metadata created using OME allow the Archive to enable advanced data search options using keyword, geo-spatial, temporal and ontology filters. Similarly, ARM OME allows scientists or principal investigators (PIs) to submit their data products to the ARM data archive. How would OME help Big Data Centers like the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC)? The ORNL DAAC is one of NASA's Earth Observing System Data and Information System (EOSDIS) data centers managed by the Earth Science Data and Information System (ESDIS) Project. The ORNL DAAC archives data produced by NASA's Terrestrial Ecology Program. The DAAC provides data and information relevant to biogeochemical dynamics, ecological data, and environmental processes, critical for understanding the dynamics relating to the biological, geological, and chemical components of the Earth's environment. Typically, the data produced, archived, and analyzed are at a scale of multiple petabytes, which makes the discoverability of the data very challenging. Without proper metadata associated with the data, it is difficult to find the data you are looking for and equally difficult to use and understand the data.
OME will allow data centers like the NGEE and ORNL DAAC to produce meaningful, high-quality, standards-based, descriptive information about their data products, in turn helping with data discoverability and interoperability.
Useful links: USGS OME: http://mercury.ornl.gov/OME/ ; NGEE OME: http://ngee-arctic.ornl.gov/ngeemetadata/ ; ARM OME: http://archive2.ornl.gov/armome/
Contact: Ranjeet Devarakonda (devarakondar@ornl.gov)
References: [1] Federal Geographic Data Committee. Content standard for digital geospatial metadata. Federal Geographic Data Committee, 1998. [2] Devarakonda, Ranjeet, et al. "Mercury: reusable metadata management, data discovery and access system." Earth Science Informatics 3.1-2 (2010): 87-94. [3] Wilson, B. E., Palanisamy, G., Devarakonda, R., Rhyne, B. T., Lindsley, C., & Green, J. (2010). Mercury Toolset for Spatiotemporal Metadata. [4] Pouchard, L. C., Branstetter, M. L., Cook, R. B., Devarakonda, R., Green, J., Palanisamy, G., ... & Noy, N. F. (2013). A Linked Science investigation: enhancing climate change data discovery with semantic technologies. Earth Science Informatics, 6(3), 175-185.
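
    Since OME's output format is XML, a minimal sketch of the kind of metadata record such a tool might emit is shown below, using Python's standard xml.etree.ElementTree; the element names are illustrative only and do not reproduce the actual OME or FGDC schema.

        # Sketch: writing a small XML metadata record of the kind a metadata
        # editor might produce. Element names are illustrative, not the actual
        # OME or FGDC schema.
        import xml.etree.ElementTree as ET

        def build_metadata(title, investigator, start, end, bbox, keywords):
            root = ET.Element('metadata')
            ET.SubElement(root, 'title').text = title
            ET.SubElement(root, 'investigator').text = investigator
            coverage = ET.SubElement(root, 'temporalCoverage')
            ET.SubElement(coverage, 'start').text = start
            ET.SubElement(coverage, 'end').text = end
            spatial = ET.SubElement(root, 'boundingBox')
            for name, value in zip(('west', 'east', 'south', 'north'), bbox):
                ET.SubElement(spatial, name).text = str(value)
            kw = ET.SubElement(root, 'keywords')
            for word in keywords:
                ET.SubElement(kw, 'keyword').text = word
            return ET.ElementTree(root)

        tree = build_metadata('Soil moisture, site A', 'J. Doe',
                              '2014-06-01', '2014-09-30',
                              (-157.5, -156.9, 70.9, 71.4), ['soil', 'Arctic'])
        tree.write('example_metadata.xml', encoding='utf-8', xml_declaration=True)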

  8. 15 CFR 950.2 - Environmental Data and Information Service (EDIS).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., archives, analyzes, and disseminates worldwide environmental (atmospheric, marine, solar, and solid Earth... other economic systems; and manages or provides functional guidance for NOAA's scientific and technical...

  9. 15 CFR 950.2 - Environmental Data and Information Service (EDIS).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., archives, analyzes, and disseminates worldwide environmental (atmospheric, marine, solar, and solid Earth... other economic systems; and manages or provides functional guidance for NOAA's scientific and technical...

  10. Development of a Web-Enabled Informatics Platform for Manipulation of Gene Expression Data

    DTIC Science & Technology

    2004-12-01

    genomic platforms such as metabolomics and proteomics , and to federated databases for knowledge management. A successful SBIR Phase I completed...measurements that require sophisticated bioinformatic platforms for data archival, management, integration, and analysis if researchers are to derive...web-enabled bioinformatic platform consisting of a Laboratory Information Management System (LIMS), an Analysis Information Management System (AIMS

  11. CDDIS Data Center Summary for the 2004 IVS Annual Report

    NASA Technical Reports Server (NTRS)

    Noll, Carey

    2005-01-01

    This report summarizes activities during the year 2004 and future plans of the Crustal Dynamics Data Information System (CDDIS) with respect to the International VLBI Service for Geodesy and Astrometry (IVS). Included in this report are background information about the CDDIS, the computer architecture, staff supporting the system, archive contents, and future plans for the CDDIS within the IVS.

  12. CDDIS Data Center Summary for the 2003 IVS Annual Report

    NASA Technical Reports Server (NTRS)

    Noll, Carey

    2004-01-01

    This report summarizes activities during the year 2003 and future plans of the Crustal Dynamics Data Information System (CDDIS) with respect to the International VLBI Service for Geodesy and Astrometry (IVS). Included in this report are background information about the CDDIS, the computer architecture, staff supporting the system, archive contents, and future plans for the CDDIS within the IVS.

  13. Life Sciences Data Archive (LSDA) in the Post-Shuttle Era

    NASA Technical Reports Server (NTRS)

    Fitts, Mary A.; Johnson-Throop, Kathy; Havelka, Jacque; Thomas, Diedre

    2009-01-01

    Now, more than ever before, NASA is realizing the value and importance of its intellectual assets. Principles of knowledge management, the systematic use and reuse of information/experience/expertise to achieve a specific goal, are being applied throughout the agency. LSDA is also applying these solutions, which rely on a combination of content and collaboration technologies, to enable research teams to create, capture, share, and harness knowledge to do the things they do well, even better. In the early days of spaceflight, space life sciences data were collected and stored in numerous databases, formats, media types and geographical locations. These data were largely unknown/unavailable to the research community. The Biomedical Informatics and Health Care Systems Branch of the Space Life Sciences Directorate at JSC and the Data Archive Project at ARC, with funding from the Human Research Program through the Exploration Medical Capability Element, are fulfilling these requirements through the systematic population of the Life Sciences Data Archive. This project constitutes a formal system for the acquisition, archival and distribution of data for HRP-related experiments and investigations. The general goal of the archive is to acquire, preserve, and distribute these data and be responsive to inquiries from the science communities.

  14. FBIS: A regional DNA barcode archival & analysis system for Indian fishes.

    PubMed

    Nagpure, Naresh Sahebrao; Rashid, Iliyas; Pathak, Ajey Kumar; Singh, Mahender; Singh, Shri Prakash; Sarkar, Uttam Kumar

    2012-01-01

    DNA barcode is a new tool for taxon recognition and classification of biological organisms based on sequence of a fragment of mitochondrial gene, cytochrome c oxidase I (COI). In view of the growing importance of the fish DNA barcoding for species identification, molecular taxonomy and fish diversity conservation, we developed a Fish Barcode Information System (FBIS) for Indian fishes, which will serve as a regional DNA barcode archival and analysis system. The database presently contains 2334 sequence records of COI gene for 472 aquatic species belonging to 39 orders and 136 families, collected from available published data sources. Additionally, it contains information on phenotype, distribution and IUCN Red List status of fishes. The web version of FBIS was designed using MySQL, Perl and PHP under Linux operating platform to (a) store and manage the acquisition (b) analyze and explore DNA barcode records (c) identify species and estimate genetic divergence. FBIS has also been integrated with appropriate tools for retrieving and viewing information about the database statistics and taxonomy. It is expected that FBIS would be useful as a potent information system in fish molecular taxonomy, phylogeny and genomics. The database is available for free at http://mail.nbfgr.res.in/fbis/
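
    The abstract mentions tools to estimate genetic divergence between COI barcode records; as a hedged illustration of the simplest such measure, here is an uncorrected p-distance computation in plain Python. FBIS itself is implemented in Perl/PHP/MySQL, and its actual divergence method is not specified here; the example sequences are made up.

        # Sketch: uncorrected p-distance between two aligned COI barcode sequences,
        # the simplest divergence measure a barcode system might report. Example
        # sequences are made up; FBIS's actual method may differ (e.g., K2P).
        def p_distance(seq_a: str, seq_b: str) -> float:
            """Fraction of differing sites, ignoring alignment gaps and Ns."""
            if len(seq_a) != len(seq_b):
                raise ValueError('sequences must be aligned to the same length')
            compared = differing = 0
            for a, b in zip(seq_a.upper(), seq_b.upper()):
                if a in 'ACGT' and b in 'ACGT':
                    compared += 1
                    if a != b:
                        differing += 1
            return differing / compared if compared else float('nan')

        print(p_distance('ACGTACGTACGT', 'ACGTACCTACGA'))  # -> 0.1666...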

  15. Quick-look guide to the crustal dynamics project's data information system

    NASA Technical Reports Server (NTRS)

    Noll, Carey E.; Behnke, Jeanne M.; Linder, Henry G.

    1987-01-01

    Described are the contents of the Crustal Dynamics Project Data Information System (DIS) and instructions on the use of this facility. The main purpose of the DIS is to store all geodetic data products acquired by the Project in a central data bank and to maintain information about the archive of all Project-related data. Access and use of the DIS menu-driven system is described as well as procedures for contacting DIS staff and submitting data requests.

  16. JPL, NASA and the Historical Record: Key Events/Documents in Lunar and Mars Exploration

    NASA Technical Reports Server (NTRS)

    Hooks, Michael Q.

    1999-01-01

    This document represents a presentation about the Jet Propulsion Laboratory (JPL) historical archives in the area of Lunar and Martian Exploration. The JPL archives document the history of JPL's flight projects, research and development activities, and administrative operations. The archives are in a variety of formats. The presentation reviews the information available through the JPL archives web site, information available through the Regional Planetary Image Facility web site, and the information on past missions available through these web sites. The presentation also reviews the NASA historical resources at the NASA History Office and the National Archives and Records Administration.

  17. 77 FR 36297 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-18

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... Reduction Act Comments (NHP), Room 4400, National Archives and Records Administration, 8601 Adelphi Rd...

  18. 76 FR 72449 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-23

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... Reduction Act Comments (ISP), Room 4400, National Archives and Records Administration, 8601 Adelphi Rd...

  19. Picture Archiving And Communication Systems (PACS): Introductory Systems Analysis Considerations

    NASA Astrophysics Data System (ADS)

    Hughes, Simon H. C.

    1983-05-01

    Two fundamental problems face any hospital or radiology department that is thinking about installing a Picture Archiving and Communications System (PACS). First, though the need for PACS already exists, much of the relevant technology is just beginning to be developed. Second, the requirements of each hospital are different, so that any attempts to market a single PACS design for use in large numbers of hospitals are likely to meet with the same problems as were experienced with general-purpose Hospital Information Systems. This paper outlines some of the decision processes involved in arriving at specifications for each module of a PACS and indicates design principles which should be followed in order to meet individual hospital requirements, while avoiding the danger of short-term systems obsolescence.

  20. Goddard Atmospheric Composition Data Center: Aura Data and Services in One Place

    NASA Technical Reports Server (NTRS)

    Leptoukh, G.; Kempler, S.; Gerasimov, I.; Ahmad, S.; Johnson, J.

    2005-01-01

    The Goddard Atmospheric Composition Data and Information Services Center (AC-DISC) is a portal to an Atmospheric Composition-specific, user-driven, multi-sensor, on-line, easy-access archive and distribution system employing data analysis and visualization, data mining, and other user-requested techniques for better use of science data. It provides convenient access to Atmospheric Composition data and information from various remote-sensing missions, from TOMS, UARS, MODIS, and AIRS, to the most recent data from Aura OMI, MLS, HIRDLS (once these datasets are released to the public), as well as Atmospheric Composition datasets residing at other remote archive sites.

  1. 76 FR 29012 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-19

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... Personnel Records Center (NPRC) of the National Archives and Records Administration (NARA) administers...

  2. 75 FR 66802 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice...: NATF 81, National Archives Order for Copies of Ship Passenger Arrival Records; NATF 82, National...

  3. Audit of a Scientific Data Center for Certification as a Trustworthy Digital Repository: A Case Study

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2011-12-01

    Services that preserve and enable future access to scientific data are necessary to ensure that the data that are being collected today will be available for use by future generations of scientists. Many data centers, archives, and other digital repositories are working to improve their ability to serve as long-term stewards of scientific data. Trust in sustainable data management and preservation capabilities of digital repositories can influence decisions to use these services to deposit or obtain scientific data. Building on the Open Archival Information System (OAIS) Reference Model developed by the Consultative Committee for Space Data Systems (CCSDS) and adopted by the International Organization for Standardization as ISO 14721:2003, new standards are being developed to improve long-term data management processes and documentation. The Draft Information Standard ISO/DIS 16363, "Space data and information transfer systems - Audit and certification of trustworthy digital repositories" offers the potential to evaluate digital repositories objectively in terms of their trustworthiness as long-term stewards of digital resources. In conjunction with this, the CCSDS and ISO are developing another draft standard for the auditing and certification process, ISO/DIS 16919, "Space data and information transfer systems - Requirements for bodies providing audit and certification of candidate trustworthy digital repositories". Six test audits were conducted of scientific data centers and archives in Europe and the United States to test the use of these draft standards and identify potential improvements for the standards and for the participating digital repositories. We present a case study of the test audit conducted on the NASA Socioeconomic Data and Applications Center (SEDAC) and describe the preparation, the audit process, recommendations received, and next steps to obtain certification as a trustworthy digital repository, after approval of the ISO/DIS standards.

  4. Status of Mars Global Surveyor Science Data Archives

    NASA Technical Reports Server (NTRS)

    Slavney, S.; Arvidson, R. E.; Guinness, E. A.; Springer, R. J.

    2001-01-01

    The Mars Global Surveyor has been in orbit around Mars since September 1997, completing its primary mission on January 31, 2001. As of that date the spacecraft had completed more than 8000 mapping orbits. Data from its science instruments, radio science experiment, and SPICE files have been released regularly to the NASA Planetary Data System (PDS) as described in the MGS Archive Plan and Addendum and are available online. Additional information is contained in the original extended abstract.

  5. Using natural archives to detect climate and environmental tipping points in the Earth System

    NASA Astrophysics Data System (ADS)

    Thomas, Zoë A.

    2016-11-01

    'Tipping points' in the Earth system are characterised by a nonlinear response to gradual forcing, and may have severe and wide-ranging impacts. Many abrupt events result from simple underlying system dynamics termed 'critical transitions' or 'bifurcations'. One of the best ways to identify and potentially predict threshold behaviour in the climate system is through analysis of natural ('palaeo') archives. Specifically, on the approach to a tipping point, early warning signals can be detected as characteristic fluctuations in a time series as a system loses stability. Testing whether these early warning signals can be detected in highly complex real systems is a key challenge, since much work is either theoretical or only tested with simple models. This is particularly problematic in palaeoclimate and palaeoenvironmental records with low resolution, non-equidistant data, which can limit accurate analysis. Here, a range of different datasets are examined to explore generic rules that can be used to detect such dramatic events. A number of key criteria are identified to be necessary for the reliable identification of early warning signals in natural archives, most crucially, the need for a low-noise record of sufficient data length, resolution and accuracy. A deeper understanding of the underlying system dynamics is required to inform the development of more robust system-specific indicators, or to indicate the temporal resolution required, given a known forcing. This review demonstrates that time series precursors from natural archives provide a powerful means of forewarning tipping points within the Earth System.
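
    The early-warning signals referred to above are typically rising variance and lag-1 autocorrelation computed in a detrended sliding window; a minimal numpy sketch of that calculation is given below. The window length and the simple mean-removal detrending are assumptions for illustration, not recommendations from the review.

        # Sketch: sliding-window lag-1 autocorrelation and variance, the two
        # classic early-warning indicators of an approaching tipping point.
        # Window size and crude mean-removal detrending are assumptions.
        import numpy as np

        def early_warning_indicators(series, window=50):
            """Return arrays of lag-1 autocorrelation and variance per window."""
            series = np.asarray(series, dtype=float)
            ar1, var = [], []
            for start in range(len(series) - window + 1):
                w = series[start:start + window]
                w = w - w.mean()                      # crude detrending
                var.append(w.var())
                ar1.append(np.corrcoef(w[:-1], w[1:])[0, 1])
            return np.array(ar1), np.array(var)

        # A toy series whose "memory" increases towards the end should show
        # rising indicators on the approach to the transition.
        rng = np.random.default_rng(0)
        x = np.zeros(400)
        for t in range(1, 400):
            x[t] = (0.2 + 0.6 * t / 400) * x[t - 1] + rng.normal()
        ac, v = early_warning_indicators(x)
        print(ac[0], ac[-1])  # autocorrelation typically increases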

  6. 77 FR 56234 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-12

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... not be viewed or advertised as an endorsement by the National Archives and Records Administration...

  7. Not the time or the place: the missing spatio-temporal link in publicly available genetic data.

    PubMed

    Pope, Lisa C; Liggins, Libby; Keyse, Jude; Carvalho, Silvia B; Riginos, Cynthia

    2015-08-01

    Genetic data are being generated at unprecedented rates. Policies of many journals, institutions and funding bodies aim to ensure that these data are publicly archived so that published results are reproducible. Additionally, publicly archived data can be 'repurposed' to address new questions in the future. In 2011, along with other leading journals in ecology and evolution, Molecular Ecology implemented mandatory public data archiving (the Joint Data Archiving Policy). To evaluate the effect of this policy, we assessed the genetic, spatial and temporal data archived for 419 data sets from 289 articles in Molecular Ecology from 2009 to 2013. We then determined whether archived data could be used to reproduce analyses as presented in the manuscript. We found that the journal's mandatory archiving policy has had a substantial positive impact, increasing genetic data archiving from 49% (pre-2011) to 98% (2011-present). However, 31% of publicly archived genetic data sets could not be recreated based on information supplied in either the manuscript or public archives, with incomplete data or inconsistent codes linking genetic data and metadata as the primary reasons. While the majority of articles did provide some geographic information, 40% did not provide this information as geographic coordinates. Furthermore, a large proportion of articles did not contain any information regarding date of sampling (40%). Although the inclusion of spatio-temporal data does require an increase in effort, we argue that the enduring value of publicly accessible genetic data to the molecular ecology field is greatly compromised when such metadata are not archived alongside genetic data. © 2015 John Wiley & Sons Ltd.
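
    A small sketch of the kind of completeness check discussed above, verifying that an archived genetic data set carries coordinates and a sampling date, is given below; the record field names and example records are hypothetical, not those of any particular repository.

        # Sketch: screening archived data-set metadata for the spatio-temporal
        # fields whose absence the study reports (coordinates, sampling date).
        # Field names are hypothetical, not those of Dryad or GenBank.
        REQUIRED = ('latitude', 'longitude', 'sampling_date')

        def missing_fields(record: dict) -> list:
            """Return the required spatio-temporal fields that are absent or empty."""
            return [f for f in REQUIRED if not record.get(f)]

        datasets = [
            {'id': 'ds1', 'latitude': -27.5, 'longitude': 153.0,
             'sampling_date': '2010-03'},
            {'id': 'ds2', 'latitude': None, 'longitude': None, 'sampling_date': ''},
        ]
        for rec in datasets:
            gaps = missing_fields(rec)
            status = 'complete' if not gaps else 'missing: ' + ', '.join(gaps)
            print(rec['id'], status)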

  8. Current status of the international Halley Watch infrared net archive

    NASA Technical Reports Server (NTRS)

    Mcguinness, Brian B.

    1988-01-01

    The primary purposes of the Halley Watch have been to promote Halley observations, coordinate and standardize the observing where useful, and to archive the results in a database readily accessible to cometary scientists. The intention of IHW is to store the observations themselves, along with any information necessary to allow users to understand and use the data, but to exclude interpretations of these data. Each of the archives produced by the IHW will appear in two versions: a printed archive and a digital archive on CD-ROMs. The archive is expected to have a very long lifetime. The IHW has already produced an archive for P/Crommelin. This consists of one printed volume and two 1600 bpi tapes. The Halley archive will contain at least twenty gigabytes of information.

  9. The BepiColombo Archive Core System (BACS)

    NASA Astrophysics Data System (ADS)

    Macfarlane, A. J.; Osuna, P.; Pérez-López, F.; Vallejo, J. C.; Martinez, S.; Arviset, C.; Casale, M.

    2015-09-01

    BepiColombo is an interdisciplinary ESA mission to explore the planet Mercury in cooperation with JAXA. The mission consists of two separate orbiters: ESA's Mercury Planetary Orbiter (MPO) and JAXA's Mercury Magnetospheric Orbiter (MMO), which are dedicated to the detailed study of the planet and its magnetosphere. The MPO scientific payload comprises 11 instruments covering different scientific disciplines developed by several European teams. The MPO science operations will be prepared by the MPO Science Ground Segment (SGS) located at the European Space Astronomy Centre (ESAC) in Madrid. The BepiColombo Archive Core System (BACS) will be the central archive in which all mission operational data will be stored and is being developed by the Science Archives and Virtual Observatory Team (SAT) also at ESAC. The BACS will act as one of the modular subsystems within the BepiColombo Science Operations Control System (BSCS), (Vallejo 2014; Pérez-López 2014) which is under the responsibility of the SGS, with the purpose of facilitating the information exchange of data and metadata between the other subsystems of the BSCS as well as with the MPO Instrument Teams. This paper gives an overview of the concept and design of the BACS and how it integrates into the science ground segment workflow.

  10. Electronic patient record and archive of records in Cardio.net system for telecardiology.

    PubMed

    Sierdziński, Janusz; Karpiński, Grzegorz

    2003-01-01

    In modern medicine, a well-structured patient data set, fast access to it, and reporting capability have become important issues. With the dynamic development of information technology (IT), this issue is addressed by building electronic patient record (EPR) archives. These provide fast access to patient data, diagnostic and treatment protocols, etc., resulting in more efficient, better, and cheaper treatment. The aim of this work was to design a uniform Electronic Patient Record, implemented in the Cardio.net system for telecardiology, allowing cooperation among regional hospitals and reference centers. It includes questionnaires for demographic data and questionnaires supporting the doctor's work (initial diagnosis, final diagnosis, history and physical, ECG at discharge, applied treatment, additional tests, drugs, daily and periodical reports). A browser is implemented in the EPR archive to facilitate data retrieval. Several tools were used to create the EPR and the EPR archive, such as XML, PHP, JavaScript, and MySQL. A separate issue is the security of data on the WWW server, which is ensured via Secure Sockets Layer (SSL) protocols and other tools. The EPR in the Cardio.net system is a module enabling many physicians to work together and different medical centers to communicate.

  11. An open, interoperable, and scalable prehospital information technology network architecture.

    PubMed

    Landman, Adam B; Rokos, Ivan C; Burns, Kevin; Van Gelder, Carin M; Fisher, Roger M; Dunford, James V; Cone, David C; Bogucki, Sandy

    2011-01-01

    Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.

  12. A Framework to Manage Information Models

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.

    2008-05-01

    The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. The modeling information can also be exported to semantic web languages such as OWL and RDF and written to XML Metadata Interchange (XMI) files for import into UML tools.
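
    As a hedged illustration of exporting modeling information to a semantic web language, as the framework described above does, the following sketch writes a couple of model classes and one attribute to RDF/OWL with rdflib; the namespace, class, and property names are placeholders rather than the actual SPASE, PDS, or EDRN ontologies.

        # Sketch: emitting a tiny slice of an information model as RDF/OWL with
        # rdflib. The namespace and class/property names are placeholders, not
        # the actual SPASE, PDS, or EDRN ontologies.
        from rdflib import Graph, Namespace, RDF, RDFS, Literal
        from rdflib.namespace import OWL

        MODEL = Namespace('http://example.org/infomodel#')

        g = Graph()
        g.bind('model', MODEL)
        g.bind('owl', OWL)

        # Two classes and a subclass relationship from the model...
        g.add((MODEL.Product, RDF.type, OWL.Class))
        g.add((MODEL.DataProduct, RDF.type, OWL.Class))
        g.add((MODEL.DataProduct, RDFS.subClassOf, MODEL.Product))

        # ...and one attribute captured as a datatype property with documentation.
        g.add((MODEL.startTime, RDF.type, OWL.DatatypeProperty))
        g.add((MODEL.startTime, RDFS.domain, MODEL.DataProduct))
        g.add((MODEL.startTime, RDFS.comment,
               Literal('Start time of the observation, ISO 8601.')))

        print(g.serialize(format='turtle'))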

  13. Archiving and access systems for remote sensing: Chapter 6

    USGS Publications Warehouse

    Faundeen, John L.; Percivall, George; Baros, Shirley; Baumann, Peter; Becker, Peter H.; Behnke, J.; Benedict, Karl; Colaiacomo, Lucio; Di, Liping; Doescher, Chris; Dominguez, J.; Edberg, Roger; Ferguson, Mark; Foreman, Stephen; Giaretta, David; Hutchison, Vivian; Ip, Alex; James, N.L.; Khalsa, Siri Jodha S.; Lazorchak, B.; Lewis, Adam; Li, Fuqin; Lymburner, Leo; Lynnes, C.S.; Martens, Matt; Melrose, Rachel; Morris, Steve; Mueller, Norman; Navale, Vivek; Navulur, Kumar; Newman, D.J.; Oliver, Simon; Purss, Matthew; Ramapriyan, H.K.; Rew, Russ; Rosen, Michael; Savickas, John; Sixsmith, Joshua; Sohre, Tom; Thau, David; Uhlir, Paul; Wang, Lan-Wei; Young, Jeff

    2016-01-01

    Focuses on major developments inaugurated by the Committee on Earth Observation Satellites, the Group on Earth Observations System of Systems, and the International Council for Science World Data System at the global level; initiatives at national levels to create data centers (e.g. the National Aeronautics and Space Administration (NASA) Distributed Active Archive Centers and other international space agency counterparts), and non-government systems (e.g. Center for International Earth Science Information Network). Other major elements focus on emerging tool sets, requirements for metadata, data storage and refresh methods, the rise of cloud computing, and questions about what and how much data should be saved. The sub-sections of the chapter address topics relevant to the science, engineering and standards used for state-of-the-art operational and experimental systems.

  14. Mission to Planet Earth

    NASA Technical Reports Server (NTRS)

    Wilson, Gregory S.; Huntress, Wesley T.

    1990-01-01

    The rationale behind Mission to Planet Earth is presented, and the program plan is described in detail. NASA and its interagency and international partners will place satellites carrying advanced sensors in strategic earth orbits to collect multidisciplinary data. A sophisticated data system will process and archive an unprecedentedly large amount of information about the earth and how it functions as a system. Attention is given to the space observatories, the data and information systems, and the interdisciplinary research.

  15. THE PANCHROMATIC STARBURST IRREGULAR DWARF SURVEY (STARBIRDS): OBSERVATIONS AND DATA ARCHIVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McQuinn, Kristen B. W.; Mitchell, Noah P.; Skillman, Evan D., E-mail: kmcquinn@astro.umn.edu

    2015-06-22

    Understanding star formation in resolved low-mass systems requires the integration of information obtained from observations at different wavelengths. We have combined new and archival multi-wavelength observations on a set of 20 nearby starburst and post-starburst dwarf galaxies to create a data archive of calibrated, homogeneously reduced images. Named the panchromatic “STARBurst IRregular Dwarf Survey” archive, the data are publicly accessible through the Mikulski Archive for Space Telescopes. This first release of the archive includes images from the Galaxy Evolution Explorer Telescope (GALEX), the Hubble Space Telescope (HST), and the Spitzer Space Telescope (Spitzer) Multiband Imaging Photometer instrument. The data sets include flux-calibrated, background-subtracted images that are registered to the same world coordinate system. Additionally, a set of images is available that are all cropped to match the HST field of view. The GALEX and Spitzer images are available with foreground and background contamination masked. Larger GALEX images extending to 4 times the optical extent of the galaxies are also available. Finally, HST images convolved with a 5″ point spread function and rebinned to the larger pixel scale of the GALEX and Spitzer 24 μm images are provided. Future additions are planned that will include data at other wavelengths such as Spitzer IRAC, ground-based Hα, Chandra X-ray, and Green Bank Telescope H i imaging.
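
    As a rough illustration of the PSF-matching and rebinning step mentioned above (not the survey's actual pipeline), the sketch below smooths a high-resolution image with a Gaussian kernel and block-averages it onto a coarser grid; the array size, kernel width, and rebinning factor are made-up values.

```python
# Illustrative sketch only: Gaussian smoothing plus block-average rebinning,
# mimicking a PSF-matching/rebinning step. All sizes are placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter

hst_like = np.random.random((1024, 1024))   # stand-in for a calibrated image
sigma_pix = 4.0                             # kernel width in pixels (illustrative)
block = 8                                   # rebinning factor (illustrative)

smoothed = gaussian_filter(hst_like, sigma=sigma_pix)

# Block-average: reshape to (ny/block, block, nx/block, block), then average.
ny, nx = smoothed.shape
rebinned = smoothed.reshape(ny // block, block, nx // block, block).mean(axis=(1, 3))
print(rebinned.shape)  # (128, 128)
```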

  16. 47 CFR 80.1125 - Search and rescue coordinating communications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SPECIAL RADIO SERVICES STATIONS IN THE MARITIME SERVICES Global Maritime Distress and Safety System (GMDSS... Archives and Records Administration (NARA). For information on the availability of this material at NARA...

  17. On-time reliability impacts of advanced traveler information services (ATIS). Volume II, Extensions and applications of the simulated yoked study concept

    DOT National Transportation Integrated Search

    2002-03-01

    In a simulated yoke study, estimates of roadway travel times are archived from web-based Advanced Traveler Information Systems (ATIS) and used to recreate hypothetical, retrospective paired driving trials between travelers with and without ATIS. Prev...

  18. [Research and implementation of the TLS network transport security technology based on DICOM standard].

    PubMed

    Lu, Xiaoqi; Wang, Lei; Zhao, Jianfeng

    2012-02-01

    With the development of medical informatics, Picture Archiving and Communication Systems (PACS), Hospital Information Systems/Radiology Information Systems (HIS/RIS) and other medical information management systems have become widespread, and interoperation between these systems is increasingly frequent. As these closed systems are inevitably opened up and regionalized over networks, the security of information transmission becomes the first problem to be solved. Based on this need for network security, we investigated the Digital Imaging and Communications in Medicine (DICOM) Standard and the Transport Layer Security (TLS) Protocol, and implemented TLS transmission of DICOM medical information with the OpenSSL and DCMTK toolkits.
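
    As a generic illustration of the transport-security idea (this is not the paper's DCMTK/OpenSSL implementation), the sketch below wraps a TCP connection in TLS with certificate verification before any application-level DICOM messages would be exchanged; the host, port, and certificate paths are hypothetical placeholders.

```python
# Generic TLS client sketch; host, port, and certificate paths are placeholders.
import socket
import ssl

PACS_HOST = "pacs.example.org"   # hypothetical archive host
PACS_PORT = 2762                 # port commonly associated with DICOM over TLS

context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.load_verify_locations(cafile="ca.pem")                         # trust anchor
context.load_cert_chain(certfile="client.pem", keyfile="client.key")   # client identity

with socket.create_connection((PACS_HOST, PACS_PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=PACS_HOST) as tls_sock:
        print("Negotiated:", tls_sock.version(), tls_sock.cipher())
        # A DICOM association and message exchange would follow here.
```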

  19. Buckets: Aggregative, Intelligent Agents for Publishing

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.; Zubair, Mohammad

    1998-01-01

    Buckets are an aggregative, intelligent construct for publishing in digital libraries. The goal of research projects is to produce information. This information is often instantiated in several forms, differentiated by semantic types (report, software, video, datasets, etc.). A given semantic type can be further differentiated by syntactic representations as well (PostScript version, PDF version, Word version, etc.). Although the information was created together and subtle relationships can exist between them, different semantic instantiations are generally segregated along currently obsolete media boundaries. Reports are placed in report archives, software might go into a software archive, but most of the data and supporting materials are likely to be kept in informal personal archives or discarded altogether. Buckets provide an archive-independent container construct in which all related semantic and syntactic data types and objects can be logically grouped together, archived, and manipulated as a single object. Furthermore, buckets are active archival objects and can communicate with each other, people, or arbitrary network services.

  20. Digital Archive Issues from the Perspective of an Earth Science Data Producer

    NASA Technical Reports Server (NTRS)

    Barkstrom, Bruce R.

    2004-01-01

    Contents include the following: Introduction. A Producer Perspective on Earth Science Data. Data Producers as Members of a Scientific Community. Some Unique Characteristics of Scientific Data. Spatial and Temporal Sampling for Earth (or Space) Science Data. The Influence of the Data Production System Architecture. The Spatial and Temporal Structures Underlying Earth Science Data. Earth Science Data File (or Relation) Schemas. Data Producer Configuration Management Complexities. The Topology of Earth Science Data Inventories. Some Thoughts on the User Perspective. Science Data User Communities. Spatial and Temporal Structure Needs of Different Users. User Spatial Objects. Data Search Services. Inventory Search. Parameter (Keyword) Search. Metadata Searches. Documentation Search. Secondary Index Search. Print Technology and Hypertext. Inter-Data Collection Configuration Management Issues. An Archive View. Producer Data Ingest and Production. User Data Searching and Distribution. Subsetting and Supersetting. Semantic Requirements for Data Interchange. Tentative Conclusions. An Object Oriented View of Archive Information Evolution. Scientific Data Archival Issues. A Perspective on the Future of Digital Archives for Scientific Data. References Index for this paper.

  1. Development of multi-mission satellite data systems at the German Remote Sensing Data Centre

    NASA Astrophysics Data System (ADS)

    Lotz-Iwen, H. J.; Markwitz, W.; Schreier, G.

    1998-11-01

    This paper focuses on conceptual aspects of the access to multi-mission remote sensing data by online catalogue and information systems. The system ISIS of the German Remote Sensing Data Centre is described as an example of a user interface to earth observation data. ISIS has been designed to support international scientific research as well as operational applications by offering online access to the database via public networks. It provides catalogue retrieval, visualisation and transfer of image data, and is integrated in international activities dedicated to catalogue and archive interoperability. Finally, an outlook is given on international projects dealing with access to remote sensing data in distributed archives.

  2. Diagnostic report acquisition unit for the Mayo/IBM PACS project

    NASA Astrophysics Data System (ADS)

    Brooks, Everett G.; Rothman, Melvyn L.

    1991-07-01

    The Mayo Clinic and IBM Rochester have jointly developed a picture archive and control system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. One of the challenges of developing a useful PACS involves integrating the diagnostic reports with the electronic images so they can be displayed simultaneously. By the time a diagnostic report is generated for a particular case, its images have already been captured and archived by the PACS. To integrate the report with the images, the authors have developed an IBM Personal System/2 computer (PS/2) based diagnostic report acquisition unit (RAU). A typed copy of the report is transmitted via facsimile to the RAU, where it is stacked electronically with other reports that have been sent previously but not yet processed. By processing these reports at the RAU, the information they contain is integrated with the image database and a copy of the report is archived electronically on an IBM Application System/400 computer (AS/400). When a user requests a set of images for viewing, the report is automatically integrated with the image data. By using a hot key, the user can toggle the report on and off on the display screen. This report describes the process, hardware, and software employed to integrate the diagnostic report information into the PACS, including how the report images are captured, transmitted, and entered into the AS/400 database. Also described is how the archived reports and their associated medical images are located and merged for retrieval and display. The methods used to detect and process error conditions are also discussed.

  3. (abstract) Towards Ancillary Data Standards

    NASA Technical Reports Server (NTRS)

    Acton, Charles H.

    1997-01-01

    NASA's SPICE information system for archiving, distributing, and accessing spacecraft navigation, orientation, and other ancillary data is described. A proposal is made for the further evolution of this concept to an internationally useful standard, to be.

  4. 77 FR 76076 - Information Security Oversight Office; State, Local, Tribal, and Private Sector Policy Advisory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-26

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Information Security Oversight Office; State, Local, Tribal, and Private Sector Policy Advisory Committee (SLTPS-PAC) AGENCY: National Archives and Records....m. to 12:00 noon. ADDRESSES: National Archives and Records Administration, 700 Pennsylvania Avenue...

  5. Legacy system integration using web technology

    NASA Astrophysics Data System (ADS)

    Kennedy, Richard L.; Seibert, James A.; Hughes, Chris J.

    2000-05-01

    As healthcare moves towards a completely digital, multimedia environment, there is an opportunity to provide for cost-effective, highly distributed physician access to clinical information, including radiology-based imaging. In order to address this opportunity, a Universal Clinical Desktop (UCD) system was developed. A UCD provides a single point of entry into an integrated view of all types of clinical data available within a network of disparate healthcare information systems. In order to explore the application of a UCD in a hospital environment, a pilot study was established with the University of California Davis Medical Center using technology from Trilix Information Systems. Within this pilot environment, the information systems integrated under the UCD include a radiology information system (RIS), a picture archive and communication system (PACS) and a laboratory information system (LIS).

  6. ESDORA: A Data Archive Infrastructure Using Digital Object Model and Open Source Frameworks

    NASA Astrophysics Data System (ADS)

    Shrestha, Biva; Pan, Jerry; Green, Jim; Palanisamy, Giriprakash; Wei, Yaxing; Lenhardt, W.; Cook, R. Bob; Wilson, B. E.; Leggott, M.

    2011-12-01

    There is an array of challenges associated with preserving, managing, and using contemporary scientific data. Large volumes, multiple formats and data services, and the lack of a coherent mechanism for metadata/data management are some of the common issues across data centers. It is often difficult to preserve the data history and lineage information, along with other descriptive metadata, hindering the true science value of the archived data products. In this project, we use a digital object abstraction architecture as the information/knowledge framework to address these challenges. We have used the following open-source frameworks: the Fedora-Commons Repository, the Drupal Content Management System, Islandora (a Drupal module) and the Apache Solr search engine. The system is an active archive infrastructure for Earth Science data resources, which includes ingestion, archiving, distribution, and discovery functionalities. We use an ingestion workflow to ingest the data and metadata, in which many different aspects of the data descriptions (including structured and non-structured metadata) are reviewed; the data and metadata are staged during the review phase and published after multiple reviews. Each digital object is encoded in XML for long-term preservation of the content and the relations among the digital items. The software architecture provides a flexible, modularized framework for adding pluggable user-oriented functionality. Solr is used to enable word search as well as faceted search, and a home-grown spatial search module is plugged in to allow the user to make a spatial selection in a map view. An RDF semantic store within the Fedora-Commons Repository is used for storing information on data lineage, dissemination services, and text-based metadata. We use the semantic notion "isViewerFor" to register internally or externally referenced URLs, which are rendered within the same web browser when possible. With appropriate mapping of content into digital objects, many different data descriptions, including structured metadata, data history, and auditing trails, are captured and coupled with the data content. The semantic store provides a foundation for possible further uses, including providing a full-fledged Earth Science ontology for data interpretation or lineage tracking. Datasets from the NASA-sponsored Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) as well as from the Modeling and Synthesis Thematic Data Center (MAST-DC) are used in a testing deployment of the system. The testing deployment allows us to validate the features and values described here for the integrated system. Overall, we believe that the integrated system is valid, reusable data archive software that provides digital stewardship for Earth Science data content, now and in the future. References: [1] Devarakonda, Ranjeet, and Harold Shanafield. "Drupal: Collaborative framework for science research." Collaboration Technologies and Systems (CTS), 2011 International Conference on. IEEE, 2011. [2] Devarakonda, Ranjeet, et al. "Semantic search integration to climate data." Collaboration Technologies and Systems (CTS), 2014 International Conference on. IEEE, 2014.
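
    As a small illustration of the faceted discovery described above (not ESDORA's actual configuration), the sketch below issues a standard Solr query over HTTP; the Solr URL, core name, facet fields, and filter are hypothetical, while the query parameters themselves (q, fq, facet, facet.field) are ordinary Solr parameters.

```python
# Hypothetical faceted Solr query; only the parameter names are standard Solr.
import requests

SOLR_SELECT = "http://localhost:8983/solr/esdora/select"   # placeholder core URL

params = {
    "q": "soil moisture",                        # free-text word search
    "facet": "true",
    "facet.field": ["data_format", "project"],   # hypothetical facet fields
    "fq": "project:ORNL_DAAC",                   # hypothetical filter query
    "wt": "json",
    "rows": 10,
}

response = requests.get(SOLR_SELECT, params=params, timeout=30)
results = response.json()
print(results["response"]["numFound"])
print(results.get("facet_counts", {}).get("facet_fields", {}))
```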

  7. Outsourced central archiving: an information bridge in a multi-IMAC environment

    NASA Astrophysics Data System (ADS)

    Gustavsson, Staffan; Tylen, Ulf; Carlsson, Goeran; Angelhed, Jan-Erik; Wintell, Mikael; Helmersson, Roger; Norrby, Clas

    2001-08-01

    In 1998, three hospitals merged to form the Sahlgrenska University Hospital. The total radiology production became 325 000 examinations per year. Two different PACS and RIS with different and incompatible archiving solutions had been in use since 1996; one PACS was of commercial origin and the other was developed in-house. Together they managed one third of the total production. Due to differences in standards compliance and system architecture, communication between them was unsatisfactory. In order to improve efficiency, communication, and the level of service to our customers, the situation was evaluated. It was decided to build a transparent virtual radiology department based on a modular approach. A common RIS and a central DICOM image archive were chosen as the central nodes in a star-configured system. Web technology was chosen as the solution for distribution of images and reports. The reasons for these decisions, as well as the present status of the installation, are described and discussed in this paper.

  8. What Is A Picture Archiving And Communication System (PACS)?

    NASA Astrophysics Data System (ADS)

    Marceau, Carla

    1982-01-01

    A PACS is a digital system for acquiring, storing, moving and displaying picture or image information. It is an alternative to film jackets that has been made possible by recent breakthroughs in computer technology: telecommunications, local area nets and optical disks. The fundamental concept of the digital representation of image information is introduced. It is shown that freeing images from a material representation on film or paper leads to a dramatic increase in flexibility in our use of the images. The ultimate goal of a medical PACS system is a radiology department without film jackets. The inherent nature of digital images and the power of the computer allow instant free "copies" of images to be made and thrown away. These copies can be transmitted to distant sites in seconds, without the "original" ever leaving the archives of the radiology department. The result is a radiology department with much freer access to patient images and greater protection against lost or misplaced image information. Finally, images in digital form can be treated as data for the computer in image processing, which includes enhancement, reconstruction and even computer-aided analysis.

  9. Optimisation of solar synoptic observations

    NASA Astrophysics Data System (ADS)

    Klvaña, Miroslav; Sobotka, Michal; Švanda, Michal

    2012-09-01

    The development of instrumental and computer technologies is connected with steadily increasing needs for archiving large data volumes. The current trend to meet this requirement includes data compression and growth of storage capacities. This approach, however, has technical and practical limits. A further reduction of the archived data volume can be achieved by optimising the archiving, which consists of data selection without losing the useful information. We describe a method of optimised archiving of solar images, based on the selection of images that contain new information. The new information content is evaluated by means of the analysis of changes detected in the images. We present characteristics of different kinds of image changes and divide them into fictitious changes, which have a disturbing effect, and real changes, which provide new information. In block diagrams describing the selection and archiving, we demonstrate the influence of clouds, the recording of images during an active event on the Sun (including a period before the event onset), and the archiving of the long-term history of solar activity. The described optimisation technique is not suitable for helioseismology, because it does not preserve a uniform time step in the archived sequence and removes the information about solar oscillations. In the case of long-term synoptic observations, optimised archiving can save a large amount of storage capacity. The actual saving will depend on the setting of the change-detection sensitivity and on the capability to exclude the fictitious changes.
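
    A toy sketch of the selection idea is given below: a frame is archived only when it differs enough from the last archived frame. The mean-absolute-difference metric and the threshold are purely illustrative, and a real implementation would first have to reject the "fictitious" changes (clouds, exposure jumps) discussed in the abstract.

```python
# Toy change-detection selection; metric and threshold are illustrative only.
import numpy as np

def select_frames(frames, threshold=0.05):
    """Return indices of frames that carry enough new information to archive."""
    archived = [0]                          # always keep the first frame
    reference = frames[0].astype(float)
    for i, frame in enumerate(frames[1:], start=1):
        change = np.mean(np.abs(frame.astype(float) - reference))
        if change > threshold:
            archived.append(i)
            reference = frame.astype(float)  # new reference for later comparisons
    return archived

# Example with random stand-in "images"; real input would be solar observations.
frames = [np.random.random((64, 64)) for _ in range(100)]
print(select_frames(frames))
```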

  10. Glossary | STORET Legacy Data Center | US EPA

    EPA Pesticide Factsheets

    2014-06-06

    The U.S. Environmental Protection Agency (EPA) maintains two data management systems containing water quality information for the nation's waters: the Legacy Data Center (LDC), and STORET. The LDC is a static, archived database and STORET is an operational system actively being populated with water quality data.

  11. Organizations - I | STORET Legacy Data Center | US EPA

    EPA Pesticide Factsheets

    2007-05-16

    The U.S. Environmental Protection Agency (EPA) maintains two data management systems containing water quality information for the nation's waters: the Legacy Data Center (LDC), and STORET. The LDC is a static, archived database and STORET is an operational system actively being populated with water quality data.

  12. 75 FR 14142 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... result in a contrary determination. ADDRESSES: You may submit comments, identified by docket number and... Archive/Management Information System, Defense Manpower Data Center, DoD Center, Monterey Bay, 400 Gigling..., Defense Manpower Data Center, 400 Gigling Road, Seaside, CA 93955-6771.'' Notification procedure: Delete...

  13. Glossary | STORET Legacy Data Center | US EPA

    EPA Pesticide Factsheets

    2011-02-14

    The U.S. Environmental Protection Agency (EPA) maintains two data management systems containing water quality information for the nation's waters: the Legacy Data Center (LDC), and STORET. The LDC is a static, archived database and STORET is an operational system actively being populated with water quality data.

  14. Contacts | STORET Legacy Data Center | US EPA

    EPA Pesticide Factsheets

    2007-05-16

    The U.S. Environmental Protection Agency (EPA) maintains two data management systems containing water quality information for the nation's waters: the Legacy Data Center (LDC), and STORET. The LDC is a static, archived database and STORET is an operational system actively being populated with water quality data.

  15. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    PubMed

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high-quality medical services, an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multiple units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate of the integrated hospital information system is about 1.8 TByte, and the total quantity of data produced so far is about 60 TByte. Large-scale information exchange and sharing will be particularly useful for telemedicine applications.

  16. Digital Libraries and the Problem of Purpose [and] On DigiPaper and the Dissemination of Electronic Documents [and] DFAS: The Distributed Finding Aid Search System [and] Best Practices for Digital Archiving: An Information Life Cycle Approach [and] Mapping and Converting Essential Federal Geographic Data Committee (FGDC) Metadata into MARC21 and Dublin Core: Towards an Alternative to the FGDC Clearinghouse [and] Evaluating Website Modifications at the National Library of Medicine through Search Log analysis.

    ERIC Educational Resources Information Center

    Levy, David M.; Huttenlocher, Dan; Moll, Angela; Smith, MacKenzie; Hodge, Gail M.; Chandler, Adam; Foley, Dan; Hafez, Alaaeldin M.; Redalen, Aaron; Miller, Naomi

    2000-01-01

    Includes six articles focusing on the purpose of digital public libraries; encoding electronic documents through compression techniques; a distributed finding aid server; digital archiving practices in the framework of information life cycle management; converting metadata into MARC format and Dublin Core formats; and evaluating Web sites through…

  17. The preservation of LANDSAT data by the National Land Remote Sensing Archive

    NASA Technical Reports Server (NTRS)

    Boyd, John E.

    1992-01-01

    Digital data, acquired by the National Landsat Remote Sensing Program, document nearly two decades of global agricultural, environmental, and sociological change. The data were widely applied and continue to be essential to a variety of geologic, hydrologic, agronomic, and strategic programs and studies by governmental, academic, and commercial researchers. Landsat data were acquired by five observatories that use primarily two digital sensor systems. The Multispectral Scanner (MSS) was onboard all five Landsats, which have orbited over 19 years; the higher resolution Thematic Mapper (TM) sensor acquired data for the last 9 years on Landsats 4 and 5 only. The National Land Remote Sensing Archive preserves the 800,000 scenes, which total more than 60 terabytes of data, on master tapes that are steadily deteriorating. Data are stored at two locations (Sioux Falls, South Dakota and Landover, Maryland), in three archive formats. The U.S. Geological Survey's EROS Data Center has initiated a project to consolidate and convert, over the next 4 years, two of the archive formats from antiquated instrumentation tape to rotary-recorded cassette magnetic tape. The third archive format, consisting of 300,000 scenes of MSS data acquired from 1972 through 1978, will not be converted because of budgetary constraints. This data preservation project augments EDC's experience in data archiving and information management, expertise that is critical to EDC's role as a Distributed Active Archive Center for the Earth Observing System, a new and much larger national earth science program.

  18. BOOK REVIEW: Treasure-Hunting in Astronomical Plate Archives.

    NASA Astrophysics Data System (ADS)

    Kroll, Peter; La Dous, Constanze; Brauer, Hans-Juergen; Sterken, C.

    This book consists of the proceedings of a conference on the exploration of the invaluable scientific treasure present in astronomical plate archives worldwide. The book incorporates fifty scientific papers covering almost 250 pages. Several papers are particularly useful, for example an introduction to the world's large plate archives that serves as a guide for the beginning user of plate archives. It includes a very useful list of twelve major archives with many details on their advantages (completeness, number of plates, classification system and homogeneity of time coverage) and their limitations (plate quality, access, electronic catalogues, photographic services, limiting magnitudes, search software and cost to the user). Other topics cover available contemporary digitization machines, the applications of commercial flatbed scanners, technical aspects of plate consulting, astrophysical applications and astrometric uses, data reduction, data archiving and retrieval, and strategies to find astrophysically useful information on plates. The astrophysical coverage is very broad: from solar-system bodies to variable stars, sky surveys and sky patrols covering the galactic and extragalactic domain and even gravitational lensing. The book concludes with an illuminating paper on ALADIN, the reference tool for identification of astronomical sources. This work can be considered a kind of field guide, and is recommended reading for anyone who wishes to undertake small- or large-scale consulting of photographic plate material. A shortcoming of the proceedings is the fact that very few papers have abstracts. BOOK REVIEW: Treasure-Hunting in Astronomical Plate Archives. Proceedings of the international workshop held at Sonneberg Observatory, March 4-6, 1999. Peter Kroll, Constanze la Dous and Hans-Juergen Brauer (Eds.)

  19. FBIS: A regional DNA barcode archival & analysis system for Indian fishes

    PubMed Central

    Nagpure, Naresh Sahebrao; Rashid, Iliyas; Pathak, Ajey Kumar; Singh, Mahender; Singh, Shri Prakash; Sarkar, Uttam Kumar

    2012-01-01

    DNA barcoding is a new tool for taxon recognition and classification of biological organisms based on the sequence of a fragment of the mitochondrial gene cytochrome c oxidase I (COI). In view of the growing importance of fish DNA barcoding for species identification, molecular taxonomy and fish diversity conservation, we developed a Fish Barcode Information System (FBIS) for Indian fishes, which will serve as a regional DNA barcode archival and analysis system. The database presently contains 2334 sequence records of the COI gene for 472 aquatic species belonging to 39 orders and 136 families, collected from available published data sources. Additionally, it contains information on the phenotype, distribution and IUCN Red List status of fishes. The web version of FBIS was designed using MySQL, Perl and PHP under the Linux operating platform to (a) store and manage the acquisition, (b) analyze and explore DNA barcode records, and (c) identify species and estimate genetic divergence. FBIS has also been integrated with appropriate tools for retrieving and viewing information about database statistics and taxonomy. It is expected that FBIS will be useful as a potent information system in fish molecular taxonomy, phylogeny and genomics. Availability: The database is available for free at http://mail.nbfgr.res.in/fbis/ PMID:22715304
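
    As a conceptual illustration of the divergence estimate mentioned above (FBIS itself is implemented in Perl and PHP), the sketch below computes an uncorrected p-distance, the fraction of aligned sites that differ, between two made-up COI fragments.

```python
# Conceptual sketch: uncorrected p-distance between two aligned sequences.
# The example sequences are placeholders, not real barcode records.
def p_distance(seq_a: str, seq_b: str) -> float:
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    pairs = [(a, b) for a, b in zip(seq_a.upper(), seq_b.upper())
             if a != "-" and b != "-"]               # ignore alignment gaps
    mismatches = sum(1 for a, b in pairs if a != b)
    return mismatches / len(pairs)

seq1 = "ATGGCATTCCTACGAATGCTATTAAT"
seq2 = "ATGGCGTTCCTACGTATGTTATTAAT"
print(f"p-distance: {p_distance(seq1, seq2):.3f}")
```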

  20. Stewardship of NASA's Earth Science Data and Ensuring Long-Term Active Archives

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.; Behnke, Jeanne

    2016-01-01

    Since the beginning of the Earth Observing System (EOS) Program, NASA has followed an open data policy, with non-discriminatory access to data and no period of exclusive access. NASA has well-established processes for assigning and/or accepting datasets into one of the 12 Distributed Active Archive Centers (DAACs) that are part of EOSDIS. EOSDIS has been evolving through several information technology cycles, adapting to hardware and software changes in the commercial sector. NASA is responsible for maintaining Earth science data as long as users are interested in using them for research and applications, which is well beyond the life of the data-gathering missions. For science data to remain useful over long periods of time, steps must be taken to preserve: (1) data bits with no corruption, (2) discoverability and access, (3) readability, (4) understandability, (5) usability, and (6) reproducibility of results. NASA's Earth Science Data and Information System (ESDIS) Project, along with the 12 EOSDIS Distributed Active Archive Centers (DAACs), has made significant progress in each of these areas over the last decade, and continues to evolve its active archive capabilities. Particular attention has been paid in recent years to ensuring that the datasets are published in an easily accessible and citable manner through a unified metadata model, a common metadata repository (CMR), a coherent view through the earthdata.gov website, and the assignment of Digital Object Identifiers (DOIs) with well-designed product information landing pages.
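
    As an illustration of the unified discovery path mentioned above, the sketch below queries the public CMR search endpoint; the URL, parameter names, and response fields are given as commonly documented and should be verified against current Earthdata documentation.

```python
# Illustrative CMR collection search; verify the endpoint and fields before use.
import requests

CMR_COLLECTIONS = "https://cmr.earthdata.nasa.gov/search/collections.json"

params = {
    "keyword": "sea surface temperature",
    "page_size": 5,
}

resp = requests.get(CMR_COLLECTIONS, params=params, timeout=30)
resp.raise_for_status()
for entry in resp.json()["feed"]["entry"]:
    # Field names reflect the Atom-style JSON response as commonly documented.
    print(entry.get("id"), entry.get("title"))
```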

  1. High rate information systems - Architectural trends in support of the interdisciplinary investigator

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Preheim, Larry E.

    1990-01-01

    Data systems requirements in the Earth Observing System (EOS) and Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived to be a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on the information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory and collaboration services.

  2. National Space Science Data Center Information Model

    NASA Astrophysics Data System (ADS)

    Bell, E. V.; McCaslin, P.; Grayzeck, E.; McLaughlin, S. A.; Kodis, J. M.; Morgan, T. H.; Williams, D. R.; Russell, J. L.

    2013-12-01

    The National Space Science Data Center (NSSDC) was established by NASA in 1964 to provide for the preservation and dissemination of scientific data from NASA missions. It has evolved to support distributed, active archives that were established in the Planetary, Astrophysics, and Heliophysics disciplines through a series of Memoranda of Understanding. The disciplines took over responsibility for working with new projects to acquire and distribute data for community researchers while the NSSDC remained vital as a deep archive. Since 2000, NSSDC has been using the Archive Information Package to preserve data over the long term. As part of its effort to streamline the ingest of data into the deep archive, the NSSDC developed and implemented a data model of desired and required metadata in XML. This process, in use for roughly five years now, has been successfully used to support the identification and ingest of data into the NSSDC archive, most notably those data from the Planetary Data System (PDS) submitted under PDS3. A series of software packages (X-ware) were developed to handle the submission of data from the PDS nodes utilizing a volume structure. An XML submission manifest is generated at the PDS provider site prior to delivery to NSSDC. The manifest ensures the fidelity of PDS data delivered to NSSDC. Preservation metadata is captured in an XML object when NSSDC archives the data. With the recent adoption by the PDS of the XML-based PDS4 data model, there is an opportunity for the NSSDC to provide additional services to the PDS such as the preservation, tracking, and restoration of individual products (e.g., a specific data file or document), which was unfeasible in the previous PDS3 system. The NSSDC is modifying and further streamlining its data ingest process to take advantage of the PDS4 model, an important consideration given the ever-increasing amount of data being generated and archived by orbiting missions at the Moon and Mars, other active projects such as BRRISON, LADEE, MAVEN, INSIGHT, OSIRIS-REX and ground-based observatories. Streamlining the ingest process also benefits the continued processing of PDS3 data. We will report on our progress and status.
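
    In the spirit of the XML submission manifest described above, the sketch below generates a simple delivery manifest with per-file sizes and MD5 checksums; the element names are invented for illustration and do not reflect the actual NSSDC schema.

```python
# Minimal manifest-generation sketch; element names are hypothetical.
import hashlib
import xml.etree.ElementTree as ET
from pathlib import Path

def build_manifest(volume_dir: str) -> ET.ElementTree:
    root = ET.Element("submissionManifest")
    for path in sorted(Path(volume_dir).rglob("*")):
        if not path.is_file():
            continue
        digest = hashlib.md5(path.read_bytes()).hexdigest()
        entry = ET.SubElement(root, "file")
        ET.SubElement(entry, "name").text = str(path.relative_to(volume_dir))
        ET.SubElement(entry, "size").text = str(path.stat().st_size)
        ET.SubElement(entry, "md5").text = digest
    return ET.ElementTree(root)

# Usage (hypothetical path):
# build_manifest("/data/volume_001").write("manifest.xml", encoding="utf-8",
#                                          xml_declaration=True)
```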

  3. JPEG 2000 in advanced ground station architectures

    NASA Astrophysics Data System (ADS)

    Chien, Alan T.; Brower, Bernard V.; Rajan, Sreekanth D.

    2000-11-01

    The integration and management of information from distributed and heterogeneous information producers and providers must be a key foundation of any developing imagery intelligence system. Historically, imagery providers acted as production agencies for imagery, imagery intelligence, and geospatial information. In the future, these imagery producers will be evolving to act more like e-business information brokers. The management of imagery and geospatial information, whether visible, spectral, infrared (IR), radar, elevation, or other feature and foundation data, is crucial from a quality and content perspective. By 2005, there will be significantly advanced collection systems and a myriad of storage devices. There will also be a number of automated and man-in-the-loop correlation, fusion, and exploitation capabilities. All of these new imagery collection and storage systems will result in a higher volume and greater variety of imagery being disseminated and archived in the future. This paper illustrates the importance, from a collection, storage, exploitation, and dissemination perspective, of the proper selection and implementation of standards-based compression technology for ground station and dissemination/archive networks. It specifically discusses the new compression capabilities featured in JPEG 2000 and how that commercially based technology can provide significant improvements to the overall imagery and geospatial enterprise, both from an architectural perspective as well as from a user's perspective.

  4. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    PubMed

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
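
    Since the abstract describes an OMEX file as a ZIP container with a manifest listing its content, a minimal assembly sketch is shown below; the manifest follows the general shape of the OMEX manifest, but the format identifiers should be checked against the COMBINE specifications.

```python
# Minimal sketch of assembling an OMEX-style ZIP container with a manifest.
# Format identifier URLs follow the usual OMEX pattern but should be verified.
import zipfile

manifest = """<?xml version="1.0" encoding="UTF-8"?>
<omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
  <content location="." format="http://identifiers.org/combine.specifications/omex"/>
  <content location="./model.xml"
           format="http://identifiers.org/combine.specifications/sbml"/>
</omexManifest>
"""

model_xml = "<sbml><!-- placeholder model content --></sbml>"

with zipfile.ZipFile("experiment.omex", "w", zipfile.ZIP_DEFLATED) as omex:
    omex.writestr("manifest.xml", manifest)
    omex.writestr("model.xml", model_xml)
```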

  5. Why Digital Data Collections Are Important

    ERIC Educational Resources Information Center

    Mitchell, Erik T.

    2012-01-01

    The silo is a well-worn metaphor in information systems used to illustrate separateness, isolation, and lack of connectivity. Through the many iterations of system development, libraries, archives, and museums (LAMs) have sought to avoid silos and find the sweet spot between interface design and metadata interoperability. This effort is being…

  6. Applied Information Systems Research Program (AISRP) Workshop 3 meeting proceedings

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The third Workshop of the Applied Information Systems Research Program (AISRP) met at the University of Colorado's Laboratory for Atmospheric and Space Physics in August of 1993. The presentations were organized into four sessions: Artificial Intelligence Techniques; Scientific Visualization; Data Management and Archiving; and Research and Technology.

  7. Legacy STORET Level 5 | STORET Legacy Data Center | US ...

    EPA Pesticide Factsheets

    2007-05-16

    The U.S. Environmental Protection Agency (EPA) maintains two data management systems containing water quality information for the nation's waters: the Legacy Data Center (LDC), and STORET. The LDC is a static, archived database and STORET is an operational system actively being populated with water quality data.

  8. Physician Perceptions of Singleview: A Picture Archiving and Communications System (PACS) Federation Solution

    ERIC Educational Resources Information Center

    Kolowitz, Brian J.

    2012-01-01

    Information Technology is changing the face of medicine. Prior research has shown many physicians believe access to the complete Personal Health Record (PHR) would be beneficial to patient care. Many times these medical records are distributed across system and organizational boundaries. International standards committees, healthcare…

  9. On-line access to remote sensing data with the satellite-data information system (ISIS)

    NASA Astrophysics Data System (ADS)

    Strunz, G.; Lotz-Iwen, H.-J.

    1994-08-01

    The German Remote Sensing Data Center (DFD) is developing the satellite-data information system ISIS as the central interface for users to access Earth observation data. ISIS has been designed to support international scientific research as well as operational applications by offering online database access via public networks, and is integrated into international activities dedicated to catalogue and archive interoperability. A prototype of ISIS is already in use within the German Processing and Archiving Facility for ERS-1 for the storage and retrieval of digital SAR quicklook products and for the Radarmap of Germany. An operational status of the system is envisaged for the launch of ERS-2. This paper describes the underlying concepts of ISIS and the current state of its realization. It explains the overall structure of the system and the functionality of each of its components. Emphasis is put on the description of the advisory system, the catalogue retrieval, and the online access and transfer of image data. Finally, the integration into a future global environmental data network is outlined.

  10. Archival Services and Technologies for Scientific Data

    NASA Astrophysics Data System (ADS)

    Meyer, Jörg; Hardt, Marcus; Streit, Achim; van Wezel, Jos

    2014-06-01

    After analysis and publication, there is no need to keep experimental data online on spinning disks. For reliability and cost reasons, inactive data are moved to tape and put into a data archive. The data archive must provide reliable access for at least ten years, following a recommendation of the German Science Foundation (DFG), but many scientific communities wish to keep data available much longer. Data archival is, on the one hand, purely a bit-preservation activity to ensure that the bits read are the same as those written years before. On the other hand, enough information must be archived to be able to use and interpret the content of the data. The latter depends on many, often community-specific, factors and remains an area of much debate among archival specialists. The paper describes the current practice of archival and bit preservation in use for different science communities at KIT, for which a combination of organizational services and technical tools is required. The special monitoring to detect tape-related errors, the software infrastructure in use, and the service certification are discussed. Plans and developments at KIT, also in the context of the Large Scale Data Management and Analysis (LSDMA) project, are presented. The technical advantages of the T10 SCSI Stream Commands (SSC-4) and the Linear Tape File System (LTFS) will have a profound impact on future long-term archival of large data sets.
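
    In the spirit of the bit-preservation activity described above (this is not KIT's actual tooling), the sketch below recomputes SHA-256 digests for archived files and compares them against a previously stored manifest of "path digest" lines.

```python
# Illustrative fixity check; hash choice and manifest layout are assumptions.
import hashlib
from pathlib import Path

def sha256(path: Path, chunk: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify(manifest_file: str, base_dir: str) -> list[str]:
    """Return files whose current digest differs from the stored manifest."""
    damaged = []
    for line in Path(manifest_file).read_text().splitlines():
        rel_path, expected = line.split()
        if sha256(Path(base_dir) / rel_path) != expected:
            damaged.append(rel_path)
    return damaged

# Usage (hypothetical paths): print(verify("archive.sha256", "/archive"))
```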

  11. An Assessment of a Science Discipline Archive Against ISO 16363

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Downs, R. R.

    2016-12-01

    The Planetary Data System (PDS) is a federation of science discipline nodes formed in response to the findings of the Committee on Data Management and Computing (CODMAC 1986) that a "wealth of science data would ultimately cease to be useful and probably lost if a process was not developed to ensure that the science data were properly archived." Starting operations in 1990, the stated mission of the PDS is to "facilitate achievement of NASA's planetary science goals by efficiently collecting, archiving, and making accessible digital data and documentation produced by or relevant to NASA's planetary missions, research programs, and data analysis programs." In 2008, the PDS initiated a transition to a more modern system based on key principles found in the Open Archival Information System (OAIS) Reference Model (ISO 14721), a set of functional requirements provided by the designated community, and about twenty years of lessons learned. With science digital data now being archived under the new PDS4, the PDS is a good use case to be assessed as a trusted repository against ISO 16363, a recommended practice for assessing the trustworthiness of digital repositories. This presentation will summarize the OAIS principles adopted for PDS4 and the findings of a desk assessment of the PDS against ISO 16363. Also presented will be specific items of evidence, for example the PDS mission statement above, and how they impact the level of certainty that the ISO 16363 metrics are being met.

  12. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06FSH01 offshore of Siesta Key, Florida, May 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.

    2007-01-01

    In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
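
    Separately from the SU scripts mentioned above, a quick way to inspect such a file is to decode the 3200-byte textual header at the start of a SEG-Y file; the standard lays it out as 40 card images of 80 characters in EBCDIC, although some files use plain ASCII. The file name below is a hypothetical placeholder.

```python
# Quick-look sketch: decode the SEG-Y textual header (40 cards x 80 characters).
def read_textual_header(segy_path: str) -> str:
    with open(segy_path, "rb") as f:
        raw = f.read(3200)
    # Standard SEG-Y uses EBCDIC; if the first byte is already ASCII 'C',
    # assume an ASCII-encoded header instead.
    if raw[:1] == b"C":
        text = raw.decode("ascii", errors="replace")
    else:
        text = raw.decode("cp037")                   # EBCDIC code page
    return "\n".join(text[i:i + 80] for i in range(0, 3200, 80))

print(read_textual_header("06FSH01_line01.sgy"))     # hypothetical file name
```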

  13. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06SCC01 offshore of Isles Dernieres, Louisiana, June 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.

    2007-01-01

    In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic UNIX (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  14. Advanced digital image archival system using MPEG technologies

    NASA Astrophysics Data System (ADS)

    Chang, Wo

    2009-08-01

    Digital information and records are vital to the human race regardless of the nationalities and eras in which they were produced. Digital image contents are produced at a rapid pace from cultural heritages via digitalization, scientific and experimental data via high speed imaging sensors, national defense satellite images from governments, medical and healthcare imaging records from hospitals, personal collection of photos from digital cameras. With these mass amounts of precious and irreplaceable data and knowledge, what standards technologies can be applied to preserve and yet provide an interoperable framework for accessing the data across varieties of systems and devices? This paper presents an advanced digital image archival system by applying the international standard of MPEG technologies to preserve digital image content.

  15. Analysis of the access patterns at GSFC distributed active archive center

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore; Bedet, Jean-Jacques

    1996-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational for more than two years. Its mission is to support existing and pre-Earth Observing System (EOS) Earth science datasets, facilitate scientific research, and test Earth Observing System Data and Information System (EOSDIS) concepts. Over 550,000 files and documents have been archived, and more than six terabytes have been distributed to the scientific community. Information about user requests and file access patterns, and their impact on system loading, is needed to optimize current operations and to plan for future archives. To facilitate the management of daily activities, the GSFC DAAC has developed a database system to track correspondence, requests, ingestion and distribution. In addition, several log files which record transactions on Unitree are maintained and periodically examined. This study identifies some of the user request and file access patterns at the GSFC DAAC during 1995. The analysis is limited to the subset of orders for which the data files are under the control of the Hierarchical Storage Management (HSM) system Unitree. The results show that most of the data volume ordered was for two data products. The volume was mostly made up of level 3 and 4 data, and most of it was distributed on 8 mm and 4 mm tapes. In addition, most of the volume ordered was for deliveries in North America, although there was significant worldwide use. There was a wide range of request sizes in terms of volume and number of files ordered. On average, 78.6 files were ordered per request. Using the data managed by Unitree, several caching algorithms have been evaluated for both hit rate and the overhead ('cost') associated with the movement of data from near-line devices to disks. The algorithm called LRU/2 bin was found to be the best for this workload, but the STbin algorithm also worked well.
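
    As a simple illustration of the kind of cache evaluation described above, the sketch below replays a file-request trace through a plain LRU cache and reports the hit rate; this is ordinary LRU, not the "LRU/2 bin" or "STbin" variants from the study, and the trace is made up.

```python
# Plain LRU cache simulation over a request trace; reports the hit rate.
from collections import OrderedDict

def lru_hit_rate(trace, capacity):
    cache = OrderedDict()                    # file id -> None, ordered by recency
    hits = 0
    for file_id in trace:
        if file_id in cache:
            hits += 1
            cache.move_to_end(file_id)       # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)    # evict the least recently used entry
            cache[file_id] = None
    return hits / len(trace)

# Hypothetical trace: repeated requests for a small set of popular files.
trace = [1, 2, 3, 1, 2, 4, 1, 5, 2, 1, 3, 2]
print(f"hit rate: {lru_hit_rate(trace, capacity=3):.2f}")
```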

  16. Multifunction display system, volume 1

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The design and construction of a multifunction display man/machine interface for use with a 4 pi IBM-360 System are described. The system is capable of displaying superimposed volatile alphanumeric and graphical data on a 512 x 512 element plasma panel, and holographically stored multicolor archival information. The volatile data may be entered from a keyboard or by means of an I/O interface to the 360 system. A 2-page memory local to the display is provided for storing the entered data. The archival data is stored as a phase hologram on a vinyl tape strip. This data is accessible by means of a rapid transport system which responds to inputs provided by the I/O channel on the keyboard. As many as 500 frames may be stored on a tape strip for access in under 6 seconds.

  17. NASA's EOSDIS Cumulus: Ingesting, Archiving, Managing, and Distributing from Commercial Cloud

    NASA Astrophysics Data System (ADS)

    Baynes, K.; Ramachandran, R.; Pilone, D.; Quinn, P.; Schuler, I.; Gilman, J.; Jazayeri, A.

    2017-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been working towards a vision of a cloud-based, highly flexible ingest, archive, management, and distribution system for its ever-growing and evolving data holdings. This system, Cumulus, is emerging from its prototyping stages and is poised to make a huge impact on how NASA manages and disseminates its Earth science data. This talk will outline the motivation for this work, present the achievements and hurdles of the past 18 months, and chart a course for the future expansion of Cumulus. We will explore not just the technical but also the socio-technical challenges that we face in evolving a system of this magnitude into the cloud, and how we are rising to meet those challenges through open collaboration and intentional stakeholder engagement.

  18. Department of Defense picture archiving and communication system acceptance testing: results and identification of problem components.

    PubMed

    Allison, Scott A; Sweet, Clifford F; Beall, Douglas P; Lewis, Thomas E; Monroe, Thomas

    2005-09-01

    The PACS implementation process is complicated, requiring a tremendous amount of time, resources, and planning. The Department of Defense (DOD) has significant experience in developing and refining PACS acceptance testing (AT) protocols that assure contract compliance, clinical safety, and functionality. The DOD's AT experience under the initial Medical Diagnostic Imaging Support System contract led to the current Digital Imaging Network-Picture Archiving and Communications Systems (DIN-PACS) contract AT protocol. To identify the most common system and component deficiencies under the current DIN-PACS AT protocol, 14 tri-service sites were evaluated during 1998-2000. Sixteen system deficiency citations with 154 separate types of limitations were noted, with problems involving the workstation, interfaces, and the Radiology Information System comprising more than 50% of the citations. Larger PACS deployments were associated with a higher number of deficiencies. The most commonly cited system deficiencies were among the most expensive components of the PACS.

  19. The challenge of a data storage hierarchy

    NASA Technical Reports Server (NTRS)

    Ruderman, Michael

    1992-01-01

    A discussion of Mesa Archival Systems' data archiving system is presented. This data archiving system is strictly a software system that is implemented on a mainframe and manages the data into permanent file storage. Emphasis is placed on the fact that any kind of client system on the network can be connected through the Unix interface of the data archiving system.

  20. Wallops Ship Surveillance System

    NASA Technical Reports Server (NTRS)

    Smith, Donna C.

    2011-01-01

    Approved as a Wallops control center backup system, the Wallops Ship Surveillance Software is a day-of-launch risk analysis tool for spaceport activities. The system calculates impact probabilities and displays ship locations relative to boundary lines. It enables rapid analysis of possible flight paths to preclude the need to cancel launches and allow execution of launches in a timely manner. Its design is based on low-cost, large-customer-base elements, including personal computers, the Windows operating system, C/C++ object-oriented software, and network interfaces. In conformance with the NASA software safety standard, the system is designed to ensure that it does not falsely report a safe-for-launch condition. To improve the current ship surveillance method, the system is designed to prevent delay of launch under a safe-for-launch condition. A single workstation is designated the controller of the official ship information and the official risk analysis. Copies of this information are shared with other networked workstations. The program design is divided into five subsystem areas: 1. Communication Link -- threads that control the networking of workstations; 2. Contact List -- a thread that controls a list of protected item (ocean vessel) information; 3. Hazard List -- threads that control a list of hazardous item (debris) information and associated risk calculation information; 4. Display -- threads that control operator inputs and screen display outputs; and 5. Archive -- a thread that controls archive file read and write access. Currently, most of the hazard list thread and parts of other threads are being reused as part of a new ship surveillance system, under the SureTrak project.

  1. NASA's Earth Observing Data and Information System

    NASA Technical Reports Server (NTRS)

    Mitchell, Andrew E.; Behnke, Jeanne; Lowe, Dawn; Ramapriyan, H. K.

    2009-01-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been a central component of NASA's Earth observation program for over 10 years. It is one of the largest civilian science information systems in the US, performing ingest, archive and distribution of over 3 terabytes of data per day, much of which is from NASA's flagship missions Terra, Aqua and Aura. The system supports a variety of science disciplines including polar processes, land cover change, radiation budget, and most especially global climate change. The EOSDIS data centers, collocated with centers of science discipline expertise, archive and distribute standard data products produced by science investigator-led processing systems. Key to the success of EOSDIS is the concept of core versus community requirements. EOSDIS supports a core set of services to meet specific NASA needs and relies on community-developed services to meet specific user needs. EOSDIS offers a metadata registry, ECHO (Earth Observing System Clearinghouse), through which the scientific community can easily discover and exchange NASA's Earth science data and services. Users can search, manage, and access the contents of ECHO's registries (data and services) through user-developed and community-tailored interfaces or clients. The ECHO framework has become the primary access point for cross-Data Center search-and-order of EOSDIS and other Earth Science data holdings archived at the EOSDIS data centers. ECHO's Warehouse Inventory Search Tool (WIST) is the primary web-based client for discovering and ordering cross-discipline data from the EOSDIS data centers. The architecture of the EOSDIS provides a platform for the publication, discovery, understanding, and access to NASA's Earth observation resources and allows for easy integration of new datasets. The EOSDIS also has developed several methods for incorporating socioeconomic data into its data collection. Over the years, we have developed several methods for determining the needs of the user community, including use of the American Customer Satisfaction Index and a broad metrics program.

  2. Presence Personalization and Persistence: A New Approach to Building Archives to Support Collaborative Research

    NASA Technical Reports Server (NTRS)

    McGlynn, Thomas A.

    2008-01-01

    We discuss approaches to building archives that support the way most science is done. Today, research is done in formal teams and informal groups, yet our on-line services are designed to work with a single user. We have begun prototyping a new approach to building archives in which support for collaborative research is built in from the start. We organize the discussion along three elements that we believe to be necessary for effective support. We must enable user presence in the archive environment: users must be able to interact. Users must be able to personalize the environment, adding data and capabilities useful to themselves and their team. These changes must be persistent: subsequent sessions must be able to build upon previous sessions. In building the archive we see the large multi-player interactive games as a paradigm of how this approach can work. These three 'P's are essential in gaming as well, and we shall use insights from the gaming world and virtual reality systems like Second Life in our prototype.

  3. NASA's Earth Observing System Data and Information System - EOSDIS

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.

    2011-01-01

    This slide presentation reviews the work of NASA's Earth Observing System Data and Information System (EOSDIS), a petabyte-scale archive of environmental data that supports global climate change research. The Earth Science Data Systems provide end-to-end capabilities to deliver data and information products to users in support of understanding the Earth system. The presentation contains photographs from space of recent events (e.g., the effects of the tsunami in Japan and the wildfires in Australia). It also includes details of the Data Centers that provide the data to EOSDIS and the Science Investigator-led Processing Systems. Information about the Land, Atmosphere Near-real-time Capability for EOS (LANCE) and some of the uses that the system has made possible are reviewed. Also included is information about how to access the data, and evolutionary plans for the future of the system.

  4. PACS/information systems interoperability using Enterprise Communication Framework.

    PubMed

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  5. Making geospatial data in ASF archive readily accessible

    NASA Astrophysics Data System (ADS)

    Gens, R.; Hogenson, K.; Wolf, V. G.; Drew, L.; Stern, T.; Stoner, M.; Shapran, M.

    2015-12-01

    The way geospatial data is searched, managed, processed, and used has changed significantly in recent years. A data archive such as the one at the Alaska Satellite Facility (ASF), one of NASA's twelve interlinked Distributed Active Archive Centers (DAACs), used to be searched solely via user interfaces that were specifically developed for its particular archive and data sets. ASF then moved to using an application programming interface (API) that defined a set of routines, protocols, and tools for distributing the geospatial information stored in the database in real time. This provided more flexible access to the geospatial data, yet it was up to the user to develop the tools needed for more tailored access to the data. We present two new approaches for serving data to users. In response to the recent Nepal earthquake we developed a data feed for distributing ESA's Sentinel data. Users can subscribe to the data feed and are provided with the relevant metadata the moment a new data set is available for download. The second approach is an Open Geospatial Consortium (OGC) web feature service (WFS). The WFS hosts the metadata along with a direct link from which the data can be downloaded. It uses the open-source GeoServer software (Youngblood and Iacovella, 2013) and provides an interface for including the geospatial information in the archive directly in the user's geographic information system (GIS) as an additional data layer. Both services run on top of a geospatial PostGIS database, an open-source geographic extension for the PostgreSQL object-relational database (Marquez, 2015). Marquez, A., 2015. PostGIS Essentials. Packt Publishing, 198 p. Youngblood, B. and Iacovella, S., 2013. GeoServer Beginner's Guide. Packt Publishing, 350 p.
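
    The WFS approach described above lends itself to simple scripted access. The following Python sketch, a rough illustration rather than ASF's actual client, issues a standard OGC WFS 2.0.0 GetFeature request and reads the returned GeoJSON features; the endpoint URL, feature type name, and property names are hypothetical placeholders.

    # Minimal sketch of querying an OGC WFS endpoint for archive metadata and
    # pulling out a download link for each feature. The host, feature type, and
    # property names are invented for illustration.
    import requests

    WFS_URL = "https://example.org/geoserver/wfs"      # hypothetical endpoint

    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": "archive:granule_metadata",       # hypothetical feature type
        "outputFormat": "application/json",
        "count": 100,
    }

    response = requests.get(WFS_URL, params=params, timeout=60)
    response.raise_for_status()

    for feature in response.json().get("features", []):
        props = feature["properties"]
        # Each feature is expected to carry a direct download link alongside its
        # geospatial footprint (field names are assumptions).
        print(props.get("granule_name"), props.get("download_url"))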

  6. Tackling the Four V's with NEXUS

    NASA Astrophysics Data System (ADS)

    Greguska, F. R., III; Gill, K. M.; Huang, T.; Jacob, J. C.; Quach, N.; Wilson, B. D.

    2016-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) reports that over 15 petabytes (PB) of Earth observing information are archived among the 12 NASA Distributed Active Archive Centers (DAACs), with more being archived daily. The upcoming Surface Water & Ocean Topography (SWOT) mission is expected to generate about 26 PB of data in 3 years. NEXUS is a state-of-the-art deep data analytic program developed at the Jet Propulsion Laboratory with the goal of providing near real-time analytic capabilities for this vast trove of data. Rather than develop analytic services on traditional file archives, NEXUS organizes data into tiles in order to provide a platform for horizontal computing. To provide near real-time analytic solutions for missions such as SWOT, a highly scalable data ingestion solution has been developed to quickly bring data into NEXUS. In order to accomplish this formidable challenge, the "Four V's" (Volume, Velocity, Veracity, and Variety) of Big Data must be considered. NEXUS consists of an ingestion subsystem that handles the Volume of data by utilizing a generic tiling strategy that subsets a given dataset into smaller tiles. These tiles are then indexed by a search engine and stored in a NoSQL database for fast retrieval. In addition to handling the Volume of data being indexed, the NEXUS ingestion subsystem is built for horizontal scalability in order to manage the Velocity of incoming data. As the load on the system increases, the components of the ingestion subsystem can be scaled to provide more capacity. During ingestion, NEXUS also takes a unique approach to the Veracity and Variety of Earth observing information being ingested. By allowing the processing and tiling mechanisms to be customized for each dataset, the NEXUS ingest system can discard erroneous or missing data as well as adapt to the many different data structures and file formats that can be found in satellite observation data. This talk will focus on the functionality and architecture of the data ingestion subsystem that is part of the NEXUS software architecture and how it relates to the Four V's of Big Data.
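
    As a rough illustration of the tiling idea described in this abstract, the sketch below cuts a gridded variable into fixed-size tiles and attaches bounding-box metadata to each one so it could be indexed and stored independently; it is a generic example, not the actual NEXUS code, and the tile size and field names are assumptions.

    # Generic tiling sketch: subset a 2-D geophysical grid into square tiles,
    # discarding empty tiles and keeping spatial bounds for indexing.
    import numpy as np

    def tile_grid(values, lats, lons, tile_size=100):
        """Yield (metadata, data) pairs for square tiles cut from a 2-D grid."""
        n_lat, n_lon = values.shape
        for i in range(0, n_lat, tile_size):
            for j in range(0, n_lon, tile_size):
                data = values[i:i + tile_size, j:j + tile_size]
                if np.all(np.isnan(data)):
                    continue  # skip tiles with no valid observations
                meta = {
                    "lat_min": float(lats[i]),
                    "lat_max": float(lats[min(i + tile_size, n_lat) - 1]),
                    "lon_min": float(lons[j]),
                    "lon_max": float(lons[min(j + tile_size, n_lon) - 1]),
                }
                yield meta, data

    # Synthetic data standing in for a satellite granule.
    values = np.random.rand(400, 800)
    lats = np.linspace(-90, 90, 400)
    lons = np.linspace(-180, 180, 800)
    print(len(list(tile_grid(values, lats, lons))), "tiles produced")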

  7. Uniforming information management in Finnish Social Welfare.

    PubMed

    Laaksonen, Maarit; Kärki, Jarmo; Ailio, Erja

    2012-01-01

    This paper describes the phases and methods used in the National project for IT in Social Services in Finland (Tikesos). The main goals of Tikesos were to unify the client information systems in social services, to develop electronic documentation, and to produce specifications for a nationally organized electronic archive. The method of Enterprise Architecture was largely used in the project.

  8. Scalable Data Mining and Archiving for the Square Kilometre Array

    NASA Astrophysics Data System (ADS)

    Jones, D. L.; Mattmann, C. A.; Hart, A. F.; Lazio, J.; Bennett, T.; Wagstaff, K. L.; Thompson, D. R.; Preston, R.

    2011-12-01

    As the technologies for remote observation improve, the rapid increase in the frequency and fidelity of those observations translates into an avalanche of data that is already beginning to eclipse the resources, both human and technical, of the institutions and facilities charged with managing the information. Common data management tasks like cataloging both the data itself and contextual metadata, creating and maintaining a scalable permanent archive, and making data available on demand for research present significant software engineering challenges when considered at the scales of modern multi-national scientific enterprises such as the upcoming Square Kilometre Array project. The NASA Jet Propulsion Laboratory (JPL), leveraging internal research and technology development funding, has begun to explore ways to address the data archiving and distribution challenges with a number of parallel activities involving collaborations with the EVLA and ALMA teams at the National Radio Astronomy Observatory (NRAO), and members of the Square Kilometre Array South Africa team. To date, we have leveraged the Apache OODT Process Control System framework and its catalog and archive service components, which provide file management, workflow management, and resource management as core web services. A client crawler framework ingests upstream data (e.g., EVLA raw directory output), identifies its MIME type, and automatically extracts relevant metadata including temporal bounds and job-relevant processing information. A remote content acquisition (push/pull) service is responsible for staging remote content and handing it off to the crawler framework. A science algorithm wrapper (called CAS-PGE) wraps underlying code, including CASApy programs for the EVLA such as Continuum Imaging and Spectral Line Cube generation, executes the algorithm, and ingests its output (along with relevant extracted metadata). In addition to processing, the Process Control System has been leveraged to provide data curation and automatic ingestion for the MeerKAT/KAT-7 precursor instrument in South Africa, helping to catalog and archive correlator and sensor output from KAT-7, and to make the information available for downstream science analysis. These efforts, supported by the increasing availability of high-quality open source software, represent a concerted effort to seek a cost-conscious methodology for maintaining the integrity of observational data from the upstream instrument to the archive, and at the same time ensuring that the data, with its richly annotated catalog of metadata, remains a viable resource for research into the future.
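
    The crawler-and-extractor pattern described here can be illustrated with a short sketch. The Python below is not the Apache OODT API; it simply walks a staging directory, guesses each file's MIME type, records basic metadata, and appends the result to an in-memory catalog, with file times standing in for the temporal bounds a real extractor would read from the data.

    # Simplified crawler-style ingest: walk a directory, identify MIME types,
    # extract basic metadata, and hand records to a catalog (here just a list).
    import mimetypes
    import os
    from datetime import datetime, timezone

    def extract_metadata(path):
        stat = os.stat(path)
        mime_type, _ = mimetypes.guess_type(path)
        return {
            "filename": os.path.basename(path),
            "mime_type": mime_type or "application/octet-stream",
            "size_bytes": stat.st_size,
            # Modification time as a stand-in for temporal bounds.
            "modified": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
        }

    def ingest(staging_dir, catalog):
        for root, _dirs, files in os.walk(staging_dir):
            for name in files:
                catalog.append(extract_metadata(os.path.join(root, name)))

    catalog = []
    ingest("/tmp/staging", catalog)   # hypothetical staging directory
    print(f"Cataloged {len(catalog)} files")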

  9. The Panchromatic STARBurst IRregular Dwarf Survey (STARBIRDS): Observations and Data Archive

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen B. W.; Mitchell, Noah P.; Skillman, Evan D.

    2015-06-01

    Understanding star formation in resolved low mass systems requires the integration of information obtained from observations at different wavelengths. We have combined new and archival multi-wavelength observations on a set of 20 nearby starburst and post-starburst dwarf galaxies to create a data archive of calibrated, homogeneously reduced images. Named the panchromatic “STARBurst IRregular Dwarf Survey” archive, the data are publicly accessible through the Mikulski Archive for Space Telescopes. This first release of the archive includes images from the Galaxy Evolution Explorer Telescope (GALEX), the Hubble Space Telescope (HST), and the Spitzer Space Telescope (Spitzer) Multiband Imaging Photometer instrument. The data sets include flux-calibrated, background-subtracted images that are registered to the same world coordinate system. Additionally, a set of images is available in which each image is cropped to match the HST field of view. The GALEX and Spitzer images are available with foreground and background contamination masked. Larger GALEX images extending to 4 times the optical extent of the galaxies are also available. Finally, HST images convolved with a 5″ point spread function and rebinned to the larger pixel scale of the GALEX and Spitzer 24 μm images are provided. Future additions are planned that will include data at other wavelengths such as Spitzer IRAC, ground-based Hα, Chandra X-ray, and Green Bank Telescope H i imaging. Based on observations made with the NASA/ESA Hubble Space Telescope, and obtained from the Hubble Legacy Archive, which is a collaboration between the Space Telescope Science Institute (STScI/NASA), the Space Telescope European Coordinating Facility (ST-ECF/ESA), and the Canadian Astronomy Data Centre (CADC/NRC/CSA).
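
    The PSF-matching and rebinning step mentioned above can be sketched in a few lines. The following is not the STARBIRDS pipeline; it smooths a synthetic high-resolution image with a Gaussian kernel of 5 arcsec FWHM and block-averages it to a coarser pixel grid, with the pixel scales and rebin factor chosen purely for illustration.

    # Illustrative PSF matching: convolve to a 5" Gaussian beam, then rebin.
    import numpy as np
    from astropy.convolution import Gaussian2DKernel, convolve_fft

    hst_image = np.random.rand(600, 600)     # stand-in for an HST image
    hst_pixel_scale = 0.05                   # arcsec/pixel (assumed)
    target_fwhm = 5.0                        # arcsec, per the survey description

    # Convert the target FWHM to a Gaussian sigma in HST pixels.
    sigma_pix = (target_fwhm / 2.355) / hst_pixel_scale
    smoothed = convolve_fft(hst_image, Gaussian2DKernel(x_stddev=sigma_pix))

    # Rebin by an integer factor through block averaging.
    factor = 30                              # assumed ratio of pixel scales
    ny = (smoothed.shape[0] // factor) * factor
    nx = (smoothed.shape[1] // factor) * factor
    rebinned = (smoothed[:ny, :nx]
                .reshape(ny // factor, factor, nx // factor, factor)
                .mean(axis=(1, 3)))
    print(rebinned.shape)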

  10. A guide to the National Space Science Data Center

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This is the second edition of a document that was published to acquaint space and Earth research scientists with an overview of the services offered by the NSSDC. As previously stated, the NSSDC was established by NASA to be the long term archive for data from its space missions. However, the NSSDC has evolved into an organization that provides a multitude of services for scientists throughout the world. Brief articles are presented which discuss these services. At the end of each article is the name, address, and telephone number of the person to contact for additional information. Online Information and Data Systems, Electronic Access, Offline Data Archive, Value Added Services, Mass Storage Activities, and Computer Science Research are all detailed.

  11. Archive of sediment data from vibracores collected in 2010 offshore of the Mississippi barrier islands

    USGS Publications Warehouse

    Kelso, Kyle W.; Flocks, James G.

    2015-01-01

    Selection of the core site locations was based on geophysical surveys conducted around the islands from 2008 to 2010. The surveys, using acoustic systems to image and interpret the near-surface stratigraphy, were conducted to investigate the geologic controls on island evolution. This data series serves as an archive of sediment data collected from August to September 2010, offshore of the Mississippi barrier islands. Data products, including descriptive core logs, core photographs, results of sediment grain-size analyses, sample location maps, and geographic information system (GIS) data files with accompanying formal Federal Geographic Data Committee (FGDC) metadata, can be downloaded from the data products and downloads page.

  12. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  13. Commercial imagery archive, management, exploitation, and distribution project development

    NASA Astrophysics Data System (ADS)

    Hollinger, Bruce; Sakkas, Alysa

    1999-10-01

    The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze, and disseminate in 'push' or 'pull' modes imagery, data, and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, Intelligent Library System (ILS)™, satisfies requirements for (1) a potentially unbounded data archive (5000 TB range); (2) automated workflow management for increased user productivity; (3) automatic tracking and management of files stored on shelves; (4) the ability to ingest, process, and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (5) access through a thin client-to-server network environment; (6) multiple interactive users needing retrieval of files in seconds, whether from archived images or in real time; and (7) scalability that maintains information throughput performance as the size of the digital library grows.

  14. Commercial imagery archive, management, exploitation, and distribution product development

    NASA Astrophysics Data System (ADS)

    Hollinger, Bruce; Sakkas, Alysa

    1999-12-01

    The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze, and disseminate in 'push' or 'pull' modes imagery, data, and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, Intelligent Library System (ILS)™, satisfies requirements for (a) a potentially unbounded data archive (5000 TB range); (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) the ability to ingest, process, and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds, whether from archived images or in real time; and (g) scalability that maintains information throughput performance as the size of the digital library grows.

  15. Lecture archiving on a larger scale at the University of Michigan and CERN

    NASA Astrophysics Data System (ADS)

    Herr, Jeremy; Lougheed, Robert; Neal, Homer A.

    2010-04-01

    The ATLAS Collaboratory Project at the University of Michigan has been a leader in the area of collaborative tools since 1999. Its activities include the development of standards, software, and hardware tools for lecture archiving, and making recommendations for videoconferencing and remote teaching facilities. Starting in 2006 our group became involved in classroom recordings, and in early 2008 we spawned CARMA, a University-wide recording service. This service uses a new portable recording system that we developed. Capture, archiving, and dissemination of rich multimedia content from lectures, tutorials, and classes are increasingly widespread activities among universities and research institutes. A growing array of related commercial and open source technologies is becoming available, with several new products introduced in the last couple of years. As the result of a new close partnership between U-M and CERN IT, a market survey of these products was conducted and a summary of the results is presented here. It is informing an ambitious effort in 2009 to equip many CERN rooms with automated lecture archiving systems, on a much larger scale than before. This new technology is being integrated with CERN's existing webcast, CDS, and Indico applications.

  16. Clinical Views: Object-Oriented Views for Clinical Databases

    PubMed Central

    Portoni, Luisa; Combi, Carlo; Pinciroli, Francesco

    1998-01-01

    We present here a prototype of a clinical information system for the archiving and the management of multimedia and temporally-oriented clinical data related to PTCA patients. The system is based on an object-oriented DBMS and supports multiple views and view schemas on patients' data. Remote data access is supported too.

  17. 75 FR 52992 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-30

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... records matter. The information will support adjustments in this offering that will improve the overall...

  18. Mapping the Socio-Technical Complexity of Australian Science: From Archival Authorities to Networks of Contextual Information

    ERIC Educational Resources Information Center

    McCarthy, Gavan; Evans, Joanne

    2007-01-01

    This article examines the evolution of a national register of the archives of science and technology in Australia and the related development of an archival informatics focused initially on people and their relationships to archival materials. The register was created in 1985 as an in-house tool for the Australian Science Archives Project of the…

  19. The Nation's Memory: The United States National Archives and Records Administration. An Interview with Don W. Wilson, Archivist of the United States, National Archives and Records Administration.

    ERIC Educational Resources Information Center

    Brodhead, Michael J.; Zink, Steven D.

    1993-01-01

    Discusses the National Archives and Records Administration (NARA) through an interview with the Archivist of the United States, Don Wilson. Topics addressed include archival independence and congressional relations; national information policy; expansion plans; machine-readable archival records; preservation activities; and relations with other…

  20. An XML-based Generic Tool for Information Retrieval in Solar Databases

    NASA Astrophysics Data System (ADS)

    Scholl, Isabelle F.; Legay, Eric; Linsolas, Romain

    This paper presents the current architecture of the 'Solar Web Project', now in its development phase. This tool will provide scientists interested in solar data with a single web-based interface for browsing distributed and heterogeneous catalogs of solar observations. The main goal is to have a generic application that can be easily extended to new sets of data or to new missions with a low level of maintenance. It is developed in Java, with XML used as a powerful configuration language. The server, independent of any database scheme, can communicate with a client (the user interface) and several local or remote archive access systems (such as existing web pages, ftp sites, or SQL databases). Archive access systems are externally described in XML files. The user interface is also dynamically generated from an XML file containing the window-building rules and a simplified database description. This project is developed at MEDOC (Multi-Experiment Data and Operations Centre), located at the Institut d'Astrophysique Spatiale (Orsay, France). Successful tests have been conducted with other solar archive access systems.
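
    The XML-driven design described here can be illustrated with a tiny sketch: an archive access system is described in an XML file, and a generic client reads that description to build a query. The element names and sample document below are invented for illustration and are not the project's actual schema.

    # Parse a (hypothetical) XML description of an archive access system and
    # build a query URL from it.
    import xml.etree.ElementTree as ET

    ARCHIVE_XML = """
    <archive name="sample-solar-archive">
      <endpoint>https://example.org/solar/query</endpoint>
      <parameter name="instrument"/>
      <parameter name="start_date"/>
      <parameter name="end_date"/>
    </archive>
    """

    root = ET.fromstring(ARCHIVE_XML)
    endpoint = root.findtext("endpoint")
    param_names = [p.get("name") for p in root.findall("parameter")]

    # A generic client only needs the XML description to know how to query.
    request = {"instrument": "EIT", "start_date": "1999-01-01", "end_date": "1999-01-02"}
    query = endpoint + "?" + "&".join(f"{k}={request[k]}" for k in param_names if k in request)
    print(query)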

  1. Metadata Design in the New PDS4 Standards - Something for Everybody

    NASA Astrophysics Data System (ADS)

    Raugh, Anne C.; Hughes, John S.

    2015-11-01

    The Planetary Data System (PDS) archives, supports, and distributes data of diverse targets, from diverse sources, to diverse users. One of the core problems addressed by the PDS4 data standard redesign was that of metadata - how to accommodate the increasingly sophisticated demands of search interfaces, analytical software, and observational documentation into label standards without imposing limits and constraints that would impinge on the quality or quantity of metadata that any particular observer or team could supply. And yet, as an archive, PDS must have detailed documentation for the metadata in the labels it supports, or the institutional knowledge encoded into those attributes will be lost - putting the data at risk. The PDS4 metadata solution is based on a three-step approach. First, it is built on two key ISO standards: ISO 11179 "Information Technology - Metadata Registries", which provides a common framework and vocabulary for defining metadata attributes; and ISO 14721 "Space Data and Information Transfer Systems - Open Archival Information System (OAIS) Reference Model", which provides the framework for the information architecture that enforces the object-oriented paradigm for metadata modeling. Second, PDS has defined a hierarchical system that allows it to divide its metadata universe into namespaces ("data dictionaries", conceptually), and more importantly to delegate stewardship for a single namespace to a local authority. This means that a mission can develop its own data model with a high degree of autonomy and effectively extend the PDS model to accommodate its own metadata needs within the common ISO 11179 framework. Finally, within a single namespace - even the core PDS namespace - existing metadata structures can be extended and new structures added to the model as new needs are identified. This poster illustrates the PDS4 approach to metadata management and highlights the expected return on the development investment for PDS, users, and data preparers.

  2. Attitudes of the Japanese public and doctors towards use of archived information and samples without informed consent: preliminary findings based on focus group interviews.

    PubMed

    Asai, Atsushi; Ohnishi, Motoki; Nishigaki, Etsuyo; Sekimoto, Miho; Fukuhara, Shunichi; Fukui, Tsuguya

    2002-01-09

    The purpose of this study is to explore laypersons' attitudes toward the use of archived (existing) materials such as medical records and biological samples and to compare them with the attitudes of physicians who are involved in medical research. Three focus group interviews were conducted, in which seven Japanese male members of the general public, seven female members of the general public and seven physicians participated. It was revealed that the lay public expressed diverse attitudes towards the use of archived information and samples without informed consent. Protecting a subject's privacy, maintaining confidentiality, and communicating the outcomes of studies to research subjects were regarded as essential preconditions if researchers were to have access to archived information and samples used for research without the specific informed consent of the subjects who provided the material. Although participating physicians thought that some kind of prior permission from subjects was desirable, they pointed out the difficulties involved in obtaining individual informed consent in each case. The present preliminary study indicates that the lay public and medical professionals may have different attitudes towards the use of archived information and samples without specific informed consent. This hypothesis, however, is derived from our focus groups interviews, and requires validation through research using a larger sample.

  3. A new dataset validation system for the Planetary Science Archive

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Zender, J.; Heather, D.; Martinez, S.

    2007-08-01

    The Planetary Science Archive is the official archive for the Mars Express mission. It received its first data by the end of 2004. These data are delivered by the PI teams to the PSA team as datasets, which are formatted in conformance with the Planetary Data System (PDS) standard. The PI teams are responsible for analyzing and calibrating the instrument data as well as for the production of reduced and calibrated data. They are also responsible for the scientific validation of these data. ESA is responsible for the long-term data archiving and distribution to the scientific community and must ensure, in this regard, that all archived products meet quality standards. To do so, an archive peer review is used to control the quality of the Mars Express science data archiving process. However, a full validation of its content has been missing. An independent review board recently recommended that the completeness of the archive as well as the consistency of the delivered data should be validated following well-defined procedures. A new validation software tool is being developed to complete the overall data quality control system functionality. This new tool aims to improve the quality of data and services provided to the scientific community through the PSA, and shall allow anomalies to be tracked and the completeness of datasets to be controlled. It shall ensure that the PSA end-users: (1) can rely on the result of their queries, (2) will get data products that are suitable for scientific analysis, and (3) can find all science data acquired during a mission. We define dataset validation as the verification and assessment process that checks the dataset content against pre-defined top-level criteria, which represent the general characteristics of good quality datasets. The dataset content that is checked includes the data and all types of information that are essential in the process of deriving scientific results and those interfacing with the PSA database. The validation software tool is a multi-mission tool that has been designed to provide the user with the flexibility of defining and implementing various types of validation criteria, to iteratively and incrementally validate datasets, and to generate validation reports.
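
    Criteria-driven validation of the kind described in this abstract can be sketched in a few lines. The criteria, field names, and report format below are illustrative assumptions, not the PSA tool's actual implementation.

    # Run a list of named validation criteria against a dataset description
    # and produce a simple pass/fail report.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Criterion:
        name: str
        check: Callable[[dict], bool]   # returns True when the dataset passes

    def validate(dataset: dict, criteria: List[Criterion]) -> List[str]:
        return [f"{'PASS' if c.check(dataset) else 'FAIL'}: {c.name}" for c in criteria]

    criteria = [
        Criterion("All products listed in the index are present",
                  lambda d: set(d["index"]) <= set(d["products"])),
        Criterion("Every product has a label",
                  lambda d: all(p in d["labels"] for p in d["products"])),
    ]

    dataset = {"index": ["p1", "p2"], "products": ["p1", "p2"], "labels": ["p1"]}
    print("\n".join(validate(dataset, criteria)))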

  4. 75 FR 10507 - Information Security Oversight Office; National Industrial Security Program Policy Advisory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-08

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Information Security Oversight Office; National Industrial Security Program Policy Advisory Committee (NISPPAC) AGENCY: National Archives and Records... individuals planning to attend must be submitted to the Information Security Oversight Office (ISOO) no later...

  5. 77 FR 53921 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-04

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... following information collections: 1. Title: Request to Microfilm Records. OMB number: 3095-0017. Agency...

  6. 76 FR 40296 - Declassification of National Security Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-08

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION 36 CFR Part 1260 [FDMS NARA-11-0001] RIN 3095-AB64 Declassification of National Security Information AGENCY: National Archives and Records Administration. ACTION... classified national security information in records transferred to NARA's legal custody. The rule...

  7. 78 FR 45569 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-29

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION [NARA-2013-039] Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA... make inquiries on their behalf and to release information and records related to their Freedom of...

  8. [Management and development of the dangerous preparation archive].

    PubMed

    Binetti, Roberto; Longo, Marcello; Scimonelli, Luigia; Costamagna, Francesca

    2006-01-01

    In the year 2000 an archive of dangerous preparations was created at the National Health Institute (Istituto Superiore di Sanità), following a principle included in Directive 88/379/EEC on dangerous preparations, subsequently modified by Directive 1999/45/EC, concerning the creation of a data bank on dangerous preparations in each European country. The information stored in the archive is useful for the protection of consumers' and workers' health and for prevention, particularly in cases of acute poisoning. The archive is fully computerized: companies can submit the information via the web, and the authorized Poison Centres can retrieve the information from the archive via the web. Each Member State has different procedures in place to comply with Directive 1999/45/EC; international coordination could therefore be useful in order to create a European network of national data banks on dangerous preparations.

  9. Archive of digital boomer and CHIRP seismic reflection data collected during USGS cruise 06FSH03 offshore of Fort Lauderdale, Florida, September 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Reich, Christopher D.; Wiese, Dana S.; Greenwood, Jason W.; Swarzenski, Peter W.

    2007-01-01

    In September of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Fort Lauderdale, FL. This report serves as an archive of unprocessed digital boomer and CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  10. An Archive of Downscaled WCRP CMIP3 Climate Projections for Planning Applications in the Contiguous United States

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Duffy, P. B.

    2007-12-01

    Incorporating climate change information into long-term evaluations of water and energy resources requires analysts to have access to climate projection data that have been spatially downscaled to "basin-relevant" resolution. This is necessary in order to develop system-specific hydrology and demand scenarios consistent with projected climate scenarios. Analysts currently have access to "climate model" resolution data (e.g., at LLNL PCMDI), but not spatially downscaled translations of these datasets. Motivated by a common interest in supporting regional and local assessments, the U.S. Bureau of Reclamation and LLNL (through support from the DOE National Energy Technology Laboratory) have teamed to develop an archive of downscaled climate projections (temperature and precipitation) with geographic coverage consistent with the North American Land Data Assimilation System domain, encompassing the contiguous United States. A web-based information service, hosted at the LLNL Green Data Oasis, has been developed to provide Reclamation, LLNL, and other interested analysts free access to archive content. A contemporary statistical method was used to bias-correct and spatially disaggregate projection datasets, and was applied to 112 projections included in the WCRP CMIP3 multi-model dataset hosted by LLNL PCMDI (i.e., 16 GCMs and their multiple simulations of the SRES A2, A1B, and B1 emissions pathways).
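
    One common form of the statistical bias correction mentioned above is quantile mapping, sketched below with synthetic data; this is a generic illustration, not necessarily the exact procedure used to build the archive.

    # Quantile-mapping bias correction: map future model values through the
    # historical model-to-observation quantile relationship.
    import numpy as np

    def quantile_map(model_hist, obs_hist, model_future):
        quantiles = np.linspace(0.01, 0.99, 99)
        model_q = np.quantile(model_hist, quantiles)
        obs_q = np.quantile(obs_hist, quantiles)
        # Find each future value's quantile in the historical model distribution,
        # then replace it with the observed value at that quantile.
        ranks = np.interp(model_future, model_q, quantiles)
        return np.interp(ranks, quantiles, obs_q)

    rng = np.random.default_rng(0)
    obs_hist = rng.normal(15.0, 3.0, 1000)       # observed temperatures (deg C)
    model_hist = rng.normal(13.0, 4.0, 1000)     # biased model, historical period
    model_future = rng.normal(15.0, 4.0, 1000)   # biased model, future period
    corrected = quantile_map(model_hist, obs_hist, model_future)
    print(round(float(model_future.mean()), 2), "->", round(float(corrected.mean()), 2))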

  11. Evaluation Methodologies for Information Management Systems; Building Digital Tobacco Industry Document Libraries at the University of California, San Francisco Library/Center for Knowledge Management; Experiments with the IFLA Functional Requirements for Bibliographic Records (FRBR); Coming to Term: Designing the Texas Email Repository Model.

    ERIC Educational Resources Information Center

    Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia

    2002-01-01

    Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…

  12. 78 FR 70543 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ...) evaluations/job performance, deployment status, sensitive items (e.g., access and accountable badges), awards... room is inside a Sensitive Compartmented Information Facility (SCIF). Retention and disposal: Disposition pending (until the National Archives and Records Administration approves retention and disposal...

  13. Handbook of sensor technical characteristics

    NASA Astrophysics Data System (ADS)

    Tanner, S.

    1982-07-01

    Space and terrestrial applications remote sensor systems are described. Each sensor is presented separately. Information is included on its objectives, description, technical characteristics, data products obtained, data archives location, period of operation, and measurement and potential derived parameters. Each sensor is cross indexed.

  14. Handbook of sensor technical characteristics

    NASA Technical Reports Server (NTRS)

    Tanner, S.

    1982-01-01

    Space and terrestrial applications remote sensor systems are described. Each sensor is presented separately. Information is included on its objectives, description, technical characteristics, data products obtained, data archives location, period of operation, and measurement and potential derived parameters. Each sensor is cross indexed.

  15. Evolving the Living With a Star Data System Definition

    NASA Astrophysics Data System (ADS)

    Otranto, J.; Dijoseph, M.; Worrall, W.

    2003-04-01

    NASA’s Living With a Star (LWS) Program is a space weather-focused and applications-driven research program. The LWS Program is soliciting input from the solar, space physics, space weather, and climate science communities to develop a system that enables access to science data associated with these disciplines, and advances the development of discipline and interdisciplinary findings. The LWS Program will implement a data system that builds upon the existing and planned data capture, processing, and storage components put in place by individual spacecraft missions and also inter-project data management systems, such as active archives, deep archives, and multi-mission repositories. It is technically feasible for the LWS Program to integrate data from a broad set of resources, assuming they are either publicly accessible or access is permitted by the system’s administrators. The LWS Program data system will work in coordination with spacecraft mission data systems and science data repositories, integrating them into a common data representation. This common representation relies on a robust metadata definition that provides journalistic and technical data descriptions, plus linkages to supporting data products and tools. The LWS Program intends to become an enabling resource to PIs, interdisciplinary scientists, researchers, and students, facilitating access both to a broad collection of science data and to the necessary supporting components to understand and make productive use of the data. For the LWS Program to represent science data that are physically distributed across various ground system elements, information about the data products stored on each system is collected through a series of LWS-created active agents. These active agents are customized to interface or interact with each one of these data systems, collect information, and forward updates to a single LWS-developed metadata broker. This broker, in turn, updates a centralized repository of LWS-specific metadata. A populated LWS metadata database is a single point-of-contact that can serve all users (the science community) with a “one-stop-shop” for data access. While data may not be physically stored in an LWS-specific repository, the LWS system enables data access from wherever the data are stored. Moreover, LWS provides the user access to information for understanding the data source, format, and calibration, enables access to ancillary and correlative data products, and provides links to processing tools and models associated with the data and to any corresponding findings. The LWS may also support an active archive for solar, space physics, space weather, and climate data when these data would otherwise be discarded or archived off-line. This archive could potentially serve as a backup facility for LWS missions. This plan is developed based upon input already received from the science community; the architecture is based on systems developed to date that have worked well on a smaller scale. The LWS Program continues to seek constructive input from the science community, examples of both successes and failures in dealing with science data systems, and insights regarding the obstacles between the current state-of-the-practice and this vision for the LWS Program data system.
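
    The agent-and-broker pattern described above can be reduced to a toy sketch: per-archive agents collect product descriptions and forward updates to a single broker that maintains the centralized metadata repository. The class and field names below are illustrative assumptions only.

    # Toy metadata agent/broker: agents scan their archive and push records
    # to a broker holding the central registry.
    class MetadataBroker:
        def __init__(self):
            self.registry = {}   # product_id -> metadata record

        def update(self, record):
            self.registry[record["product_id"]] = record

    class ArchiveAgent:
        def __init__(self, archive_name, broker):
            self.archive_name = archive_name
            self.broker = broker

        def scan(self, products):
            """Forward one metadata record per product held at this archive."""
            for product_id, description in products.items():
                self.broker.update({
                    "product_id": product_id,
                    "archive": self.archive_name,
                    "description": description,
                })

    broker = MetadataBroker()
    ArchiveAgent("solar-archive", broker).scan({"eit_19990101": "EIT full-disk images"})
    ArchiveAgent("geospace-archive", broker).scan({"ace_swepam_1999": "Solar wind plasma"})
    print(sorted(broker.registry))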

  16. Geodetic Seamless Archive Centers Modernization - Information Technology for Exploiting the Data Explosion

    NASA Astrophysics Data System (ADS)

    Boler, F. M.; Blewitt, G.; Kreemer, C. W.; Bock, Y.; Noll, C. E.; McWhirter, J.; Jamason, P.; Squibb, M. B.

    2010-12-01

    Space geodetic science and other disciplines using geodetic products have benefited immensely from open sharing of data and metadata from global and regional archives. Ten years ago, Scripps Orbit and Permanent Array Center (SOPAC), the NASA Crustal Dynamics Data Information System (CDDIS), UNAVCO, and other archives collaborated to create the GPS Seamless Archive Centers (GSAC) in an effort to further enable research with the expanding collections of GPS data then becoming available. The GSAC partners share metadata to facilitate data discovery and mining across participating archives and distribution of data to users. This effort was pioneering, but was built on technology that has now been rendered obsolete. As the number of geodetic observing technologies has expanded, the variety of data and data products has grown dramatically, exposing limitations in data product sharing. Through a NASA ROSES project, the three archives (CDDIS, SOPAC, and UNAVCO) have been funded to expand the original GSAC capability for multiple geodetic observation types and to simultaneously modernize the underlying technology by implementing web services. The University of Nevada, Reno (UNR) will test the web services implementation by incorporating them into their daily GNSS data processing scheme. The effort will include new methods for quality control of current and legacy data that will be a product of the analysis/testing phase performed by UNR. The quality analysis by UNR will include a report on the stability of station coordinates over time that will enable data users to select sites suitable for their application, for example identifying stations with large seasonal effects. This effort will contribute to an enhanced ability for very large networks to obtain complete data sets for processing.

  17. A MySQL Based EPICS Archiver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher Slominski

    2009-10-01

    Archiving a large fraction of the EPICS signals within the Jefferson Lab (JLAB) Accelerator control system is vital for postmortem and real-time analysis of the accelerator performance. This analysis is performed on a daily basis by scientists, operators, engineers, technicians, and software developers. Archiving poses unique challenges due to the magnitude of the control system. A MySQL Archiving system (Mya) was developed to scale to the needs of the control system; it currently archives 58,000 EPICS variables, updating at a rate of 11,000 events per second. In addition to the large collection rate, retrieval of the archived data must also be fast and robust. Archived data retrieval clients obtain data at a rate of over 100,000 data points per second. Managing the data in a relational database provides a number of benefits. This paper describes an archiving solution that uses an open source database and standard off-the-shelf hardware to meet high-performance archiving needs. Mya has been in production at Jefferson Lab since February of 2007.
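
    The basic pattern behind a relational-database channel archiver is to buffer incoming (channel, timestamp, value) events and write them in batches. The Python sketch below uses mysql-connector-python against an assumed schema and credentials; it is an illustration of the idea, not the Mya implementation, and running it requires an actual MySQL instance.

    # Buffer EPICS-style channel events and insert them into MySQL in batches.
    import mysql.connector  # pip install mysql-connector-python

    INSERT_SQL = "INSERT INTO channel_history (channel, event_time, value) VALUES (%s, %s, %s)"

    def archive_events(events, batch_size=5000):
        conn = mysql.connector.connect(
            host="localhost", user="archiver", password="secret", database="mya_demo"
        )
        cursor = conn.cursor()
        batch = []
        for event in events:          # events yields (channel, timestamp, value) tuples
            batch.append(event)
            if len(batch) >= batch_size:
                cursor.executemany(INSERT_SQL, batch)
                conn.commit()
                batch.clear()
        if batch:                     # flush the final partial batch
            cursor.executemany(INSERT_SQL, batch)
            conn.commit()
        cursor.close()
        conn.close()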

  18. Picture archiving and communication systems (PACS).

    PubMed

    Gamsu, Gordon; Perez, Enrico

    2003-07-01

    Over the past 2 decades, groups of computer scientists, electronic design engineers, and physicians, in universities and industry, have worked to achieve an electronic environment for the practice of medicine and radiology. The radiology component of this revolution is often called PACS (picture archiving and communication systems). More recently it has become evident that the efficiencies and cost savings of PACS are realized when they are part of an enterprise-wide electronic medical record. The installation of PACS requires careful planning by all the various stakeholders over many months prior to installation. All of the users must be aware of the initial disruption that will occur as they become familiar with the systems. Modern fourth-generation PACS is linked to radiology and hospital information systems. The PACS consists of electronic acquisition sites, a robust network intelligently managed by a server, multiple viewing sites, and an archive. The details of how these are linked and their workflow analysis determine the success of PACS. PACS evolves over time and components are frequently replaced, so the users must expect continuous learning about new updates and improved functionality. The digital medical revolution is rapidly being adopted in many medical centers, improving patient care and the success of the institution.

  19. Global Gridded Data from the Goddard Earth Observing System Data Assimilation System (GEOS-DAS)

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The Goddard Earth Observing System Data Assimilation System (GEOS-DAS) timeseries is a globally gridded atmospheric data set for use in climate research. This near real-time data set is produced by the Data Assimilation Office (DAO) at the NASA Goddard Space Flight Center in direct support of the operational EOS instrument product generation from the Terra (12/1999 launch), Aqua (05/2002 launch), and Aura (01/2004 launch) spacecraft. The data is archived in the EOS Core System (ECS) at the Goddard Earth Sciences Data and Information Services Center/Distributed Active Archive Center (GES DISC DAAC). The data is only a selection of the products available from the GEOS-DAS. The data is organized chronologically in timeseries format to facilitate the computation of statistics. GEOS-DAS data will be available for the time period January 1, 2000, through the present.

  20. Bridging the integration gap between imaging and information systems: a uniform data concept for content-based image retrieval in computer-aided diagnosis.

    PubMed

    Welter, Petra; Riesmeier, Jörg; Fischer, Benedikt; Grouls, Christoph; Kuhl, Christiane; Deserno, Thomas M

    2011-01-01

    It is widely accepted that content-based image retrieval (CBIR) can be extremely useful for computer-aided diagnosis (CAD). However, CBIR has not been established in clinical practice yet. As a widely unattended gap of integration, a unified data concept for CBIR-based CAD results and reporting is lacking. Picture archiving and communication systems and the workflow of radiologists must be considered for successful data integration to be achieved. We suggest that CBIR systems applied to CAD should integrate their results in a picture archiving and communication systems environment such as Digital Imaging and Communications in Medicine (DICOM) structured reporting documents. A sample DICOM structured reporting template adaptable to CBIR and an appropriate integration scheme is presented. The proposed CBIR data concept may foster the promulgation of CBIR systems in clinical environments and, thereby, improve the diagnostic process.

  1. Bridging the integration gap between imaging and information systems: a uniform data concept for content-based image retrieval in computer-aided diagnosis

    PubMed Central

    Riesmeier, Jörg; Fischer, Benedikt; Grouls, Christoph; Kuhl, Christiane; Deserno (né Lehmann), Thomas M

    2011-01-01

    It is widely accepted that content-based image retrieval (CBIR) can be extremely useful for computer-aided diagnosis (CAD). However, CBIR has not been established in clinical practice yet. As a widely unattended gap of integration, a unified data concept for CBIR-based CAD results and reporting is lacking. Picture archiving and communication systems and the workflow of radiologists must be considered for successful data integration to be achieved. We suggest that CBIR systems applied to CAD should integrate their results in a picture archiving and communication systems environment such as Digital Imaging and Communications in Medicine (DICOM) structured reporting documents. A sample DICOM structured reporting template adaptable to CBIR and an appropriate integration scheme is presented. The proposed CBIR data concept may foster the promulgation of CBIR systems in clinical environments and, thereby, improve the diagnostic process. PMID:21672913
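
    Packaging a CBIR result in a DICOM Structured Reporting object, as proposed above, can be sketched with pydicom. The dataset below is deliberately minimal and uses placeholder codes; it is not a conformant SR template, only an illustration of the integration idea.

    # Build a minimal SR-style pydicom dataset describing CBIR similarity hits.
    from pydicom.dataset import Dataset

    def cbir_result_to_sr(query_image_uid, similar_cases):
        sr = Dataset()
        sr.Modality = "SR"
        sr.SeriesDescription = "CBIR similarity report (illustrative)"
        sr.ContentSequence = []
        for case_uid, score in similar_cases:
            item = Dataset()
            item.RelationshipType = "CONTAINS"
            item.ValueType = "TEXT"
            name = Dataset()
            name.CodeValue = "00001"                 # placeholder private code
            name.CodingSchemeDesignator = "99DEMO"   # placeholder scheme
            name.CodeMeaning = "Similar case"
            item.ConceptNameCodeSequence = [name]
            item.TextValue = f"case {case_uid} similarity {score:.2f} (query {query_image_uid})"
            sr.ContentSequence.append(item)
        return sr

    report = cbir_result_to_sr("1.2.3.4", [("1.2.3.10", 0.93), ("1.2.3.11", 0.87)])
    print(report)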

  2. NASA's EOSDIS Cumulus: Ingesting, Archiving, Managing, and Distributing Earth Science Data from the Commercial Cloud

    NASA Technical Reports Server (NTRS)

    Baynes, Katie; Ramachandran, Rahul; Pilone, Dan; Quinn, Patrick; Gilman, Jason; Schuler, Ian; Jazayeri, Alireza

    2017-01-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been working towards a vision of a cloud-based, highly flexible ingest, archive, management, and distribution system for its ever-growing and evolving data holdings. This system, Cumulus, is emerging from its prototyping stages and is poised to make a huge impact on how NASA manages and disseminates its Earth science data. This talk will outline the motivation for this work, present the achievements and hurdles of the past 18 months, and chart a course for the future expansion of Cumulus. We will explore not just the technical but also the socio-technical challenges that we face in evolving a system of this magnitude into the cloud, and how we are rising to meet those challenges through open collaboration and intentional stakeholder engagement.

  3. Transaction aware tape-infrastructure monitoring

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Fotios; Kruse, Daniele Francesco

    2014-06-01

    Administering a large-scale, multi-protocol, hierarchical tape infrastructure like the CERN Advanced STORage manager (CASTOR)[2], which now stores 100 PB (growing by 25 PB per year), requires an adequate monitoring system for quick spotting of malfunctions, easier debugging, and on-demand report generation. The main challenges for such a system are: coping with CASTOR's log format diversity and its information scattered among several log files, the need for long-term information archival, the strict reliability requirements, and the group-based GUI visualization. For this purpose, we have designed, developed, and deployed a centralized system consisting of four independent layers: the Log Transfer layer for collecting log lines from all tape servers to a single aggregation server, the Data Mining layer for combining log data into transaction context, the Storage layer for archiving the resulting transactions, and finally the Web UI layer for accessing the information. With flexibility, extensibility, and maintainability in mind, each layer is designed to work as a message broker for the next layer, providing a clean and generic interface while ensuring consistency, redundancy, and ultimately fault tolerance. This system unifies information previously dispersed over several monitoring tools into a single user interface, using Splunk, which also allows us to provide information visualization based on access control lists (ACL). Since its deployment, it has been successfully used by CASTOR tape operators for quick overview of transactions, performance evaluation, and malfunction detection, and by managers for report generation.
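
    The "Data Mining layer" idea of combining scattered log lines into per-transaction records can be shown schematically. The log format, regular expression, and field names below are invented for illustration; CASTOR's real logs differ.

    # Group individual log lines into transactions keyed on a request identifier.
    import re
    from collections import defaultdict

    LINE_RE = re.compile(r"(?P<time>\S+) (?P<host>\S+) req=(?P<req>\S+) (?P<msg>.*)")

    def build_transactions(log_lines):
        transactions = defaultdict(list)
        for line in log_lines:
            match = LINE_RE.match(line)
            if match:
                transactions[match["req"]].append((match["time"], match["host"], match["msg"]))
        return transactions

    sample = [
        "2014-06-01T10:00:00 tapesrv01 req=42 mount requested",
        "2014-06-01T10:02:10 tapesrv01 req=42 file recalled",
        "2014-06-01T10:00:05 tapesrv02 req=43 mount requested",
    ]
    for req, events in build_transactions(sample).items():
        print(req, "->", len(events), "events")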

  4. Implementation of Consolidated HIS: Improving Quality and Efficiency of Healthcare

    PubMed Central

    Choi, Jinwook; Seo, Jeong-Wook; Chung, Chun Kee; Kim, Kyung-Hwan; Kim, Ju Han; Kim, Jong Hyo; Chie, Eui Kyu; Cho, Hyun-Jai; Goo, Jin Mo; Lee, Hyuk-Joon; Wee, Won Ryang; Nam, Sang Mo; Lim, Mi-Sun; Kim, Young-Ah; Yang, Seung Hoon; Jo, Eun Mi; Hwang, Min-A; Kim, Wan Suk; Lee, Eun Hye; Choi, Su Hi

    2010-01-01

    Objectives Adoption of hospital information systems offers distinctive advantages in healthcare delivery. Implementation of a consolidated hospital information system at Seoul National University Hospital led to significant improvements in the quality of healthcare and the efficiency of hospital management. Methods The hospital information system at Seoul National University Hospital consists of component applications: clinical information systems, clinical research support systems, administrative information systems, management information systems, education support systems, and referral systems, which operate together to deliver the best possible performance in healthcare services. Results Clinical information systems, which consist of such applications as electronic medical records and picture archiving and communication systems, primarily support clinical activities. The clinical research support system provides valuable resources supporting various aspects of clinical activities, ranging from management of clinical laboratory tests to establishing care-giving procedures. Conclusions Seoul National University Hospital strives to move its hospital information system to a whole new level, one which enables customized healthcare service and fulfills individual requirements. The current information strategy is being formulated as an initial step of development, promoting the establishment of a next-generation hospital information system. PMID:21818449

  5. Earth observation archive activities at DRA Farnborough

    NASA Technical Reports Server (NTRS)

    Palmer, M. D.; Williams, J. M.

    1993-01-01

    Space Sector, Defence Research Agency (DRA), Farnborough has been actively involved in the acquisition and processing of Earth Observation data for over 15 years. During that time an archive of over 20,000 items has been built up. This paper describes the major archive activities, including: operation and maintenance of the main DRA Archive, the development of a prototype Optical Disc Archive System (ODAS), the catalog systems in use at DRA, the UK Processing and Archive Facility for ERS-1 data, and future plans for archiving activities.

  6. Characterizing Space Environments with Long-Term Space Plasma Archive Resources

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Miller, J. Scott; Diekmann, Anne M.; Parker, Linda N.

    2009-01-01

    A significant scientific benefit of establishing and maintaining long-term space plasma data archives is the ready access the archives afford to resources required for characterizing spacecraft design environments. Space systems must be capable of operating in the mean environments driven by climatology as well as the extremes that occur during individual space weather events. Long-term time series are necessary to obtain quantitative information on environment variability and extremes that characterize the mean and worst-case environments that may be encountered during a mission. In addition, analysis of large data sets is important to scientific studies of flux-limiting processes that provide a basis for establishing upper limits to environment specifications used in radiation or charging analyses. We present applications using data from existing archives and highlight their contributions to space environment models developed at Marshall Space Flight Center, including the Chandra Radiation Model, ionospheric plasma variability models, and plasma models of the L2 space environment.

  7. MODIS Information, Data, and Control System (MIDACS) system specifications and conceptual design

    NASA Technical Reports Server (NTRS)

    Han, D.; Salomonson, V.; Ormsby, J.; Ardanuy, P.; Mckay, A.; Hoyt, D.; Jaffin, S.; Vallette, B.; Sharts, B.; Folta, D.

    1988-01-01

    The MODIS Information, Data, and Control System (MIDACS) Specifications and Conceptual Design Document discusses system level requirements, the overall operating environment in which requirements must be met, and a breakdown of MIDACS into component subsystems, which include the Instrument Support Terminal, the Instrument Control Center, the Team Member Computing Facility, the Central Data Handling Facility, and the Data Archive and Distribution System. The specifications include sizing estimates for the processing and storage capacities of each data system element, as well as traffic analyses of data flows between the elements internally, and also externally across the data system interfaces. The specifications for the data system, as well as for the individual planning and scheduling, control and monitoring, data acquisition and processing, calibration and validation, and data archive and distribution components, do not yet fully specify the data system in the complete manner needed to achieve the scientific objectives of the MODIS instruments and science teams. The teams have not yet been formed; however, it was possible to develop the specifications and conceptual design based on the present concept of EosDIS, the Level-1 and Level-2 Functional Requirements Documents, the Operations Concept, and through interviews and meetings with key members of the scientific community.

  8. Preserving the Pyramid of STI Using Buckets

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt

    2004-01-01

    The product of research projects is information. Through the life cycle of a project, information comes from many sources and takes many forms. Traditionally, this body of information is summarized in a formal publication, typically a journal article. While formal publications enjoy the benefits of peer review and technical editing, they are also often compromises in media format and length. As such, we consider a formal publication to represent an abstract to a larger body of work: a pyramid of scientific and technical information (STI). While this abstract may be sufficient for some applications, an in-depth use or analysis is likely to require the supporting layers from the pyramid. We have developed buckets to preserve this pyramid of STI. Buckets provide an archive- and protocol-independent container construct in which all related information objects can be logically grouped together, archived, and manipulated as a single object. Furthermore, buckets are active archival objects and can communicate with each other, people, or arbitrary network services. Buckets are an implementation of the Smart Object, Dumb Archive (SODA) DL model. In SODA, data objects are more important than the archives that hold them. Much of the functionality traditionally associated with archives is pushed down into the objects, such as enforcing terms and conditions, negotiating display, and content maintenance. In this paper, we discuss the motivation, design, and implication of bucket use in DLs with respect to grey literature.

  9. A New Way of Making Cultural Information Resources Visible on the Web: Museums and the Open Archive Initiative.

    ERIC Educational Resources Information Center

    Perkins, John

    Museums hold enormous amounts of information in collections management systems and publish academic and scholarly research in print journals, exhibition catalogs, virtual museum presentations, and community publications. Much of this rich content is unavailable to web search engines or otherwise gets lost in the vastness of the World Wide Web. The…

  10. Global positioning system survey data for active seismic and volcanic areas of eastern Sicily, 1994 to 2013

    PubMed Central

    Bonforte, Alessandro; Fagone, Sonia; Giardina, Carmelo; Genovese, Simone; Aiesi, Gianpiero; Calvagna, Francesco; Cantarero, Massimo; Consoli, Orazio; Consoli, Salvatore; Guglielmino, Francesco; Puglisi, Biagio; Puglisi, Giuseppe; Saraceno, Benedetto

    2016-01-01

    This work presents and describes a 20-year-long database of GPS data collected by geodetic surveys over the seismically and volcanically active eastern Sicily, for a total of more than 6300 measurements. Raw data were initially collected from the various archives at the Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania—Osservatorio Etneo and organized in a single repository. Here, quality and completeness checks were performed, while all necessary supplementary information was searched for, collected, validated, and organized together with the relevant data. Once all data and information collections were complete, raw binary data were converted into the universal ASCII RINEX format; all data are provided in this format with the necessary information for precise processing. In order to make the data archive readily consultable, we developed software allowing the user to easily search for and obtain the needed data through simple alphanumeric and geographic queries. PMID:27479914

  11. Global positioning system survey data for active seismic and volcanic areas of eastern Sicily, 1994 to 2013.

    PubMed

    Bonforte, Alessandro; Fagone, Sonia; Giardina, Carmelo; Genovese, Simone; Aiesi, Gianpiero; Calvagna, Francesco; Cantarero, Massimo; Consoli, Orazio; Consoli, Salvatore; Guglielmino, Francesco; Puglisi, Biagio; Puglisi, Giuseppe; Saraceno, Benedetto

    2016-08-01

    This work presents and describes a 20-year-long database of GPS data collected by geodetic surveys over the seismically and volcanically active eastern Sicily, for a total of more than 6300 measurements. Raw data were initially collected from the various archives at the Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania-Osservatorio Etneo and organized in a single repository. Here, quality and completeness checks were performed, while all necessary supplementary information was searched for, collected, validated, and organized together with the relevant data. Once all data and information collections were complete, raw binary data were converted into the universal ASCII RINEX format; all data are provided in this format with the necessary information for precise processing. In order to make the data archive readily consultable, we developed software allowing the user to easily search for and obtain the needed data through simple alphanumeric and geographic queries.
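
    As a rough illustration of the kind of alphanumeric and geographic queries described above, the sketch below filters survey records by station-name prefix, bounding box, and date range; the record structure, station names, and function names are hypothetical stand-ins, not the authors' software.

    ```python
    # Illustrative sketch only: hypothetical survey metadata records and a simple
    # query function combining alphanumeric (station name), geographic (bounding
    # box), and temporal filters.
    from datetime import date

    surveys = [
        {"station": "EMGL", "lat": 37.77, "lon": 15.01, "date": date(2002, 7, 14)},
        {"station": "IIV",  "lat": 37.52, "lon": 15.08, "date": date(1998, 9, 30)},
    ]

    def query(records, name_prefix=None, bbox=None, start=None, end=None):
        """Filter survey records by name prefix, lat/lon bounding box, and date range."""
        out = []
        for r in records:
            if name_prefix and not r["station"].startswith(name_prefix):
                continue
            if bbox:
                lat_min, lat_max, lon_min, lon_max = bbox
                if not (lat_min <= r["lat"] <= lat_max and lon_min <= r["lon"] <= lon_max):
                    continue
            if start and r["date"] < start:
                continue
            if end and r["date"] > end:
                continue
            out.append(r)
        return out

    # Example: all measurements inside a small box on the eastern flank after 2000.
    print(query(surveys, bbox=(37.6, 37.9, 14.9, 15.2), start=date(2000, 1, 1)))
    ```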

  12. Analysis of environmental sounds

    NASA Astrophysics Data System (ADS)

    Lee, Keansub

    Environmental sound archives - casual recordings of people's daily life - are easily collected with MP3 players or camcorders at low cost and high reliability, and shared on websites. There are two kinds of user-generated recordings we would like to be able to handle in this thesis: continuous long-duration personal audio and the soundtracks of short consumer video clips. These environmental recordings contain a lot of useful information (semantic concepts) related to activity, location, occasion, and content. As a consequence, these environmental archives present many new opportunities for the automatic extraction of information that can be used in intelligent browsing systems. This thesis proposes systems for detecting such concepts in a collection of these real-world recordings. The first system segments and labels personal audio archives - continuous recordings of an individual's everyday experiences - into 'episodes' (relatively consistent acoustic situations lasting a few minutes or more) using the Bayesian Information Criterion and spectral clustering. The second system identifies regions of speech or music in the kinds of energetic and highly variable noise present in this real-world sound. Motivated by psychoacoustic evidence that pitch is crucial in the perception and organization of sound, we develop a noise-robust pitch detection algorithm to locate speech- or music-like regions. To avoid false alarms resulting from background noise with strong periodic components (such as air-conditioning), a new scheme is added to suppress these noises in the autocorrelogram domain. The third system automatically detects a large set of interesting semantic concepts, which we chose for being both informative and useful to users, as well as technically feasible. These 25 concepts are associated with people's activities, locations, occasions, objects, scenes, and sounds, and are based on a large collection of consumer videos in conjunction with user studies. We model the soundtrack of each video, regardless of its original duration, as a fixed-size clip-level summary feature. For each concept, an SVM-based classifier is trained according to three distance measures (Kullback-Leibler, Bhattacharyya, and Mahalanobis distance). Detecting the time of occurrence of a local object (for instance, a cheering sound) embedded in a longer soundtrack is useful and important for applications such as search and retrieval in consumer video archives. We finally present a Markov-model-based clustering algorithm able to identify and segment consistent sets of temporal frames into regions associated with different ground-truth labels, and at the same time to exclude a set of uninformative frames shared in common across all clips. The labels are provided at the clip level, so this refinement of the time axis represents a variant of Multiple-Instance Learning (MIL). Quantitative evaluation shows that the performance of the proposed approaches, tested on 60 hours of personal audio archives and 1,900 YouTube video clips, is significantly better than that of existing algorithms for detecting these useful concepts in real-world recordings.
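
    A minimal sketch of the Bayesian Information Criterion change-point test commonly used for this kind of episode segmentation is shown below; it is illustrative only (not the thesis code) and assumes MFCC-like feature frames drawn from synthetic data.

    ```python
    # BIC change-point test: frames on either side of a candidate boundary are
    # modeled as single full-covariance Gaussians and compared against one
    # Gaussian over the whole window; a positive delta-BIC supports a boundary.
    import numpy as np

    def delta_bic(X, split, lam=1.0):
        """X: frames x features; positive return value favors a boundary at `split`."""
        n, d = X.shape

        def logdet(A):
            return np.linalg.slogdet(np.cov(A, rowvar=False))[1]

        penalty = 0.5 * lam * (d + d * (d + 1) / 2) * np.log(n)
        return 0.5 * (n * logdet(X)
                      - split * logdet(X[:split])
                      - (n - split) * logdet(X[split:])) - penalty

    rng = np.random.default_rng(0)
    # Two simulated acoustic conditions, 13-dimensional MFCC-like frames.
    X = np.vstack([rng.normal(0, 1, (200, 13)), rng.normal(3, 2, (200, 13))])
    print(delta_bic(X, 200) > 0)   # True: a boundary at frame 200 is supported
    ```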

  13. New Developments in NOAA's Comprehensive Large Array-Data Stewardship System

    NASA Astrophysics Data System (ADS)

    Ritchey, N. A.; Morris, J. S.; Carter, D. J.

    2012-12-01

    The Comprehensive Large Array-data Stewardship System (CLASS) is part of the NOAA strategic goal of Climate Adaptation and Mitigation, which gives focus to building and sustaining key observational assets and data archives critical to maintaining the global climate record. Since 2002, CLASS has been NOAA's enterprise solution for ingesting, storing, and providing access to a host of near real-time remote sensing streams such as the Polar and Geostationary Operational Environmental Satellites (POES and GOES) and the Defense Meteorological Satellite Program (DMSP). Since October 2011, CLASS has also been the dedicated Archive Data Segment (ADS) of the Suomi National Polar-orbiting Partnership (S-NPP). As the ADS, CLASS receives raw and processed S-NPP records for archival and distribution to the broad user community. Moving beyond remote sensing and model data, NOAA has endorsed a plan to migrate all archive holdings from NOAA's National Data Centers into CLASS while retiring the various disparate legacy data storage systems residing at the National Climatic Data Center (NCDC), the National Geophysical Data Center (NGDC), and the National Oceanographic Data Center (NODC). In parallel with this data migration, CLASS is evolving to a service-oriented architecture utilizing cloud technologies for dissemination, in addition to clearly defined interfaces that allow better collaboration with partners. This evolution will require implementation of standard access protocols and metadata, which will lead to cost-effective data and information preservation.

  14. Integration experiences and performance studies of a COTS parallel archive system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-01-01

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interfaces, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same things but at one or more orders of magnitude faster performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially in metadata searching, with more caching and less robust semantics. Currently, the number of extremely scalable parallel archive solutions is very small, especially solutions that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products, including (a) parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high-volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging the free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner, and demonstrated its capability to address the requirements of future archival storage systems.

  15. Integration experiments and performance studies of a COTS parallel archive system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-06-16

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interfaces, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same things but at one or more orders of magnitude faster performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially in metadata searching, with more caching and less robust semantics. Currently, the number of extremely scalable parallel archive solutions is very small, especially solutions that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products, including (a) parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high-volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging the free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner machine, and demonstrated its capability to address the requirements of future archival storage systems.
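
    The core idea of moving one large striped file in parallel can be illustrated with a toy sketch that assigns byte ranges of the source file to worker threads, each writing its range to a separate destination; this is illustrative only and is not the GPFS/backup-product integration the paper describes.

    ```python
    # Toy sketch: split a single large file into N byte ranges and copy them
    # concurrently, each range to its own destination (standing in for separate
    # tapes or volumes). Pure standard library; paths are placeholders.
    import os
    from concurrent.futures import ThreadPoolExecutor

    def copy_range(src, dst, offset, length, bufsize=8 << 20):
        """Copy `length` bytes starting at `offset` from src into dst."""
        with open(src, "rb") as fin, open(dst, "wb") as fout:
            fin.seek(offset)
            remaining = length
            while remaining > 0:
                chunk = fin.read(min(bufsize, remaining))
                if not chunk:
                    break
                fout.write(chunk)
                remaining -= len(chunk)

    def parallel_archive(src, n_streams=4):
        size = os.path.getsize(src)
        stride = -(-size // n_streams)  # ceiling division
        with ThreadPoolExecutor(max_workers=n_streams) as pool:
            for i in range(n_streams):
                off = i * stride
                length = max(0, min(stride, size - off))
                pool.submit(copy_range, src, f"{src}.part{i}", off, length)

    # parallel_archive("/scratch/big_striped_file.dat", n_streams=8)
    ```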

  16. A Testbed Demonstration of an Intelligent Archive in a Knowledge Building System

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram; Isaac, David; Morse, Steve; Yang, Wenli; Bonnlander, Brian; McConaughy, Gail; Di, Liping; Danks, David

    2005-01-01

    The last decade's influx of raw data and derived geophysical parameters from several Earth observing satellites to NASA data centers has created a data-rich environment for Earth science research and applications. While advances in hardware and information management have made it possible to archive petabytes of data and distribute terabytes of data daily to a broad community of users, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications, in order to realize the full potential of these valuable datasets. In examining what is needed to enable this progress in the data provider environment that exists today and is expected to evolve in the next several years, we arrived at the concept of an Intelligent Archive in the context of a Knowledge Building System (IA/KBS). Our prior work and associated papers investigated usage scenarios, required capabilities, system architecture, data volume issues, and supporting technologies. We identified six key capabilities of an IA/KBS: Virtual Product Generation, Significant Event Detection, Automated Data Quality Assessment, Large-Scale Data Mining, Dynamic Feedback Loop, and Data Discovery and Efficient Requesting. Among these capabilities, large-scale data mining is perceived by many in the community to be an area of technical risk. One of the main reasons for this is that standard data mining research and algorithms operate on datasets that are several orders of magnitude smaller than the actual sizes of datasets maintained by realistic Earth science data archives. Therefore, we defined a testbed activity to implement a large-scale data mining algorithm in a pseudo-operational-scale environment and to examine any issues involved. The application chosen for applying the data mining algorithm is wildfire prediction over the continental U.S. This paper reports a number of observations based on our experience with this testbed. While proof-of-concept for data mining scalability and utility has been a major goal for the research reported here, it was not the only one. The other five capabilities of an IA/KBS named above have been considered as well, and an assessment of the implications of our experience for these other areas will also be presented. The lessons learned through the testbed effort and presented in this paper will benefit technologists, scientists, and system operators as they consider introducing IA/KBS capabilities into production systems.

  17. The DICOM Standard: A Brief Overview

    NASA Astrophysics Data System (ADS)

    Gibaud, Bernard

    The DICOM standard has now become the uncontested standard for the exchange and management of biomedical images. Everyone acknowledges its prominent role in the emergence of multi-vendor Picture Archiving and Communication Systems (PACS), and their successful integration with Hospital Information Systems and Radiology Information Systems, thanks to the Integrating the Healthcare Enterprise (IHE) initiative. We introduce here the basic concepts retained for the definition of objects and services in DICOM, with the hope that it will help the reader to find his or her way in the vast DICOM documentation available on the web.
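
    As a small illustration of the DICOM object model in practice, the sketch below uses the open-source pydicom library (not part of the standard itself) to read a file and inspect a few of its tagged attributes; the file name is a placeholder.

    ```python
    # A DICOM file pairs a data set of tagged attributes (patient, study, series,
    # instance) with the pixel data itself; these identifiers are what PACS, RIS,
    # and HIS integration relies on.
    import pydicom

    ds = pydicom.dcmread("ct_slice.dcm")           # placeholder path
    print(ds.SOPClassUID)                          # identifies the information object (e.g., CT Image Storage)
    print(ds.PatientID, ds.StudyInstanceUID)       # identifiers used when routing between systems
    print(ds.Modality, ds.Rows, ds.Columns)        # image-level attributes
    pixels = ds.pixel_array                        # decoded pixel data as a NumPy array
    ```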

  18. Migration Stories: Upgrading a PDS Archive to PDS4

    NASA Astrophysics Data System (ADS)

    Kazden, D. P.; Walker, R. J.; Mafi, J. N.; King, T. A.; Joy, S. P.; Moon, I. S.

    2015-12-01

    Increasing bandwidth, storage capacity, and computational capabilities have greatly increased our ability to access and use data. A significant challenge, however, is to make data archived under older standards useful in the new data environments. NASA's Planetary Data System (PDS) recently released version 4 of its information model (PDS4). PDS4 offers several advantages over previous versions. It adopts the XML standard for metadata, expresses structural requirements with XML Schema, and expresses content constraints using Schematron. This allows thorough validation using off-the-shelf tools, a substantial improvement over previous PDS versions. PDS4 was also designed to improve discoverability of products (resources) in a PDS archive, and these additions allow more uniform metadata harvesting from the collection level down to the product level. New tools and services are being deployed that depend on the data adhering to the PDS4 model. However, the PDS has been an operational archive since 1989 and has large holdings that are compliant with previous versions of the PDS information model. The challenge is to make the older data accessible and usable with the new PDS4-based tools. To provide uniform utility and access to the entire archive, the older data must be migrated to the PDS4 model. At the Planetary Plasma Interactions (PPI) Node of the PDS, we have been actively planning and preparing to migrate our legacy archive to the new PDS4 standards for several years. With the release of the PDS4 standards, we have begun the migration of our archive. In this presentation we will discuss the preparation of the data for the migration and how we are approaching this task. The presentation will consist of a series of stories describing our experiences and the best practices we have learned.
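
    The kind of off-the-shelf validation PDS4 enables can be sketched, for example, with the lxml library: structural checks against an XML Schema plus content rules expressed in Schematron. The file names below are placeholders, not actual PDS4 schema releases.

    ```python
    # Minimal sketch, assuming lxml is installed: validate an XML label against
    # an XML Schema (structure) and a Schematron file (content constraints).
    from lxml import etree, isoschematron

    label = etree.parse("product_observational.xml")           # placeholder label

    schema = etree.XMLSchema(etree.parse("pds4_schema.xsd"))   # placeholder schema
    print("schema valid:", schema.validate(label))

    sch = isoschematron.Schematron(etree.parse("pds4_rules.sch"))  # placeholder rules
    print("schematron valid:", sch.validate(label))
    ```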

  19. Computerization of the Archivo General de Indias: Strategies and Results, Part I.

    ERIC Educational Resources Information Center

    Gonzalez, Pedro

    1998-01-01

    Discusses the digital reformatting project of the Archivo General de Indias (AGI) in Seville, Spain; the goal was to design, develop, and implement an automated data system capable of integrated management of common functions of a historical archive. Describes the general system architecture, and the design and attributes of the information and…

  20. 33 CFR 169.15 - Incorporation by reference: Where can I get a copy of the publications mentioned in this part?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... inspection at the National Archives and Records Administration (NARA). For information on the availability of... Radio Equipment Forming Part of the Global Maritime Distress and Safety System (GMDSS) and for...

  1. IRIS Toxicological Review of Halogenated Platinum Salts and Platinum Compounds (External Review Draft)

    EPA Science Inventory

    EPA released the draft toxicological review for public comment under the Integrated Risk Information System (IRIS) Program. The original draft assessment (January 2009) has been archived but is available on this web page for the sake of transparency.

  2. ARES - A New Airborne Reflective Emissive Spectrometer

    DTIC Science & Technology

    2005-10-01

    Information and Management System (DIMS), an automated processing environment with robot archive interface as established for the handling of satellite data...consisting of geocoded ground reflectance data. All described processing steps will be integrated in the automated processing environment DIMS to assure a

  3. Using external data sources to improve audit trail analysis.

    PubMed

    Herting, R L; Asaro, P V; Roth, A C; Barnes, M R

    1999-01-01

    Audit trail analysis is the primary means of detecting inappropriate use of the medical record. While audit logs contain large amounts of information, the information required to determine useful user-patient relationships is often not present, because most audit trail analysis systems rely on the limited information available within the medical record system. We report a feature of the STAR (System for Text Archive and Retrieval) audit analysis system in which information available in the medical record is augmented with external information sources such as databases, Lightweight Directory Access Protocol (LDAP) servers, and World Wide Web (WWW) databases. We discuss several issues that arise when combining the information from each of these disparate information sources. Furthermore, we explain how the enhanced person-specific information obtained can be used to determine user-patient relationships that might signify a motive for inappropriately accessing a patient's medical record.
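
    A toy sketch of augmenting an audit-log entry with directory information over LDAP is shown below, assuming the ldap3 library; the server address, base DN, and attribute names are hypothetical, and this is not the STAR system's code.

    ```python
    # Look up directory attributes for the user named in an audit-log entry, so a
    # reviewer can check whether the user's department plausibly relates to the
    # patient whose record was accessed. All names below are hypothetical.
    from ldap3 import Server, Connection, ALL

    def lookup_user(user_id):
        server = Server("ldap.example-hospital.org", get_info=ALL)
        with Connection(server, auto_bind=True) as conn:
            conn.search(
                "ou=people,dc=example-hospital,dc=org",
                f"(uid={user_id})",
                attributes=["cn", "departmentNumber", "title"],
            )
            return conn.entries[0] if conn.entries else None

    entry = lookup_user("jdoe")
    if entry is not None:
        print(entry.departmentNumber, entry.title)
    ```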

  4. Enhancement of real-time EPICS IOC PV management for the data archiving system

    NASA Astrophysics Data System (ADS)

    Kim, Jae-Ha

    2015-10-01

    For the operation of a 100-MeV linear proton accelerator, the major driving values and experimental data need to be archived. Different data are required depending on the experimental conditions, so functions that can add and delete data in real time need to be implemented. In an Experimental Physics and Industrial Control System (EPICS) input output controller (IOC), the values of process variables (PVs) are matched with the driving values and data. The PV values are archived in text file format by using the Channel Archiver. There is no need to create a database (DB) server, just a need for a large hard disk. Through the web, the archived data can be loaded, and new PV values can be archived without stopping the archive engine. The details of the implementation of a data archiving system with the Channel Archiver are presented, and some preliminary results are reported.
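
    A minimal sketch of periodically sampling PV values and appending them to a text file is shown below, using the pyepics client library rather than the Channel Archiver itself; the PV names are hypothetical.

    ```python
    # Sample a few EPICS PVs once per second and append timestamped values to a
    # plain text file, roughly mirroring the text-file archiving described above.
    import time
    import epics

    PVS = ["LINAC:BEAM:ENERGY", "LINAC:RF:PHASE"]   # hypothetical PV names

    with open("archive.txt", "a") as log:
        for _ in range(10):                         # sample 10 times for the example
            values = [epics.caget(pv) for pv in PVS]
            log.write("%f %s\n" % (time.time(), " ".join(str(v) for v in values)))
            log.flush()
            time.sleep(1.0)
    ```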

  5. Digital imaging and electronic patient records in pathology using an integrated department information system with PACS.

    PubMed

    Kalinski, Thomas; Hofmann, Harald; Franke, Dagmar-Sybilla; Roessner, Albert

    2002-01-01

    Picture archiving and communication systems (PACS) have thus far been widely used in radiology. Owing to the progress made in digital photo technology, their use in medicine opens up further opportunities. In the field of pathology, digital imaging offers new possibilities for the documentation of macroscopic and microscopic findings. Digital imaging has the advantage that the data are permanently and readily available, independent of conventional archives. In the past, PACS was a separate entity; it has since been integrated with the DIS, the department information system, which also used to be run separately. The combination of these two systems makes the administration of patient data, findings, and images easier. Moreover, thanks to the introduction of special communication standards, data exchange between different department information systems and hospital information systems (HIS) is possible. This provides the basis for a communication platform in medicine, constituting an electronic patient record (EPR) that permits interdisciplinary treatment of patients by providing findings and image data from clinics treating the same patient. As the pathologic diagnosis represents a central and often therapy-determining component, it is of utmost importance to add pathologic diagnoses to the EPR. Furthermore, the pathologist's work is considerably facilitated when he or she is able to retrieve additional data from the patient file. In this article, we describe our experience gained with the combined PACS and DIS recently installed at the Department of Pathology, University of Magdeburg. Moreover, we evaluate the current situation and future prospects for PACS in pathology.

  6. The European Radiobiology Archives (ERA)--content, structure and use illustrated by an example.

    PubMed

    Gerber, G B; Wick, R R; Kellerer, A M; Hopewell, J W; Di Majo, V; Dudoignon, N; Gössner, W; Stather, J

    2006-01-01

    The European Radiobiology Archives (ERA), supported by the European Commission and the European Late Effect Project Group (EULEP), together with the US National Radiobiology Archives (NRA) and the Japanese Radiobiology Archives (JRA) have collected all information still available on long-term animal experiments, including some selected human studies. The archives consist of a database in Microsoft Access, a website, databases of references and information on the use of the database. At present, the archives contain a description of the exposure conditions, animal strains, etc. from approximately 350,000 individuals; data on survival and pathology are available from approximately 200,000 individuals. Care has been taken to render pathological diagnoses compatible among different studies and to allow the lumping of pathological diagnoses into more general classes. 'Forms' in Access with an underlying computer code facilitate the use of the database. This paper describes the structure and content of the archives and illustrates an example for a possible analysis of such data.

  7. [Information production by scientists and the history of science: typological study of personal archives].

    PubMed

    Silva, Maria Celina Soares de Mello E; Trancoso, Márcia Cristina Duarte

    2015-01-01

    This article addresses the study of document typology in the personal archives of scientists and its importance for history of science studies and for the archivist's work. A brief history is presented, from diplomatics to document typology, emphasizing that identifying the activity that produced a document is essential for its classification. The article illustrates the characteristics of personal archives as regards the diversity of documental types and, in particular, those belonging to physicists. Furthermore, it presents five documental types found in the archives of physicists as examples of research in progress. It also highlights the elaboration of a glossary of the different documental kinds and types found in the private archives held by the Museum of Astronomy and Related Sciences in Rio de Janeiro.

  8. U.S. Geological Survey archived data recovery in Texas, 2008-11

    USGS Publications Warehouse

    Wehmeyer, Loren L.; Reece, Brian D.

    2011-01-01

    The 2008–11 data rescue and recovery efforts by the U.S. Geological Survey (USGS) Texas Water Science Center resulted in an efficient workflow process, database, and Web user interface for scientists and citizens to access archived environmental information with practical applications. Much of this information is unique and has never been readily available to the public. The methods developed and lessons learned during this effort are now being applied to facilitate recovering archived information requested by USGS scientists, cooperators, and the general public.

  9. The design of a petabyte archive and distribution system for the NASA ECS project

    NASA Technical Reports Server (NTRS)

    Caulk, Parris M.

    1994-01-01

    The NASA EOS Data and Information System (EOSDIS) Core System (ECS) will contain one of the largest data management systems ever built - the ECS Science and Data Processing System (SDPS). SDPS is designed to support long-term Global Change Research by acquiring, producing, and storing earth science data, and by providing efficient means for accessing and manipulating that data. The first two releases of SDPS, Release A and Release B, will be operational in 1997 and 1998, respectively. Release B will be deployed at eight Distributed Active Archive Centers (DAACs). Individual DAACs will archive different collections of earth science data and will vary in archive capacity. The storage and management of these data collections is the responsibility of the SDPS Data Server subsystem. It is anticipated that by the year 2001, the Data Server subsystem at the Goddard DAAC must support a near-line data storage capacity of one petabyte. The development of SDPS is a system integration effort in which COTS products will be used in favor of custom components in every possible way. Some software and hardware capabilities required to meet ECS data volume and storage management requirements beyond 1999 are not yet supported by available COTS products. The ECS project will not undertake major custom development efforts to provide these capabilities. Instead, SDPS and its Data Server subsystem are designed to support initial implementations with current products and provide an evolutionary framework that facilitates the introduction of advanced COTS products as they become available. This paper provides a high-level description of the Data Server subsystem design from a COTS integration standpoint and discusses some of the major issues driving the design. The paper focuses on features of the design that will make the system scalable and adaptable to changing technologies.

  10. IFLA General Conference, 1984. Management and Technology Division. Section on Information Technology and Joint Meeting of the Round Table Audiovisual Media, the International Association for Sound Archives, and the International Association for Music Libraries. Papers.

    ERIC Educational Resources Information Center

    International Federation of Library Associations, The Hague (Netherlands).

    Six papers on information technology, the development of information systems for Third World countries, handling of sound recordings, and library automation were presented at the 1984 IFLA conference. They include: (1) "Handling, Storage and Preservation of Sound Recordings under Tropical and Subtropical Climatic Conditions" (Dietrich…

  11. The clinical information system GastroBase: integration of image processing and laboratory communication.

    PubMed

    Kocna, P

    1995-01-01

    GastroBase, a clinical information system, incorporates patient identification, medical records, images, laboratory data, patient history, physical examination, and other patient-related information. Program modules are written in C; all data are processed using the Novell Btrieve data manager. The patient identification database forms the core of the information system. A graphics library developed in the past year, together with graphics modules and a special video card, enables the storing, archiving, and linking of different images to the electronic patient medical record. GastroBase has been in daily routine use for more than four years, and the database contains more than 25,000 medical records and 1,500 images. This new version of GastroBase is now incorporated into the clinical information system of the University Clinic in Prague.

  12. Scholarly Communication and Information Technology: Exploring the Impact of Changes in the Research Process on Archives. Rand Reprints.

    ERIC Educational Resources Information Center

    Michelson, Avra; Rothenberg, Jeff

    1993-01-01

    The report considers the interaction of trends in information technology and trends in research practices and the policy implications for archives. The information is divided into 4 sections. The first section, an "Overview of Information Technology Trends," discusses end-user computing, which includes ubiquitous computing, end-user…

  13. The NASA Planetary Data System Roadmap Study for 2017 - 2026

    NASA Astrophysics Data System (ADS)

    McNutt, R. L., Jr.; Gaddis, L. R.; Law, E.; Beyer, R. A.; Crombie, M. K.; Ebel, D. S. S.; Ghosh, A.; Grayzeck, E.; Morgan, T. H.; Paganelli, F.; Raugh, A.; Stein, T.; Tiscareno, M. S.; Weber, R. C.; Banks, M.; Powell, K.

    2017-12-01

    NASA's Planetary Data System (PDS) is the formal archive of >1.2 petabytes of data from planetary exploration, science, and research. Initiated in 1989 to address an overall lack of attention to mission data documentation, access, and archiving, the PDS has evolved into an online collection of digital data managed and served by a federation of six science discipline nodes and two technical support nodes. Several ad hoc mission-oriented data nodes also provide complex data interfaces and access for the duration of their missions. The recent Planetary Data System Roadmap Study for 2017 to 2026 involved 15 planetary science community members who collectively prepared a report summarizing the results of an intensive examination of the current state of the PDS and its organization, management, practices, and data holdings (https://pds.jpl.nasa.gov/roadmap/PlanetaryDataSystemRMS17-26_20jun17.pdf). The report summarizes the history of the PDS, its functions and characteristics, and how it has evolved to its present form; also included are extensive references and documentary appendices. The report recognizes that as a complex, evolving, archive system, the PDS must constantly respond to new pressures and opportunities. The report provides details on the challenges now facing the PDS, 19 detailed findings, suggested remediations, and a summary of what the future may hold for planetary data archiving. The findings cover topics such as user needs and expectations, data usability and discoverability (i.e., metadata, data access, documentation, and training), tools and file formats, use of current information technologies, and responses to increases in data volume, variety, complexity, and number of data providers. In addition, the study addresses the possibility of archiving software, laboratory data, and measurements of physical samples. Finally, the report discusses the current structure and governance of the PDS and its impact on how archive growth, technology, and new developments are enabled and managed within the PDS. The report, with its findings, acknowledges the ongoing and expected challenges to be faced in the future, the need for maintaining an edge in the use of emerging technologies, and represents a guide for evolution of the PDS for the next decade.

  14. How Do Students Organize Personal Information Spaces?

    ERIC Educational Resources Information Center

    Hardof-Jaffe, Sharon; Hershkovitz, Arnon; Abu-Kishk, Hama; Bergman, Ofer; Nachmias, Rafi

    2009-01-01

    The purpose of this study is to empirically reveal strategies of students' organization of learning-related digital materials within an online personal information archive. Research population included 518 students who utilized the personal Web space allocated to them on the university servers for archiving information items, and data describing…

  15. New Directions in Library and Information Science Education. Final Report. Volume 2.9: Archivist/Museum Professional Competencies.

    ERIC Educational Resources Information Center

    Griffiths, Jose-Marie; And Others

    This document contains validated activities and competencies needed by information professionals working in an archive or museum. The activities and competencies are organized according to the functions which information professionals in archives or museums perform: acquisitions; cataloging/indexing; reference; exhibit management; and…

  16. NASA's Earth Science Data Systems - Lessons Learned and Future Directions

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.

    2010-01-01

    In order to meet the increasing demand for Earth Science data, NASA has significantly improved its Earth Science Data Systems over the last two decades. This improvement is reviewed in this slide presentation. Many Earth Science disciplines have been able to access the data held in the Earth Observing System (EOS) Data and Information System (EOSDIS) at the Distributed Active Archive Centers (DAACs), which form the core of the data system.

  17. Ames Life Science Data Archive: Translational Rodent Research at Ames

    NASA Technical Reports Server (NTRS)

    Wood, Alan E.; French, Alison J.; Ngaotheppitak, Ratana; Leung, Dorothy M.; Vargas, Roxana S.; Maese, Chris; Stewart, Helen

    2014-01-01

    The Life Science Data Archive (LSDA) office at Ames is responsible for collecting, curating, distributing, and maintaining information pertaining to animal and plant experiments conducted in low earth orbit aboard various space vehicles from 1965 to the present. The LSDA will soon be archiving data and tissue samples collected on the next generation of commercial vehicles, e.g., the SpaceX and Cygnus commercial cargo craft. To date, over 375 rodent flight experiments with translational application have been archived by the Ames LSDA office. This knowledge base of fundamental research can be used to understand mechanisms that affect higher organisms in microgravity and help define additional research whose results could lead the way to closing gaps identified by the Human Research Program (HRP). This poster will highlight Ames' contribution to the existing knowledge base and how the LSDA can be a resource to help answer the questions surrounding human health in long-duration space exploration. In addition, it will illustrate how this body of knowledge was utilized to further our understanding of how space flight affects the human system and the ability to develop countermeasures that negate the deleterious effects of space flight. The Ames Life Sciences Data Archive (ALSDA) includes current descriptions of over 700 experiments conducted aboard the Shuttle, International Space Station (ISS), NASA/MIR, Bion/Cosmos, Gemini, Biosatellites, Apollo, Skylab, Russian Foton, and ground bed rest studies. Research areas cover Behavior and Performance, Bone and Calcium Physiology, Cardiovascular Physiology, Cell and Molecular Biology, Chronobiology, Developmental Biology, Endocrinology, Environmental Monitoring, Gastrointestinal Physiology, Hematology, Immunology, Life Support System, Metabolism and Nutrition, Microbiology, Muscle Physiology, Neurophysiology, Pharmacology, Plant Biology, Pulmonary Physiology, Radiation Biology, Renal, Fluid and Electrolyte Physiology, and Toxicology. These experiment descriptions and data can be accessed online via the public LSDA website (http://lsda.jsc.nasa.gov), and information can be requested via the Data Request form at http://lsda.jsc.nasa.gov/common/dataRequest/dataRequest.aspx or by contacting the ALSDA Office at: Alison.J.French@nasa.gov

  18. 32 CFR 2001.34 - Referrals.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... actions and decisions in a manner that facilitates archival processing for public access. Central agency... Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Declassification § 2001.34...

  19. Announcing a Community Effort to Create an Information Model for Research Software Archives

    NASA Astrophysics Data System (ADS)

    Million, C.; Brazier, A.; King, T.; Hayes, A.

    2018-04-01

    An effort has started to create recommendations and standards for the archiving of planetary science research software. The primary goal is to define an information model that is consistent with OAIS standards.

  20. Informatics in radiology: RADTF: a semantic search-enabled, natural language processor-generated radiology teaching file.

    PubMed

    Do, Bao H; Wu, Andrew; Biswal, Sandip; Kamaya, Aya; Rubin, Daniel L

    2010-11-01

    Storing and retrieving radiology cases is an important activity for education and clinical research, but this process can be time-consuming. In the process of structuring reports and images into organized teaching files, incidental pathologic conditions not pertinent to the primary teaching point can be omitted, as when a user saves images of an aortic dissection case but disregards the incidental osteoid osteoma. An alternate strategy for identifying teaching cases is text search of reports in radiology information systems (RIS), but retrieved reports are unstructured, teaching-related content is not highlighted, and patient identifying information is not removed. Furthermore, searching unstructured reports requires sophisticated retrieval methods to achieve useful results. An open-source, RadLex®-compatible teaching file solution called RADTF, which uses natural language processing (NLP) methods to process radiology reports, was developed to create a searchable teaching resource from the RIS and the picture archiving and communication system (PACS). The NLP system extracts and de-identifies teaching-relevant statements from full reports to generate a stand-alone database, thus converting existing RIS archives into an on-demand source of teaching material. Using RADTF, the authors generated a semantic search-enabled, Web-based radiology archive containing over 700,000 cases with millions of images. RADTF combines a compact representation of the teaching-relevant content in radiology reports and a versatile search engine with the scale of the entire RIS-PACS collection of case material. ©RSNA, 2010
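
    The de-identification step can be illustrated with a toy pattern-substitution sketch; RADTF itself uses a full NLP pipeline, and the patterns and report text below are invented for the example.

    ```python
    # Toy de-identification: replace obvious identifiers (record numbers, dates,
    # clinician names) with tokens before report text is added to a teaching file.
    import re

    PATTERNS = [
        (re.compile(r"\bMRN[:#]?\s*\d{6,10}\b", re.I), "[MRN]"),
        (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
        (re.compile(r"\b(?:Dr|Mr|Mrs|Ms)\.\s+[A-Z][a-z]+\b"), "[NAME]"),
    ]

    def deidentify(text):
        for pattern, token in PATTERNS:
            text = pattern.sub(token, text)
        return text

    report = "MRN: 00123456. Seen by Dr. Smith on 3/14/2009 for flank pain."
    print(deidentify(report))
    # -> "[MRN]. Seen by [NAME] on [DATE] for flank pain."
    ```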

  1. ACE: A distributed system to manage large data archives

    NASA Technical Reports Server (NTRS)

    Daily, Mike I.; Allen, Frank W.

    1993-01-01

    Competitive pressures in the oil and gas industry are requiring a much tighter integration of technical data into E and P business processes. The development of new systems to accommodate this business need must handle the significant numbers of large, complex data objects which the industry generates. The life cycle of the data objects is a four-phase progression from data acquisition, to data processing, through data interpretation, and ending finally with data archival. In order to implement a cost-effective system which provides an efficient conversion from data to information and allows effective use of this information, an organization must consider the technical data management requirements in all four phases. A set of technical issues which may differ in each phase must be addressed to ensure an overall successful development strategy. The technical issues include standardized data formats and media for data acquisition, data management during processing, plus networks, applications software, and GUIs for interpretation of the processed data. Mass storage hardware and software are required to provide cost-effective storage and retrieval during the latter three stages as well as long-term archival. Mobil Oil Corporation's Exploration and Producing Technical Center (MEPTEC) has addressed the technical and cost issues of designing, building, and implementing an Advanced Computing Environment (ACE) to support the petroleum E and P function, which is critical to the corporation's continued success. Mobil views ACE as a cost-effective solution which can give Mobil a competitive edge as well as a viable technical solution.

  2. A Socio-technical assessment of the success of picture archiving and communication systems: the radiology technologist’s perspective

    PubMed Central

    2013-01-01

    Background With the increasing prevalence of Picture Archiving and Communication Systems (PACS) in healthcare institutions, there is a growing need to measure their success. However, there is a lack of published literature emphasizing the technical and social factors underlying a successful PACS. Methods An updated Information Systems Success Model was utilized by radiology technologists (RTs) to evaluate the success of PACS at a large medical center in Taiwan. A survey, consisting of 109 questionnaires, was analyzed by Structural Equation Modeling. Results Socio-technical factors (including system quality, information quality, service quality, perceived usefulness, user satisfaction, and PACS dependence) were proven to be effective measures of PACS success. Although the relationship between service quality and perceived usefulness was not significant, other proposed relationships amongst the six measurement parameters of success were all confirmed. Conclusions Managers have an obligation to improve the attributes of PACS. At the onset of its deployment, RTs will have formed their own subjective opinions with regards to its quality (system quality, information quality, and service quality). As these personal concepts are either refuted or reinforced based on personal experiences, RTs will become either satisfied or dissatisfied with PACS, based on their perception of its usefulness or lack of usefulness. A satisfied RT may play a pivotal role in the implementation of PACS in the future. PMID:24053458

  3. Semi-automated Data Set Submission Work Flow for Archival with the ORNL DAAC

    NASA Astrophysics Data System (ADS)

    Wright, D.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Eby, P.; Heinz, S. L.; Hook, L. A.; McMurry, B. F.; Shanafield, H. A.; Sill, D.; Santhana Vannan, S.; Wei, Y.

    2013-12-01

    The ORNL DAAC archives and publishes, free of charge, data and information relevant to biogeochemical, ecological, and environmental processes. The ORNL DAAC primarily archives data produced by NASA's Terrestrial Ecology Program; however, any data that are pertinent to the biogeochemical and ecological community are of interest. The data set submission process to the ORNL DAAC has recently been updated and semi-automated to provide a consistent data provider experience and to create a uniform data product. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. If the ORNL DAAC is the appropriate archive for a data set, the data provider will be sent an email with several URL links to guide them through the submission process. The data provider will be asked to fill out a short online form to help the ORNL DAAC staff better understand the data set. These questions cover information about the data set, a description of the data set, temporal and spatial characteristics of the data set, and how the data were prepared and delivered. The questionnaire is generic and has been designed to gather input on the diverse data sets the ORNL DAAC archives. A data upload module and metadata editor further guide the data provider through the submission process. For submission purposes, a complete data set includes data files, document(s) describing the data, supplemental files, metadata record(s), and the online form. The ORNL DAAC performs five major functions during the process of archiving data: 1) Ingestion is the ORNL DAAC side of submission; data are checked, metadata records are compiled, and files are converted to archival formats. 2) Metadata records and data set documentation are made searchable, and the data set is given a permanent URL. 3) The data set is published, assigned a DOI, and advertised. 4) The data set is given long-term post-project support. 5) Stewardship ensures the data are stored on state-of-the-art computer systems with reliable backups.

  4. Between a Map and a Data Rod

    NASA Technical Reports Server (NTRS)

    Teng, William; Rui, Hualan; Strub, Richard; Vollmer, Bruce

    2015-01-01

    A Digital Divide has long stood between how NASA and other satellite-derived data are typically archived (time-step arrays or maps) and how hydrology and other point-time series oriented communities prefer to access those data. In essence, the desired method of data access is orthogonal to the way the data are archived. Our approach to bridging the Divide is part of a larger NASA-supported data rods project to enhance access to and use of NASA and other data by the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS) and the larger hydrology community. Our main objective was to determine a way to reorganize data that is optimal for these communities. Two related objectives were to optimally reorganize data in a way that (1) is operational and fits in and leverages the existing Goddard Earth Sciences Data and Information Services Center (GES DISC) operational environment and (2) addresses the scaling up of data sets available as time series from those archived at the GES DISC to potentially include those from other Earth Observing System Data and Information System (EOSDIS) data archives. Through several prototype efforts and lessons learned, we arrived at a non-database solution that satisfied our objectives/constraints. We describe, in this presentation, how we implemented the operational production of pre-generated data rods and, considering the tradeoffs between length of time series (or number of time steps), resources needed, and performance, how we implemented the operational production of on-the-fly (virtual) data rods. For the virtual data rods, we leveraged a number of existing resources, including the NASA Giovanni Cache and NetCDF Operators (NCO), and used data cubes processed in parallel. Our current benchmark performance for virtual generation of data rods is about a year's worth of time series for hourly data (about 9,000 time steps) in about 90 seconds. Our approach is a specific implementation of the general optimal strategy of reorganizing data to match the desired means of access. Results from our project have already significantly extended NASA data to the large and important hydrology user community that has been, heretofore, mostly unable to easily access and use NASA data.

  5. Between a Map and a Data Rod

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Rui, H.; Strub, R. F.; Vollmer, B.

    2015-12-01

    A "Digital Divide" has long stood between how NASA and other satellite-derived data are typically archived (time-step arrays or "maps") and how hydrology and other point-time series oriented communities prefer to access those data. In essence, the desired method of data access is orthogonal to the way the data are archived. Our approach to bridging the Divide is part of a larger NASA-supported "data rods" project to enhance access to and use of NASA and other data by the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS) and the larger hydrology community. Our main objective was to determine a way to reorganize data that is optimal for these communities. Two related objectives were to optimally reorganize data in a way that (1) is operational and fits in and leverages the existing Goddard Earth Sciences Data and Information Services Center (GES DISC) operational environment and (2) addresses the scaling up of data sets available as time series from those archived at the GES DISC to potentially include those from other Earth Observing System Data and Information System (EOSDIS) data archives. Through several prototype efforts and lessons learned, we arrived at a non-database solution that satisfied our objectives/constraints. We describe, in this presentation, how we implemented the operational production of pre-generated data rods and, considering the tradeoffs between length of time series (or number of time steps), resources needed, and performance, how we implemented the operational production of on-the-fly ("virtual") data rods. For the virtual data rods, we leveraged a number of existing resources, including the NASA Giovanni Cache and NetCDF Operators (NCO) and used data cubes processed in parallel. Our current benchmark performance for virtual generation of data rods is about a year's worth of time series for hourly data (~9,000 time steps) in ~90 seconds. Our approach is a specific implementation of the general optimal strategy of reorganizing data to match the desired means of access. Results from our project have already significantly extended NASA data to the large and important hydrology user community that has been, heretofore, mostly unable to easily access and use NASA data.
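
    The underlying idea of a data rod, i.e. turning a stack of time-step grids into a time series at one point, can be sketched with xarray as below; this is not the GES DISC production code, and the file pattern, variable name, and coordinate names are placeholders.

    ```python
    # Open a collection of per-time-step grid files as one dataset, then pull the
    # full time series at the grid cell nearest a point of interest.
    import xarray as xr

    # Each file holds one (or a few) hourly time steps over a lat/lon grid.
    ds = xr.open_mfdataset("hourly_grids_*.nc", combine="by_coords")

    # Extract the time series at the grid cell nearest a gauge location.
    rod = ds["precipitation"].sel(lat=38.99, lon=-76.94, method="nearest")
    rod.to_dataframe().to_csv("precip_time_series.csv")
    ```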

  6. Building a virtual archive using brain architecture and Web 3D to deliver neuropsychopharmacology content over the Internet.

    PubMed

    Mongeau, R; Casu, M A; Pani, L; Pillolla, G; Lianas, L; Giachetti, A

    2008-05-01

    The vast amount of heterogeneous data generated in various fields of neurosciences such as neuropsychopharmacology can hardly be classified using traditional databases. We present here the concept of a virtual archive, spatially referenced over a simplified 3D brain map and accessible over the Internet. A simple prototype (available at http://aquatics.crs4.it/neuropsydat3d) has been realized using current Web-based virtual reality standards and technologies. It illustrates how primary literature or summary information can easily be retrieved through hyperlinks mapped onto a 3D schema while navigating through neuroanatomy. Furthermore, 3D navigation and visualization techniques are used to enhance the representation of brain's neurotransmitters, pathways and the involvement of specific brain areas in any particular physiological or behavioral functions. The system proposed shows how the use of a schematic spatial organization of data, widely exploited in other fields (e.g. Geographical Information Systems) can be extremely useful to develop efficient tools for research and teaching in neurosciences.

  7. Archive of Boomer seismic reflection data: collected during USGS Cruise 96CCT01, nearshore south central South Carolina coast, June 26 - July 1, 1996

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Wiese, Dana S.

    2003-01-01

    This archive consists of marine seismic reflection profile data collected in four survey areas from southeast of Charleston Harbor to the mouth of the North Edisto River of South Carolina. These data were acquired June 26 - July 1, 1996, aboard the R/V G.K. Gilbert. Included here are data in a variety of formats including binary, American Standard Code for Information Interchange (ASCII), Hyper Text Markup Language (HTML), Portable Document Format (PDF), Rich Text Format (RTF), Graphics Interchange Format (GIF) and Joint Photographic Experts Group (JPEG) images, and shapefiles. Binary data are in Society of Exploration Geophysicists (SEG) SEG-Y format and may be downloaded for further processing or display. Reference maps and GIF images of the profiles may be viewed with a web browser. The Geographic Information Systems (GIS) map documents provided were created with Environmental Systems Research Institute (ESRI) GIS software ArcView 3.2 and 8.1.
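
    For readers who want to inspect SEG-Y files like those in this archive, the sketch below reads the 3200-byte textual header and two standard binary-header fields using only the Python standard library. The file name is hypothetical; the byte offsets follow the standard SEG-Y layout.

      # Illustrative SEG-Y header inspection; not part of the USGS archive itself.
      import struct

      def read_segy_headers(path):
          with open(path, "rb") as f:
              textual = f.read(3200)            # EBCDIC (sometimes ASCII) card images
              binary = f.read(400)              # binary file header
          try:
              text = textual.decode("cp037")    # EBCDIC code page
          except UnicodeDecodeError:
              text = textual.decode("ascii", errors="replace")
          # bytes 3217-3218 and 3221-3222 of the file (big-endian 16-bit integers)
          sample_interval_us = struct.unpack(">h", binary[16:18])[0]
          samples_per_trace = struct.unpack(">h", binary[20:22])[0]
          return text, sample_interval_us, samples_per_trace

      if __name__ == "__main__":
          text, dt, ns = read_segy_headers("96CCT01_line01.sgy")   # hypothetical file
          print(text[:80])
          print("sample interval (microseconds):", dt, "samples per trace:", ns)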

  8. Spanish historical sources to reconstruct climate in the Americas during the XIXth Century

    NASA Astrophysics Data System (ADS)

    García-Herrera, R.; Rubio, F.; Prieto, M.; Hernández, E.; Gimeno, L.

    2001-12-01

    The Spanish colonization of the Americas expanded from the beginning of the XVIth century until the beginning of the XIXth century, when most of the colonies became independent. During this period, a large amount of documentary information was produced, because the Spanish Empire was highly centralized and bureaucracy was one of its core elements. Most of these documents are well preserved either in local archives in the Americas or in the Archivo General de Indias in Sevilla, which holds thousands of bundles relating to every relevant aspect of the ordinary life of the colonies. Different projects are now searching for climatic information in this archive with very encouraging results. During the XIXth century Spain kept two colonies in the Americas, Cuba and Puerto Rico, which became independent in 1898. As a result, a great deal of information from this period survives in Spanish archives. After a preliminary inspection of different Spanish archives, the Archivo General de Indias, Archivo del Museo Naval and Archivo Histórico Nacional (General Archive of the Indies, Archive of the Naval Museum and National Historic Archive), it has been possible to identify two main areas of climatic interest: 1) information from ship logbooks connecting Spain with Cuba and Puerto Rico and 2) reports about hurricanes. The information contained in the ship logbooks is very rich and could help to better characterize elements of the large-scale circulation in the Atlantic; the reports on hurricanes can be very detailed and were prepared by very skilled personnel. The presentation will provide different examples of the potential of these sources and describe the Spanish projects involved in the abstraction of this type of data.

  9. Information Requirements for Integrating Spatially Discrete, Feature-Based Earth Observations

    NASA Astrophysics Data System (ADS)

    Horsburgh, J. S.; Aufdenkampe, A. K.; Lehnert, K. A.; Mayorga, E.; Hsu, L.; Song, L.; Zaslavsky, I.; Valentine, D. L.

    2014-12-01

    Several cyberinfrastructures have emerged for sharing observational data collected at densely sampled and/or highly instrumented field sites. These include the CUAHSI Hydrologic Information System (HIS), the Critical Zone Observatory Integrated Data Management System (CZOData), the Integrated Earth Data Applications (IEDA) and EarthChem system, and the Integrated Ocean Observing System (IOOS). These systems rely on standard data encodings and, in some cases, standard semantics for classes of geoscience data. Their focus is on sharing data on the Internet via web services in domain specific encodings or markup languages. While they have made progress in making data available, it still takes investigators significant effort to discover and access datasets from multiple repositories because of inconsistencies in the way domain systems describe, encode, and share data. Yet, there are many scenarios that require efficient integration of these data types across different domains. For example, understanding a soil profile's geochemical response to extreme weather events requires integration of hydrologic and atmospheric time series with geochemical data from soil samples collected over various depth intervals from soil cores or pits at different positions on a landscape. Integrated access to and analysis of data for such studies are hindered because common characteristics of data, including time, location, provenance, methods, and units are described differently within different systems. Integration requires syntactic and semantic translations that can be manual, error-prone, and lossy. We report information requirements identified as part of our work to define an information model for a broad class of earth science data - i.e., spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples. We sought to answer the question: "What information must accompany observational data for them to be archivable and discoverable within a publication system as well as interpretable once retrieved from such a system for analysis and (re)use?" We also describe development of multiple functional schemas (i.e., physical implementations for data storage, transfer, and archival) for the information model that capture the requirements reported here.
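
    A minimal sketch of the kind of information the authors argue must accompany each observation (time, location, units, method, and provenance) is given below. The class and field names are illustrative only, not the published information model.

      # Illustrative data model for a spatially discrete, feature-based observation.
      from dataclasses import dataclass, field
      from datetime import datetime
      from typing import Optional

      @dataclass
      class SamplingFeature:
          code: str                  # e.g., a site, soil pit, or specimen identifier
          feature_type: str          # "Site", "SoilPit", "Specimen", ...
          latitude: float
          longitude: float
          elevation_m: Optional[float] = None

      @dataclass
      class Observation:
          feature: SamplingFeature
          observed_property: str     # what was measured, ideally from a vocabulary
          value: float
          unit: str                  # units travel with the record, never implied
          timestamp: datetime
          method: str                # sensor or laboratory method description
          provenance: dict = field(default_factory=dict)  # who, processing level, etc.

      obs = Observation(
          feature=SamplingFeature("SOIL-PIT-07", "SoilPit", 41.36, -75.77, 412.0),
          observed_property="soil_moisture",
          value=0.23, unit="m3 m-3",
          timestamp=datetime(2014, 7, 1, 12, 0),
          method="TDR probe, 10 cm depth interval",
          provenance={"investigator": "example", "processing_level": "raw"},
      )
      print(obs.observed_property, obs.value, obs.unit)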

  10. National Space Science Data Center data archive and distribution service (NDADS) automated retrieval mail system user's guide

    NASA Technical Reports Server (NTRS)

    Perry, Charleen M.; Vansteenberg, Michael E.

    1992-01-01

    The National Space Science Data Center (NSSDC) has developed an automated data retrieval request service utilizing our Data Archive and Distribution Service (NDADS) computer system. NDADS currently has selected project data written to optical disk platters with the disks residing in a robotic 'jukebox' near-line environment. This allows for rapid and automated access to the data with no staff intervention required. There are also automated help information and user services available that can be accessed. The request system permits an average-size data request to be completed within minutes of the request being sent to NSSDC. A mail message, in the format described in this document, retrieves the data and can send it to a remote site. Also listed in this document are the data currently available.

  11. EarthExplorer

    USGS Publications Warehouse

    Houska, Treva

    2012-01-01

    The EarthExplorer trifold provides basic information for on-line access to remotely-sensed data from the U.S. Geological Survey Earth Resources Observation and Science (EROS) Center archive. The EarthExplorer (http://earthexplorer.usgs.gov/) client/server interface allows users to search and download aerial photography, satellite data, elevation data, land-cover products, and digitized maps. Minimum computer system requirements and customer service contact information also are included in the brochure.

  12. Structural and Functional Concepts in Current Mouse Phenotyping and Archiving Facilities

    PubMed Central

    Kollmus, Heike; Post, Rainer; Brielmeier, Markus; Fernández, Julia; Fuchs, Helmut; McKerlie, Colin; Montoliu, Lluis; Otaegui, Pedro J; Rebelo, Manuel; Riedesel, Hermann; Ruberte, Jesús; Sedlacek, Radislav; de Angelis, Martin Hrabě; Schughart, Klaus

    2012-01-01

    Collecting and analyzing available information on the building plans, concepts, and workflow from existing animal facilities is an essential prerequisite for most centers that are planning and designing the construction of a new animal experimental research unit. Here, we have collected and analyzed such information in the context of the European project Infrafrontier, which aims to develop a common European infrastructure for high-throughput systemic phenotyping, archiving, and dissemination of mouse models. A team of experts visited 9 research facilities and 3 commercial breeders in Europe, Canada, the United States, and Singapore. During the visits, detailed data of each facility were collected and subsequently represented in standardized floor plans and descriptive tables. These data showed that because the local needs of scientists and their projects, property issues, and national and regional laws require very specific solutions, a common strategy for the construction of such facilities does not exist. However, several basic concepts were apparent that can be described by standardized floor plans showing the principle functional units and their interconnection. Here, we provide detailed information of how individual facilities addressed their specific needs by using different concepts of connecting the principle units. Our analysis likely will be valuable to research centers that are planning to design new mouse phenotyping and archiving facilities. PMID:23043807

  13. Archival Administration in the Electronic Information Age: An Advanced Institute for Government Archivists (2nd, Pittsburgh, Pennsylvania, June 3-14, 1991).

    ERIC Educational Resources Information Center

    Pittsburgh Univ., PA. Graduate School of Library and Information Sciences.

    This report describes the first phase of an institute that was designed to provide technical information to the chief administrative officials of state archival agencies about new trends in information technology and to introduce them to management tools needed for operating in this environment. Background information on the first institute…

  14. Archive Inventory Management System (AIMS) — A Fast, Metrics Gathering Framework for Validating and Gaining Insight from Large File-Based Data Archives

    NASA Astrophysics Data System (ADS)

    Verma, R. V.

    2018-04-01

    The Archive Inventory Management System (AIMS) is a software package for understanding the distribution, characteristics, integrity, and nuances of files and directories in large file-based data archives on a continuous basis.
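
    The AIMS implementation itself is not shown here; the sketch below only illustrates the kind of continuous metrics gathering the abstract refers to, walking a file-based archive and recording per-file size, modification time, and a checksum for later integrity validation. The archive root path is a placeholder.

      # Illustrative inventory pass over a file-based archive.
      import hashlib
      import os
      from datetime import datetime, timezone

      def inventory(root):
          """Yield one metrics record per file under root."""
          for dirpath, _dirnames, filenames in os.walk(root):
              for name in filenames:
                  path = os.path.join(dirpath, name)
                  stat = os.stat(path)
                  digest = hashlib.md5()
                  with open(path, "rb") as f:
                      for chunk in iter(lambda: f.read(1 << 20), b""):
                          digest.update(chunk)
                  yield {
                      "path": path,
                      "bytes": stat.st_size,
                      "modified": datetime.fromtimestamp(stat.st_mtime, timezone.utc).isoformat(),
                      "md5": digest.hexdigest(),
                  }

      if __name__ == "__main__":
          for record in inventory("/archive/volume01"):   # hypothetical archive root
              print(record)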

  15. Abdication or Empowerment? User Involvement in Library, Archives and Records Services

    ERIC Educational Resources Information Center

    Robinson, Leith

    2007-01-01

    User involvement in information services is a contentious issue. This article explores the participation of patrons in libraries, archives and records centres. It reviews the causes of this change, and discusses the consequences for the information profession. The article notes the constants in information environments, and concludes by suggesting…

  16. Costs and Benefits of Mission Participation in PDS4 Migrations

    NASA Astrophysics Data System (ADS)

    Mafi, J. N.; King, T. A.; Cecconi, B.; Faden, J.; Piker, C.; Kazden, D. P.; Gordon, M. K.; Joy, S. P.

    2017-12-01

    The Planetary Data System, Version 4 (PDS4) Standard was a major reworking of the previous PDS3 standard. According to PDS policy, "NASA missions confirmed for flight after [1 November 2011 were] required to archive their data according to PDS4 standards." Accordingly, NASA missions starting with LADEE (launched September 2013) and MAVEN (launched November 2013) have used the PDS4 standard. However, a large legacy of previously archived NASA planetary mission data already resides in the PDS archive in PDS3 and older formats. Plans to migrate the existing PDS archives to PDS4 have been discussed within PDS for some time, and have been reemphasized in the PDS Roadmap Study for 2017 - 2026 (https://pds.nasa.gov/roadmap/PlanetaryDataSystemRMS17-26_20jun17.pdf). Updating older PDS metadata to PDS4 would enable those data to take advantage of new capabilities offered by PDS4 and ensure the full compatibility of past archives with current and future PDS4 tools and services. Responsibility for performing the migration to PDS4 falls primarily upon the PDS discipline nodes, though some support by the active (or recently active) instrument teams would be required in order to help augment the existing metadata to include information that is unique to PDS4. However, there may be some value in mission data providers becoming more actively involved in the migration process. The upfront costs of this approach may be offset by the long-term benefits of data providers' understanding of PDS4, their ability to take fuller advantage of PDS4 tools and services, and their preparation for producing PDS4 archives for future missions. This presentation will explore the costs and benefits associated with this approach.

  17. Cassini/Huygens Program Archive Plan for Science Data

    NASA Technical Reports Server (NTRS)

    Conners, D.

    2000-01-01

    The purpose of this document is to describe the Cassini/Huygens science data archive system which includes policy, roles and responsibilities, description of science and supplementary data products or data sets, metadata, documentation, software, and archive schedule and methods for archive transfer to the NASA Planetary Data System (PDS).

  18. The COROT ground-based archive and access system

    NASA Astrophysics Data System (ADS)

    Solano, E.; González-Riestra, R.; Catala, C.; Baglin, A.

    2002-01-01

    A prototype of the COROT ground-based archive and access system is presented here. The system has been developed at the Laboratorio de Astrofisica Espacial y Fisica Fundamental (LAEFF) and is based on the experience gained there with the INES (IUE Newly Extracted System) Archive.

  19. 77 FR 50532 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-21

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... Personnel Folders or Employee Medical Folders from the National Personnel Records Center (NPRC) of the...

  20. The European HST Science Data Archive. [and Data Management Facility (DMF)

    NASA Technical Reports Server (NTRS)

    Pasian, F.; Pirenne, B.; Albrecht, R.; Russo, G.

    1993-01-01

    The paper describes the European HST Science Data Archive. Particular attention is given to the flow from the HST spacecraft to the Science Data Archive at the Space Telescope European Coordinating Facility (ST-ECF); the archiving system at the ST-ECF, including the hardware and software system structure; the operations at the ST-ECF and differences with the Data Management Facility; and the current developments. A diagram of the logical structure and data flow of the system managing the European HST Science Data Archive is included.

  1. Long-term data archiving

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, David Steven

    2009-01-01

    Long term data archiving has much value for chemists, not only to retain access to research and product development records, but also to enable new developments and new discoveries. There are some recent regulatory requirements (e.g., FDA 21 CFR Part 11), but good science and good business both benefit regardless. A particular example of the benefits of and need for long term data archiving is the management of data from spectroscopic laboratory instruments. The sheer amount of spectroscopic data is increasing at a scary rate, and the pressures to archive come from the expense to create the data (or recreate it if it is lost) as well as its high information content. The goal of long-term data archiving is to save and organize instrument data files as well as any needed meta data (such as sample ID, LIMS information, operator, date, time, instrument conditions, sample type, excitation details, environmental parameters, etc.). This editorial explores the issues involved in long-term data archiving using the example of Raman spectral databases. There are at present several such databases, including common data format libraries and proprietary libraries. However, such databases and libraries should ultimately satisfy stringent criteria for long term data archiving, including readability for long times into the future, robustness to changes in computer hardware and operating systems, and use of public domain data formats. The latter criterion implies the data format should be platform independent and the tools to create the data format should be easily and publicly obtainable or developable. Several examples of attempts at spectral libraries exist, such as the ASTM ANDI format, and the JCAMP-DX format. On the other hand, proprietary library spectra can be exchanged and manipulated using proprietary tools. As the above examples have deficiencies according to the three long term data archiving criteria, Extensible Markup Language (XML; a product of the World Wide Web Consortium, an independent standards body) as a new data interchange tool is being investigated and implemented. In order to facilitate data archiving, Raman data needs calibration as well as some other kinds of data treatment. Figure 1 illustrates schematically the present situation for Raman data calibration in the world-wide Raman spectroscopy community, and presents some of the terminology used.
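
    To make the XML-based interchange idea concrete, the sketch below wraps one Raman spectrum and its metadata in a small self-describing document using the Python standard library. The element names are illustrative, not a published schema such as JCAMP-DX, and the sample values are invented.

      # Illustrative XML archive record for a spectrum plus its metadata.
      import xml.etree.ElementTree as ET

      def spectrum_to_xml(sample_id, operator, excitation_nm, shifts_cm1, counts):
          root = ET.Element("RamanSpectrum")
          meta = ET.SubElement(root, "Metadata")
          ET.SubElement(meta, "SampleID").text = sample_id
          ET.SubElement(meta, "Operator").text = operator
          ET.SubElement(meta, "ExcitationWavelength", unit="nm").text = str(excitation_nm)
          data = ET.SubElement(root, "Data", xunits="cm-1", yunits="counts")
          for x, y in zip(shifts_cm1, counts):
              ET.SubElement(data, "Point", x=str(x), y=str(y))
          return ET.tostring(root, encoding="unicode")

      print(spectrum_to_xml("LOT-042", "jdoe", 785,
                            [520.0, 521.0, 522.0], [1040, 1250, 1100]))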

  2. NASA Earth Observing System Data and Information System (EOSDIS): A U.S. Network of Data Centers Serving Earth Science Data: A Network Member of ICSU WDS

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Ramapriyan, H. K. " Rama"

    2016-01-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been in operation since August 1994, serving a diverse user community around the world with Earth science data from satellites, aircraft, field campaigns and research investigations. The ESDIS Project, responsible for EOSDIS, is a Network Member of the International Council for Science (ICSU) World Data System (WDS). Nine of the 12 Distributed Active Archive Centers (DAACs), which are part of EOSDIS, are Regular Members of the ICSU WDS. This poster presents the EOSDIS mission objectives; the key characteristics of the DAACs that make them world-class Earth science data centers; and the successes, challenges and best practices of EOSDIS focusing on the years 2014-2016, illustrated with highlights of EOSDIS accomplishments. The highlights include: high customer satisfaction, growing archive and distribution volumes, exponential growth in the number of products distributed to users around the world, a unified metadata model and common metadata repository, flexibility provided to users by supporting data transformations to suit their applications, near-real-time capabilities to support various operational and research applications, and full-resolution image browse capabilities to help users select data of interest. The poster also illustrates how the ESDIS Project is actively involved in several US and international data system organizations.

  3. Commercial applications for optical data storage

    NASA Astrophysics Data System (ADS)

    Tas, Jeroen

    1991-03-01

    Optical data storage has spurred the market for document imaging systems. These systems are increasingly being used to electronically manage the processing, storage and retrieval of documents. Applications range from straightforward archives to sophisticated workflow management systems. The technology is developing rapidly and within a few years optical imaging facilities will be incorporated in most of the office information systems. This paper gives an overview of the status of the market, the applications and the trends of optical imaging systems.

  4. 36 CFR 1256.96 - What provisions apply to the transfer of USIA audiovisual records to the National Archives of the...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... transfer of USIA audiovisual records to the National Archives of the United States? 1256.96 Section 1256.96... Information Agency Audiovisual Materials in the National Archives of the United States § 1256.96 What provisions apply to the transfer of USIA audiovisual records to the National Archives of the United States...

  5. The extreme ultraviolet explorer archive

    NASA Astrophysics Data System (ADS)

    Polomski, E.; Drake, J. J.; Dobson, C.; Christian, C.

    1993-09-01

    The Extreme Ultraviolet Explorer (EUVE) public archive was created to handle the storage, maintenance, and distribution of EUVE data and ancillary documentation, information, and software. Access to the archive became available to the public on July 17, 1992, only 40 days after the launch of the EUVE satellite. A brief overview of the archive's contents and the various methods of access is given.

  6. "If We Just Knew Who Should Do It", or the Social Organization of the Archiving of Archaeology in Sweden

    ERIC Educational Resources Information Center

    Huvila, Isto

    2016-01-01

    Introduction: This paper analyses the work practices and perspectives of professionals working with archaeological archives and the social organization of archaeological archiving and information management in Sweden. Method: The paper is based on an interview study of Swedish actors in the field of archaeological archiving (N = 16). Analysis: The…

  7. Reconstructing a School's Past Using Oral Histories and GIS Mapping.

    ERIC Educational Resources Information Center

    Alibrandi, Marsha; Beal, Candy; Thompson, Ann; Wilson, Anna

    2000-01-01

    Describes an interdisciplinary project that incorporated language arts, social studies, instructional technology, and science where middle school students were involved in oral history, Geographic Information System (GIS) mapping, architectural research, the science of dendrochronology, and the creation of an archival school Web site. (CMK)

  8. Applications of hypermedia systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lennon, J.; Maurer, H.

    1995-05-01

    In this paper, we consider several new aspects of modern hypermedia systems. The applications discussed include: (1) General Information and Communication Systems: Distributed information systems for businesses, schools and universities, museums, libraries, health systems, etc. (2) Electronic orientation and information displays: Electronic guided tours, public information kiosks, and publicity dissemination with archive facilities. (3) Lecturing: A system going beyond the traditional to empower both teachers and learners. (4) Libraries: A further step towards fully electronic library systems. (5) Directories of all kinds: Staff, telephone, and all sorts of generic directories. (6) Administration: A fully integrated system such as the one proposed will mean efficient data processing and valuable statistical data. (7) Research: Material can now be accessed from databases all around the world. The effects of networking and computer-supported collaborative work are discussed, and examples of new scientific visualization programs are quoted. The paper concludes with a section entitled "Future Directions".

  9. Sustainability Science Needs Sustainable Data!

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2013-12-01

    Sustainability science (SS) is an 'emerging field of research dealing with the interactions between natural and social systems, and with how those interactions affect the challenge of sustainability: meeting the needs of present and future generations while substantially reducing poverty and conserving the planet's life support systems' (Kates, 2011; Clark, 2007). Bettencourt & Kaur (2011) identified more than 20,000 scientific papers published on SS topics since the 1980s with more than 35,000 distinct authors. They estimated that the field is currently growing exponentially, with the number of authors doubling approximately every 8 years. These scholars are undoubtedly using and generating a vast quantity and variety of data and information for both SS research and applications. Unfortunately we know little about what data the SS community is actually using, and whether or not the data that SS scholars generate are being preserved for future use. Moreover, since much SS research is conducted by cross-disciplinary, multi-institutional teams, often scattered around the world, there could well be increased risks of data loss, reduced data quality, inadequate documentation, and poor long-term access and usability. Capabilities and processes therefore need to be established today to support continual, reliable, and efficient preservation of and access to SS data in the future, especially so that they can be reused in conjunction with future data and for new studies not conceived in the original data collection activities. Today's long-term data stewardship challenges include establishing sustainable data governance to facilitate continuing management, selecting data to ensure that limited resources are focused on high priority SS data holdings, securing sufficient rights to allow unforeseen uses, and preparing data to enable use by future communities whose specific research and information needs are not yet known. Adopting sustainable models for archival infrastructures will reduce dependencies on changing priorities and sponsorship that may not continue. Implementing community-based appraisal criteria and selection procedures for data will ensure that limited resources for long-term data management are applied efficiently to data likely to have the most enduring value. Encouraging producers to provide rights for open access to data will support their replication, reuse, integration, and application in a range of SS research and applications in both the near and long term. Identifying modest changes to current data preparation activities to meet preservation goals should reduce expensive post-hoc data and documentation rescue efforts. The NASA Socioeconomic Data and Applications Center (SEDAC), an active archive in the NASA Earth Observing System Data and Information System (EOSDIS), established the SEDAC Long-Term Archive (LTA) in collaboration with the Columbia University Libraries to preserve selected data and information resources for future access and use. A case study of the LTA shows how archives can be organized to foster sustainable data stewardship in a university environment. Lessons learned from the organization planning and the preparation, appraisal, and selection of data for the LTA are described along with enhancements that have been applied to data management by the active archive.

  10. A medical application integrating remote 3D visualization tools to access picture archiving and communication system on mobile devices.

    PubMed

    He, Longjun; Ming, Xing; Liu, Qian

    2014-04-01

    With computing capability and display size growing, the mobile device has been used as a tool to help clinicians view patient information and medical images anywhere and anytime. However, for direct interactive 3D visualization, which plays an important role in radiological diagnosis, the mobile device cannot provide a satisfactory quality of experience for radiologists. This paper describes a medical system that retrieves medical images from the picture archiving and communication system (PACS) on a mobile device over a wireless network. In the proposed application, the mobile device obtains patient information and medical images through a proxy server connected to the PACS server. Meanwhile, the proxy server integrates a range of 3D visualization techniques, including maximum intensity projection, multi-planar reconstruction and direct volume rendering, to provide shape, brightness, depth and location information generated from the original sectional images for radiologists. Furthermore, an algorithm that changes remote render parameters automatically to adapt to the network status was employed to improve the quality of experience. Finally, performance issues regarding the remote 3D visualization of medical images over the wireless network in the proposed application are also discussed. The results demonstrated that the proposed medical application can provide a smooth interactive experience over WLAN and 3G networks.
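
    The abstract does not give the adaptation algorithm, but the following sketch illustrates the general idea of tying remote render parameters to measured throughput. The thresholds, parameter names, and session structure are invented for illustration and are not the authors' method.

      # Illustrative adaptive remote-render parameter selection.
      def choose_render_params(throughput_kbps):
          """Map a measured network throughput to remote-render parameters."""
          if throughput_kbps < 300:          # roughly 3G-class conditions
              return {"width": 256, "height": 256, "jpeg_quality": 40}
          if throughput_kbps < 2000:
              return {"width": 512, "height": 512, "jpeg_quality": 60}
          return {"width": 1024, "height": 1024, "jpeg_quality": 85}  # WLAN-class

      def update_session(session, bytes_received, seconds):
          # estimate throughput from the last transfer, then adjust the session
          throughput_kbps = (bytes_received * 8 / 1000.0) / max(seconds, 1e-6)
          session.update(choose_render_params(throughput_kbps))
          return session

      session = {}
      print(update_session(session, bytes_received=150_000, seconds=1.0))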

  11. The National Center for Atmospheric Research (NCAR) Research Data Archive: a Data Education Center

    NASA Astrophysics Data System (ADS)

    Peng, G. S.; Schuster, D.

    2015-12-01

    The National Center for Atmospheric Research (NCAR) Research Data Archive (RDA), rda.ucar.edu, is not just another data center or data archive. It is a data education center. We not only serve data, we TEACH data. Weather and climate data is the original "Big Data" dataset and lessons learned while playing with weather data are applicable to a wide range of data investigations. Erroneous data assumptions are the Achilles heel of Big Data. It doesn't matter how much data you crunch if the data is not what you think it is. Each dataset archived at the RDA is assigned to a data specialist (DS) who curates the data. If a user has a question not answered in the dataset information web pages, they can call or email a skilled DS for further clarification. The RDA's diverse staff—with academic training in meteorology, oceanography, engineering (electrical, civil, ocean and database), mathematics, physics, chemistry and information science—means we likely have someone who "speaks your language." Data discovery is another difficult Big Data problem; one can only solve problems with data if one can find the right data. Metadata, both machine and human-generated, underpin the RDA data search tools. Users can quickly find datasets by name or dataset ID number. They can also perform a faceted search that successively narrows the options by user requirements, or simply kick off an indexed search with a few words. Weather data formats can be difficult for non-expert users to read; the data are usually packed in binary formats requiring specialized software, and parameter names come from specialized vocabularies. DSs create detailed information pages for each dataset and maintain lists of helpful software, documentation and links to information around the web. We further grow the level of sophistication of the users with tips, tutorials and data stories on the RDA Blog, http://ncarrda.blogspot.com/. How-to video tutorials are also posted on the NCAR Computational and Information Systems Laboratory (CISL) YouTube channel.

  12. 21 CFR 892.2050 - Picture archiving and communications system.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Picture archiving and communications system. 892.2050 Section 892.2050 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... communications system. (a) Identification. A picture archiving and communications system is a device that...

  13. Adaptation of industry standards to PACS

    NASA Astrophysics Data System (ADS)

    Lee, Joseph K.; Yin, Lloyd; Huang, H. K.; Wong, Albert W. K.

    1994-05-01

    Imagery and textual communications among healthcare information systems, medical imaging equipment, and picture archiving and communication systems (PACS) have always been difficult because each of these components varies with platform, modality, and manufacturer. With the emergence of industry standards, it becomes feasible to integrate these heterogeneous, disparate medical images and textual data. This paper describes two such major industry standards: Health Level 7 (HL7) and ACR/NEMA. By conforming to the HL7 standard, we are able to share medical information between hospital information systems, radiology information systems, and PACS. By adapting the ACR/NEMA 2.0 standard, we can also convert medical images generated by a variety of modalities and manufacturers to its standardized data format. The conversion is based on the data dictionary defined in the ACR/NEMA 2.0 document.
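
    As a concrete example of the textual side of such integration, the sketch below parses an HL7 v2.x message into its segments and fields (segments separated by carriage returns, fields by "|"). The sample message content is invented for illustration.

      # Illustrative HL7 v2.x message parsing.
      def parse_hl7(message):
          """Return a dict mapping segment IDs (MSH, PID, OBR, ...) to field lists."""
          segments = {}
          for line in message.strip().split("\r"):
              fields = line.split("|")
              segments.setdefault(fields[0], []).append(fields)
          return segments

      sample = (
          "MSH|^~\\&|RIS|HOSP|PACS|HOSP|199405011200||ORM^O01|0001|P|2.1\r"
          "PID|1||123456||DOE^JOHN\r"
          "OBR|1||IMG789|71020^CHEST XRAY"
      )
      msg = parse_hl7(sample)
      print(msg["PID"][0][5])   # patient name field -> DOE^JOHN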

  14. 78 FR 64024 - National Industrial Security Program Policy Advisory Committee (NISPPAC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-25

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Information Security Oversight Office [NARA-2014-001] National Industrial Security Program Policy Advisory Committee (NISPPAC) AGENCY: National Archives and... submitted to the Information Security Oversight Office (ISOO) no later than Friday, November 8, 2013. ISOO...

  15. 75 FR 69473 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-12

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... effectiveness of the Public Vaults and its several exhibits in enhancing visitors' understanding that records...

  16. Development of Secondary Archive System at Goddard Space Flight Center Version 0 Distributed Active Archive Center

    NASA Technical Reports Server (NTRS)

    Sherman, Mark; Kodis, John; Bedet, Jean-Jacques; Wacker, Chris; Woytek, Joanne; Lynnes, Chris

    1996-01-01

    The Goddard Space Flight Center (GSFC) Version 0 Distributed Active Archive Center (DAAC) has been developed to support existing and pre-Earth Observing System (EOS) Earth science datasets, facilitate scientific research, and test EOS Data and Information System (EOSDIS) concepts. To ensure that no data is ever lost, each product received at the GSFC DAAC is archived on two different media, VHS and digital linear tape (DLT). The first copy is made on VHS tape and is under the control of UniTree. The second and third copies are made to DLT and VHS media under a custom-built software package named 'Archer'. While Archer provides only a subset of the functions available with commercial software like UniTree, it supports migration between near-line and off-line media and offers much greater performance and flexibility to satisfy the specific needs of a data center. Archer is specifically designed to maximize total system throughput, rather than focusing on the turn-around time for individual files. The commercial off-the-shelf (COTS) hierarchical storage management (HSM) products evaluated were mainly concerned with transparent, interactive file access for the end user, rather than a batch-oriented, optimizable (based on known data file characteristics) data archive and retrieval system. This is critical to the distribution requirements of the GSFC DAAC, where orders for 5000 or more files at a time are received. Archer has the ability to queue many thousands of file requests and to sort these requests into internal processing schedules that optimize overall throughput. Specifically, mount and dismount, tape load and unload cycles, and tape motion are minimized. This feature did not seem to be available in many COTS packages. Archer also uses a generic tar tape format that allows tapes to be read by many different systems, rather than the proprietary format found in most COTS packages. This paper discusses some of the specific requirements at the GSFC DAAC and the motivations for implementing the Archer system, and presents the resulting Archer design.
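
    The Archer source is not reproduced here; the sketch below only illustrates the scheduling idea the abstract describes, grouping queued file requests by tape volume and sorting them by position on the tape so that mounts, load/unload cycles, and tape motion are minimized. The identifiers and positions are invented.

      # Illustrative request scheduling for a tape-based archive.
      from collections import defaultdict

      def schedule(requests):
          """requests: iterable of (file_id, tape_volume, position_on_tape)."""
          by_volume = defaultdict(list)
          for file_id, volume, position in requests:
              by_volume[volume].append((position, file_id))
          plan = []
          for volume in sorted(by_volume):           # one mount per volume
              plan.append((volume, [f for _pos, f in sorted(by_volume[volume])]))
          return plan

      requests = [("granule_17", "VHS042", 930), ("granule_02", "VHS007", 15),
                  ("granule_11", "VHS042", 120), ("granule_30", "VHS007", 800)]
      for volume, files in schedule(requests):
          print(volume, files)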

  17. Recommendations for a service framework to access astronomical archives

    NASA Technical Reports Server (NTRS)

    Travisano, J. J.; Pollizzi, J.

    1992-01-01

    There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.
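
    One way to picture the proposed framework is a single abstract service interface that every archive could implement for catalog browsing, name resolution, and dataset retrieval. The sketch below is illustrative only; the interface methods and the example archive class are hypothetical, not part of the paper.

      # Illustrative common interface for heterogeneous astronomical archives.
      from abc import ABC, abstractmethod

      class ArchiveService(ABC):
          @abstractmethod
          def query_catalog(self, constraints: dict) -> list:
              """Return catalog records matching the given constraints."""

          @abstractmethod
          def resolve_name(self, object_name: str) -> dict:
              """Return coordinates and aliases for a named astronomical object."""

          @abstractmethod
          def retrieve_dataset(self, dataset_id: str, destination: str) -> str:
              """Fetch a dataset to a local path and return that path."""

      class ExampleArchive(ArchiveService):          # stand-in for a real archive
          def query_catalog(self, constraints):
              return [{"dataset_id": "demo-0001", "target": constraints.get("target")}]

          def resolve_name(self, object_name):
              return {"name": object_name, "ra_deg": 10.68, "dec_deg": 41.27}

          def retrieve_dataset(self, dataset_id, destination):
              return f"{destination}/{dataset_id}.fits"

      archive = ExampleArchive()
      print(archive.query_catalog({"target": "M31"}))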

  18. Sky Event Reporting Metadata (VOEvent) Version 2.0

    NASA Technical Reports Server (NTRS)

    Seaman, Rob; Williams, Roy; Allan, Alasdair; Barthelmy, Scott; Bloom, Joshua S.; Brewer, John M.; Denny, Robert B.; Fitzpatrick, Mike; Graham, Matthew; Gray, Norman; hide

    2011-01-01

    VOEvent [20] defines the content and meaning of a standard information packet for representing, transmitting, publishing and archiving information about a transient celestial event, with the implication that timely follow-up is of interest. The objective is to motivate the observation of targets-of-opportunity, to drive robotic telescopes, to trigger archive searches, and to alert the community. VOEvent is focused on the reporting of photon events, but events mediated by disparate phenomena such as neutrinos, gravitational waves, and solar or atmospheric particle bursts may also be reported. Structured data is used, rather than natural language, so that automated systems can effectively interpret VOEvent packets. Each packet may contain zero or more of the "who, what, where, when & how" of a detected event, but in addition, may contain a hypothesis (a "why") regarding the nature of the underlying physical cause of the event.
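
    The sketch below assembles a small event packet with the "who, what, where/when, why" groupings described above. It is a simplified illustration built with the Python standard library, not a schema-valid VOEvent 2.0 document, and all identifiers and values are invented.

      # Illustrative (non-schema-valid) event packet in the spirit of VOEvent.
      import xml.etree.ElementTree as ET

      def build_event(ivorn, author, utc_time, ra_deg, dec_deg, hypothesis):
          event = ET.Element("VOEvent", ivorn=ivorn, role="observation", version="2.0")
          who = ET.SubElement(event, "Who")
          ET.SubElement(who, "AuthorIVORN").text = author
          what = ET.SubElement(event, "What")
          ET.SubElement(what, "Param", name="magnitude", value="18.7", ucd="phot.mag")
          wherewhen = ET.SubElement(event, "WhereWhen")
          ET.SubElement(wherewhen, "ISOTime").text = utc_time
          ET.SubElement(wherewhen, "Position", ra=str(ra_deg), dec=str(dec_deg))
          why = ET.SubElement(event, "Why")
          ET.SubElement(why, "Inference", probability="0.8").text = hypothesis
          return ET.tostring(event, encoding="unicode")

      print(build_event("ivo://example.org/events#42", "ivo://example.org/observers",
                        "2011-01-01T00:00:00", 197.45, -23.38, "possible supernova"))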

  19. Internet based ECG medical information system.

    PubMed

    James, D A; Rowlands, D; Mahnovetski, R; Channells, J; Cutmore, T

    2003-03-01

    Physiological monitoring of humans for medical applications is well established and ready to be adapted to the Internet. This paper describes the implementation of a Medical Information System (MIS-ECG system) incorporating an Internet-based ECG acquisition device. Traditionally, clinical ECG monitoring has been a largely labour-intensive process, with data typically stored on paper. Until recently, ECG monitoring applications have also been constrained somewhat by the size of the equipment required. Today's technology enables large, fixed hospital monitoring systems to be replaced by small portable devices. With an increasing emphasis on health management, a truly integrated information system for acquisition, analysis, patient particulars and archiving is now a realistic possibility. This paper describes recent Internet and technological advances and presents the design and testing of the MIS-ECG system that utilises those advances.

  20. Recommendations concerning satellite-acquired earth resource data: 1982 report of the Data Management Subcommittee of the GEOSAT Committee, Incorporated

    NASA Technical Reports Server (NTRS)

    1982-01-01

    End user concerns about the content and accessibility of libraries of remote sensing data in general are addressed. Recommendations pertaining to the United States' satellite remote sensing programs urge: (1) the continuation of the NASA/EROS Data Center program to convert pre-1979 scenes to computer readable tapes and create a historical archive of this valuable data; (2) improving the EROS archive by adding geologically interesting scenes, data from other agencies (including previously classified data), and by adopting a policy to retire data from the archive; (3) establishing a computer data base inquiry system that includes remote sensing data from all publicly available sources; (4) capability for prepurchase review and evaluation; (5) a flexible price structure; and (6) adoption of a standard digital data products format. Information about LANDSAT 4, the status of worldwide LANDSAT receiving stations, future non-U.S. remote sensing satellites, a list of sources for LANDSAT data, and the results of a survey of GEOSAT members' remote sensing data processing systems are also considered.

  1. Archive of digital Chirp subbottom profile data collected during USGS cruises 09CCT03 and 09CCT04, Mississippi and Alabama Gulf Islands, June and July 2009

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.

    2011-01-01

    In June and July of 2009, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Cat Island, Mississippi, to Dauphin Island, Alabama, as part of a broader USGS study on Coastal Change and Transport (CCT). The surveys were funded through the Northern Gulf of Mexico Ecosystem Change and Hazard Susceptibility Project as part of the Holocene Evolution of the Mississippi-Alabama Region Subtask (http://ngom.er.usgs.gov/task2_2/index.php). This report serves as an archive of unprocessed digital Chirp seismic profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Single-beam and Swath bathymetry data were also collected during these cruises and will be published as a separate archive. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  2. International Federation of Library Associations Annual Conference. Papers of the Management and Technology Division: Information Technology Section (47th, Leipzig, East Germany, August 17-22, 1981).

    ERIC Educational Resources Information Center

    Bradler, Reinhard; And Others

    These seven papers on library management and networks focus on: (1) computerized access to archival and library materials, describing the methodological problems associated with a pilot project in the German Democratic Republic, as well as the efficiency of data bank systems; (2) present and future development of libraries and information centers…

  3. Commercial imagery archive product development

    NASA Astrophysics Data System (ADS)

    Sakkas, Alysa

    1999-12-01

    The Lockheed Martin (LM) team had garnered over a decade of operational experience in digital imagery management and analysis for the US Government at numerous worldwide sites. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate data in 'push' or 'pull' modes, combining commercially available components with algorithms it adapted and developed to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System, satisfies requirements for (a) a potentially unbounded data archive; (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) the ability to ingest, process and disseminate data with bandwidths ranging up to multi-gigabit per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds, whether from archived images or in real time; and (g) scalability that maintains information throughput performance as the size of the digital library grows.

  4. Complementary concept for an image archive and communication system in a cardiological department based on CD-medical, an online archive, and networking facilities

    NASA Astrophysics Data System (ADS)

    Oswald, Helmut; Mueller-Jones, Kay; Builtjes, Jan; Fleck, Eckart

    1998-07-01

    The developments in information technologies -- computer hardware, networking and storage media -- have led to expectations that these advances make it possible to replace 35 mm film completely with digital techniques in the catheter laboratory. Besides its role as an archival medium, cine film is used as the major image review and exchange medium in cardiology. None of today's technologies can completely fulfill the requirements to replace cine film. One of the major drawbacks of cine film is its single point of access in time and location. For the four catheter laboratories in our institutions we have designed a complementary concept combining the CD-R, also called CD-medical, as a single-patient storage and exchange medium, with a digital archive for on-line access and image review of selected frames or short sequences on adequate medical workstations. The image data from various modalities as well as all digital documents relating to a patient are part of an electronic patient record. The access, processing and display of documents are supported by an integrated medical application.

  5. Using dCache in Archiving Systems oriented to Earth Observation

    NASA Astrophysics Data System (ADS)

    Garcia Gil, I.; Perez Moreno, R.; Perez Navarro, O.; Platania, V.; Ozerov, D.; Leone, R.

    2012-04-01

    The object of the LAST activity (Long term data Archive Study on new Technologies) is to perform an independent study on best practices and an assessment of different archiving technologies mature for operation in the short and mid-term time frame, or available in the long term, with emphasis on technologies better suited to satisfy the requirements of ESA, LTDP and other European and Canadian EO partners in terms of digital information preservation and data accessibility and exploitation. During the last phase of the project, several archiving solutions were tested in order to evaluate their suitability. In particular, dCache was evaluated; it provides a file system tree view of the data repository, exchanges data with backend (tertiary) storage systems, and handles space management, pool attraction, dataset replication, hot spot determination and recovery from disk or node failures. Connected to a tertiary storage system, dCache simulates unlimited direct-access storage space, and data exchanges to and from the underlying HSM are performed automatically and invisibly to the user. dCache was created to meet the requirements of large computing centers and universities with large amounts of data, which pooled their efforts and founded EMI (European Middleware Initiative). At present, dCache is mature enough to be implemented and is used by several research centers of relevance (e.g., the LHC, storing up to 50 TB/day). This solution has not been used so far in Earth Observation, and the results of the study are summarized in this article, focusing on its capabilities, evaluated in a simulated environment, against the ESA requirements for a geographically distributed storage. The challenge of a geographically distributed storage system can be summarized as the way to provide maximum quality of storage and dissemination services at minimum cost.

  6. Cloud object store for archive storage of high performance computing data using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
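
    The patented middleware itself is not reproduced here; the sketch below only illustrates the general pattern described, repackaging archived checkpoint files as named objects and handing them to a cloud object store through a pluggable "put" callable. A real deployment would substitute an actual object-store client for the stand-in function.

      # Illustrative file-to-object conversion for archiving checkpoint files.
      import os

      def archive_to_objects(checkpoint_paths, bucket, put, chunk_bytes=8 * 1024 * 1024):
          """Split each file into fixed-size chunks and store each chunk as an object."""
          for path in checkpoint_paths:
              base = os.path.basename(path)
              with open(path, "rb") as f:
                  index = 0
                  while True:
                      chunk = f.read(chunk_bytes)
                      if not chunk:
                          break
                      put(bucket, f"{base}/part-{index:06d}", chunk)
                      index += 1

      def fake_put(bucket, key, data):                 # stand-in for a cloud client
          print(f"stored {len(data)} bytes as {bucket}/{key}")

      if __name__ == "__main__":
          # tiny demonstration with a throwaway file standing in for a checkpoint
          import tempfile
          with tempfile.NamedTemporaryFile(delete=False) as tmp:
              tmp.write(b"x" * 20_000_000)             # ~20 MB of dummy checkpoint data
          archive_to_objects([tmp.name], "hpc-archive", fake_put)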

  7. 47 CFR 80.179 - Unattended operation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... DSC in accordance with ITU-R Recommendation M.493-11, “Digital Selective-calling System for Use in the...., Washington, DC (Reference Information Center) or at the National Archives and Records Administration (NARA... condition related to ship safety. (3) The “ROUTINE” DSC category must be used. (4) Communications must be...

  8. Research notes : developing corridor-level truck travel time estimates and other freight performance measures from archived ITS data.

    DOT National Transportation Integrated Search

    2010-02-01

    This research project demonstrated that it is feasible to use the WIM data to develop long-term corridor performance monitoring of truck travel. From the perspective of a realtime traveler information system, there are too many shortcomings mainl...

  9. Internet-Based Cervical Cytology Screening System

    DTIC Science & Technology

    2007-04-01

    Recent technological advances in specimen preparation and computerized primary screening make automated approaches to cervical cancer screening possible. In addition, advances in information technology have facilitated the Internet transmission and archival processes in the clinical laboratory. (Award Number: W81XWH-04-C-0083)

  10. 33 CFR 164.03 - Incorporation by reference.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... inspection at the Navigation Systems Division (CG-553), Coast Guard Headquarters, 2100 2nd St. SW., Stop 7580, Washington, DC 20593-7580 and at the National Archives and Records Administration (NARA). For information on... Testing Fiber Ropes 164.74 Cordage Institute, 350 Lincoln Street, Hingham, MA 02043 CIA-3, Standard Test...

  11. Computing and data processing

    NASA Technical Reports Server (NTRS)

    Smarr, Larry; Press, William; Arnett, David W.; Cameron, Alastair G. W.; Crutcher, Richard M.; Helfand, David J.; Horowitz, Paul; Kleinmann, Susan G.; Linsky, Jeffrey L.; Madore, Barry F.

    1991-01-01

    The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth dimension supercomputers.

  12. Researching the Vietnam Conflict through U.S. Archival Sources.

    ERIC Educational Resources Information Center

    Marlatt, Greta E.

    1995-01-01

    Presents a pathfinder for the researcher interested in locating materials pertaining to the U.S. involvement in the Vietnam conflict. Highlights include historical background; information sources, including the National Archives, oral histories, manuscripts, federal reports, National Technical Information Service, and special collections. Seven…

  13. 78 FR 38077 - National Industrial Security Program Policy Advisory Committee (NISPPAC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-25

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Information Security Oversight Office [NARA-13-0030] National Industrial Security Program Policy Advisory Committee (NISPPAC) AGENCY: National Archives and... submitted to the Information Security Oversight Office (ISOO) no later than Friday, July 12, 2013. ISOO will...

  14. The Path from Large Earth Science Datasets to Information

    NASA Astrophysics Data System (ADS)

    Vicente, G. A.

    2013-12-01

    The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) is one of the Science Mission Directorate's (SMD) major centers for the archiving and distribution of Earth science remote sensing data, products and services. This virtual portal provides convenient access to Atmospheric Composition and Dynamics, Hydrology, Precipitation, Ozone, and model-derived datasets (generated by GSFC's Global Modeling and Assimilation Office), as well as the North American Land Data Assimilation System (NLDAS) and Global Land Data Assimilation System (GLDAS) data products (both generated by GSFC's Hydrological Sciences Branch). This presentation demonstrates various tools and computational technologies developed at the GES DISC to manage the huge volume of data and products acquired from various missions and programs over the years. It explores approaches to archive, document, distribute, access and analyze Earth science data and information, and addresses the technical and scientific issues, governance, and user support problems faced by scientists in need of multi-disciplinary datasets. It also discusses data and product metrics, user distribution profiles and lessons learned through interactions with science communities around the world. Finally, it demonstrates some of the most heavily used data and product visualization and analysis tools developed and maintained by the GES DISC.

  15. Information Model Translation to Support a Wider Science Community

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Daniel; Ritschel, Bernd; Hardman, Sean; Joyner, Ronald

    2014-05-01

    The Planetary Data System (PDS), NASA's long-term archive for solar system exploration data, has just released PDS4, a modernization of the PDS architecture, data standards, and technical infrastructure. This next generation system positions the PDS to meet the demands of the coming decade, including big data, international cooperation, distributed nodes, and multiple ways of analysing and interpreting data. It also addresses three fundamental project goals: providing more efficient data delivery by data providers to the PDS, enabling a stable, long-term usable planetary science data archive, and enabling services for the data consumer to find, access, and use the data they require in contemporary data formats. The PDS4 information architecture is used to describe all PDS data using a common model. Captured in an ontology modeling tool it supports a hierarchy of data dictionaries built to the ISO/IEC 11179 standard and is designed to increase flexibility, enable complex searches at the product level, and to promote interoperability that facilitates data sharing both nationally and internationally. A PDS4 information architecture design requirement stipulates that the content of the information model must be translatable to external data definition languages such as XML Schema, XMI/XML, and RDF/XML. To support the semantic Web standards we are now in the process of mapping the contents into RDF/XML to support SPARQL capable databases. We are also building a terminological ontology to support virtually unified data retrieval and access. This paper will provide an overview of the PDS4 information architecture focusing on its domain information model and how the translation and mapping are being accomplished.
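
    As a small illustration of the RDF/XML mapping direction mentioned above, the sketch below describes one product using the rdflib package. The namespace URI, property names, and identifier are placeholders, not the actual PDS4 ontology or a real product label.

      # Illustrative RDF/XML description of a product, queryable via SPARQL stores.
      from rdflib import Graph, Literal, Namespace, URIRef

      PDS = Namespace("http://example.org/pds4#")      # placeholder namespace

      g = Graph()
      g.bind("pds", PDS)
      product = URIRef("http://example.org/pds4/product/urn-nasa-pds-example-001")
      g.add((product, PDS.logicalIdentifier, Literal("urn:nasa:pds:example:001")))
      g.add((product, PDS.title, Literal("Example observational product")))
      g.add((product, PDS.targetName, Literal("Mars")))

      print(g.serialize(format="xml"))                 # emit RDF/XML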

  16. The new space and earth science information systems at NASA's archive

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The on-line interactive systems of the National Space Science Data Center (NSSDC) are examined. The worldwide computer network connections that allow access to NSSDC users are outlined. The services offered by the NSSDC new technology on-line systems are presented, including the IUE request system, ozone TOMS data, and data sets on astrophysics, atmospheric science, land sciences, and space plasma physics. Plans for future increases in the NSSDC data holdings are considered.

  17. The new space and Earth science information systems at NASA's archive

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The on-line interactive systems of the National Space Science Data Center (NSSDC) are examined. The worldwide computer network connections that allow access to NSSDC users are outlined. The services offered by the NSSDC new technology on-line systems are presented, including the IUE request system, Total Ozone Mapping Spectrometer (TOMS) data, and data sets on astrophysics, atmospheric science, land sciences, and space plasma physics. Plans for future increases in the NSSDC data holdings are considered.

  18. Data archiving and serving system implementation in CLEP's GRAS Core System

    NASA Astrophysics Data System (ADS)

    Zuo, Wei; Zeng, Xingguo; Zhang, Zhoubin; Geng, Liang; Li, Chunlai

    2017-04-01

    The Ground Research & Applications System (GRAS) is one of the five systems of China's Lunar Exploration Project (CLEP). It is responsible for data acquisition, processing, management and application, and it also serves as the operation control center for in-orbit satellite and payload operation management. Chang'E-1, Chang'E-2 and Chang'E-3 have collected abundant lunar exploration data. The aim of this work is to present the implementation of data archiving and serving in CLEP's GRAS Core System software. This first approach provides a client-side API and server-side software allowing the creation of a simplified version of the CLEPDB data archiving software, and implements all elements required to complete the data archiving flow from data acquisition to persistent storage. The client side includes all the components that run on devices that acquire or produce data, distributing and streaming it to configured remote archiving servers. The server side comprises an archiving service that stores all received data into PDS files. The archiving solution aims at storing data coming from the Data Acquisition Subsystem, the Operation Management Subsystem, the Data Preprocessing Subsystem and the Scientific Application & Research Subsystem. The serving solution aims at serving data to the various business systems, scientific researchers and public users. Data-driven and component clustering methods were adopted in this system: the former is used to handle real-time data archiving and data persistence services; the latter maintains continuous archiving and serving support for new data from the Chang'E missions. This approach also saves software development cost.
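
    A minimal sketch of the client-side archiving call described above is given below, assuming a hypothetical HTTP ingest endpoint and response field; the real GRAS client API, transport, and PDS file handling are not documented in the abstract.

```python
# Illustrative client that hands one acquired data record plus metadata to a
# remote archiving service. The endpoint URL, form fields, and response
# field are assumptions, not the actual GRAS interfaces.
import json
import requests

ARCHIVE_URL = "http://archive.example.org/ingest"  # hypothetical endpoint

def archive_record(mission, instrument, payload_bytes, metadata):
    """Send one data record and its metadata to the archiving server."""
    resp = requests.post(
        ARCHIVE_URL,
        data={"mission": mission,
              "instrument": instrument,
              "metadata": json.dumps(metadata)},
        files={"payload": ("record.dat", payload_bytes)},
        timeout=30,
    )
    resp.raise_for_status()           # fail loudly if the archive rejects it
    return resp.json()["archive_id"]  # assumed response field

# Example call with dummy data (requires a running ingest service):
# archive_record("Chang'E-3", "PCAM", b"\x00\x01\x02",
#                {"utc": "2014-01-01T00:00:00Z"})
```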

  19. Science Archives in the 21st Century: A NASA LAMBDA Report

    NASA Technical Reports Server (NTRS)

    Butterworth, P.; Greason, M.

    2007-01-01

    LAMBDA is a thematic data center that focuses on serving the cosmic microwave background (CMB) research community. LAMBDA is an active archive for NASA's Cosmic Background Explorer (COBE) and Wilkinson Microwave Anisotropy Probe (WMAP) mission data sets. In addition, LAMBDA provides analysis software, on-line tools, relevant ancillary data and important web links. LAMBDA also tries to preserve the most important ground-based and suborbital CMB data sets. CMB data are unlike other astrophysical data, consisting of intrinsically diffuse surface brightness photometry with a signal contrast of the order of 1 part in 100,000 relative to the uniform background. Because of the extremely faint signal levels, the signal-to-noise ratio is relatively low and detailed instrument-specific knowledge of the data is essential. While the number of data sets being produced is not especially large, those data sets are becoming large and complex. That tendency will increase when the many polarization experiments currently being deployed begin producing data. The LAMBDA experience supports many aspects of the NASA data archive model developed informally over the last ten years: that small, focused data centers are often more effective than larger, more ambitious collections; that data centers are usually best run by active scientists; that it can be particularly advantageous if those scientists are leaders in the use of the archived data sets; etc. LAMBDA has done some things so well that they might provide lessons for other archives: a lot of effort has been devoted to developing a simple and consistent interface to data sets, for example, and to serving all the required documentation via simple 'more' pages and longer explanatory supplements. Many of the problems faced by LAMBDA will also not surprise anyone trying to manage other space science data. These range from persuading mission scientists to provide their data as quickly as possible, to dealing with a high volume of nuisance (spam) messages. Because so many data center problems and solutions are common across individual data centers and disciplines, it would be very valuable to establish some new systems of communication, such as informal email lists for administrators and developers. But resources are very limited, so new time-consuming and inefficient mechanisms, like too-frequent and too-structured meetings, should be avoided. Although there are great advantages to being small, agile and independent, there are also some areas where science data centers within and outside NASA could be better coordinated: for the assignment of persistent identifiers; to encourage the early adoption of useful standards and technologies; etc. Some super-structure to facilitate such coordination might be beneficial as long as it does not begin to control the other work of the archives and become a "methodology police". In this respect the CCSDS "Reference Model for an Open Archival Information System" is a little worrying. It may be that the closer a data center gets to following such a detailed prescription, the less effective it will become. It is much better to have an informal coordination process than a bureaucratic straitjacket.

  20. Picture Archiving and Communication System (PACS) implementation, integration & benefits in an integrated health system.

    PubMed

    Mansoori, Bahar; Erhard, Karen K; Sunshine, Jeffrey L

    2012-02-01

    The availability of the Picture Archiving and Communication System (PACS) has revolutionized the practice of radiology in the past two decades and has been shown to increase productivity in radiology and medicine. PACS implementation and integration may bring along numerous unexpected issues, particularly in a large-scale enterprise. To achieve a successful PACS implementation, identifying the critical success and failure factors is essential. This article provides an overview of the process of implementing and integrating PACS in a comprehensive health system comprising an academic core hospital and numerous community hospitals. Important issues are addressed, touching all stages from planning to operation and training. The impact of an enterprise-wide radiology information system and PACS at the academic medical center (four specialty hospitals), in six additional community hospitals, and in all associated outpatient clinics, as well as the implications for the productivity and efficiency of the entire enterprise, are presented. Copyright © 2012 AUR. Published by Elsevier Inc. All rights reserved.

  1. Cognition-based development and evaluation of ergonomic user interfaces for medical image processing and archiving systems.

    PubMed

    Demiris, A M; Meinzer, H P

    1997-01-01

    Whether or not a computerized system enhances working conditions in the application domain depends very much on the user interface. Graphical user interfaces seem to attract the interest of users but mostly ignore some basic rules of visual information processing, thus leading to systems that are difficult to use, lower productivity, and increase working stress (cognitive and work load). In this work we present some fundamental ergonomic considerations and their application to the medical image processing and archiving domain. We introduce the extensions to an existing concept needed to control and guide the development of GUIs with respect to domain-specific ergonomics. The suggested concept, called Model-View-Controller Constraints (MVCC), can be used to programmatically implement ergonomic constraints, and thus has some advantages over written style guides. We conclude with the presentation of existing norms and methods to evaluate user interfaces.
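
    To make the Model-View-Controller-Constraints idea more concrete, here is a toy sketch in which the controller checks declared ergonomic constraints before the view is updated; the specific constraint and class names are invented for illustration and are not taken from the paper.

```python
# Toy Model-View-Controller-Constraints sketch: the controller refuses view
# updates that violate declared ergonomic constraints. The constraint below
# is invented purely for illustration.
class ErgonomicConstraintError(Exception):
    pass

class Controller:
    def __init__(self, view, constraints):
        self.view = view
        self.constraints = constraints  # callables: model -> error message or None

    def update(self, model):
        for check in self.constraints:
            problem = check(model)
            if problem:
                raise ErgonomicConstraintError(problem)
        self.view.render(model)

def max_four_images(model):
    # Hypothetical rule: more than four images per screen overloads the reader.
    if len(model.get("images", [])) > 4:
        return "more than four images on one screen overloads the viewer"

class ConsoleView:
    def render(self, model):
        print("displaying", len(model["images"]), "images")

controller = Controller(ConsoleView(), [max_four_images])
controller.update({"images": ["a.dcm", "b.dcm"]})  # passes the constraint
```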

  2. The last Deglaciation in the Mediterranean region: a multi-archives synthesis

    NASA Astrophysics Data System (ADS)

    Bazin, Lucie; Siani, Giuseppe; Landais, Amaelle; Bassinot, Frank; Genty, Dominique; Govin, Aline; Michel, Elisabeth; Nomade, Sebastien; Waelbroeck, Claire

    2016-04-01

    Multiple proxies record past climatic changes in different climate archives. These proxies are influenced by different components of the climate system and bring complementary information on past climate variability. The major limitation when combining proxies from different archives comes from the coherency of their chronologies. Indeed, each climate archive has its own dating methods, which are not necessarily coherent with one another. Consequently, when we want to assess the latitudinal changes and mechanisms behind a climate event, we often have to rely on assumptions of synchronisation between the different archives, such as synchronous temperature changes during warming events (Austin and Hibbert, 2010). Recently, a dating method originally developed to produce coherent chronologies for ice cores (Datice, Lemieux-Dudon et al., 2010) has been adapted in order to integrate different climate archives (ice cores, sediment cores and speleothems) (Lemieux-Dudon et al., 2015; Bazin et al., in prep.). In this presentation we present the validation of this multi-archive dating tool with a first application covering the last Deglaciation in the Mediterranean region. For this experiment, we consider the records from Monticchio, the MD90-917, Tenaghi Philippon and Lake Ohrid sediment cores, as well as continuous speleothems from Sofular, Soreq and La Mine caves. Using the Datice dating tool, and with the identification of common tephra layers between the cores considered, we are able to produce a coherent multi-archive chronology for this region, independently of any climatic assumption. Using this common chronological framework, we show that the usual climatic synchronisation assumptions are not valid over this region for the last glacial-interglacial transition. Finally, we compare our coherent Mediterranean chronology with Greenland ice core records in order to discuss the sequence of events of the last Deglaciation between these two regions.

  3. Space and Earth Science Data Compression Workshop

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Editor)

    1991-01-01

    The workshop explored opportunities for data compression to enhance the collection and analysis of space and Earth science data. The focus was on scientists' data requirements, as well as constraints imposed by the data collection, transmission, distribution, and archival systems. The workshop consisted of several invited papers; two described information systems for space and Earth science data, four depicted analysis scenarios for extracting information of scientific interest from data collected by Earth orbiting and deep space platforms, and a final one was a general tutorial on image data compression.

  4. 77 FR 60475 - Privacy Act of 1974, as Amended; System of Records Notices

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-03

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Privacy Act of 1974, as Amended; System of Records Notices AGENCY: National Archives and Records Administration (NARA). ACTION: Notice of the establishment of new privacy system of record, NARA 44. SUMMARY: The National Archives and Records Administration...

  5. 76 FR 13671 - Privacy Act of 1974, as Amended; System of Records Notices

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-14

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Privacy Act of 1974, as Amended; System of Records Notices AGENCY: National Archives and Records Administration (NARA). ACTION: Notice of the establishment of new privacy system of record, NARA 41. SUMMARY: The National Archives and Records Administration...

  6. Informatics in radiology: use of CouchDB for document-based storage of DICOM objects.

    PubMed

    Rascovsky, Simón J; Delgado, Jorge A; Sanz, Alexander; Calvo, Víctor D; Castrillón, Gabriel

    2012-01-01

    Picture archiving and communication systems traditionally have depended on schema-based Structured Query Language (SQL) databases for imaging data management. To optimize database size and performance, many such systems store a reduced set of Digital Imaging and Communications in Medicine (DICOM) metadata, discarding informational content that might be needed in the future. As an alternative to traditional database systems, document-based key-value stores recently have gained popularity. These systems store documents containing key-value pairs that facilitate data searches without predefined schemas. Document-based key-value stores are especially suited to archive DICOM objects because DICOM metadata are highly heterogeneous collections of tag-value pairs conveying specific information about imaging modalities, acquisition protocols, and vendor-supported postprocessing options. The authors used an open-source document-based database management system (Apache CouchDB) to create and test two such databases; CouchDB was selected for its overall ease of use, capability for managing attachments, and reliance on HTTP and Representational State Transfer standards for accessing and retrieving data. A large database was created first in which the DICOM metadata from 5880 anonymized magnetic resonance imaging studies (1,949,753 images) were loaded by using a Ruby script. To provide the usual DICOM query functionality, several predefined "views" (standard queries) were created by using JavaScript. For performance comparison, the same queries were executed in both the CouchDB database and a SQL-based DICOM archive. The capabilities of CouchDB for attachment management and database replication were separately assessed in tests of a similar, smaller database. Results showed that CouchDB allowed efficient storage and interrogation of all DICOM objects; with the use of information retrieval algorithms such as map-reduce, all the DICOM metadata stored in the large database were searchable with only a minimal increase in retrieval time over that with the traditional database management system. Results also indicated possible uses for document-based databases in data mining applications such as dose monitoring, quality assurance, and protocol optimization. RSNA, 2012
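
    As a generic illustration of the document-based approach described above (not the authors' code), the sketch below stores one study's DICOM metadata as a JSON document in CouchDB over its HTTP API and queries a previously defined map-reduce view; the database name, design document, and tag selection are assumptions.

```python
# Sketch of storing and querying DICOM metadata as JSON documents in CouchDB
# via its HTTP API. Database name, design document, and the handful of tags
# shown are illustrative; real DICOM headers carry many more tag-value pairs.
import requests

COUCH = "http://localhost:5984"
DB = "dicom_metadata"

# Create the database (CouchDB answers 412 if it already exists).
requests.put(f"{COUCH}/{DB}")

# Store one object's metadata as a document keyed by its SOP Instance UID.
doc = {
    "PatientID": "ANON0001",
    "StudyInstanceUID": "1.2.840.113619.2.55.3.1",
    "Modality": "MR",
    "SeriesDescription": "T1 AXIAL",
}
requests.put(f"{COUCH}/{DB}/1.2.840.113619.2.55.3.1.1", json=doc)

# Query a (previously created) map-reduce view that indexes documents by
# modality, mimicking a typical query against a PACS database.
resp = requests.get(
    f"{COUCH}/{DB}/_design/queries/_view/by_modality",
    params={"key": '"MR"'},  # view keys are JSON-encoded
)
print(resp.json())
```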

  7. Parlaying digital imaging and communications in medicine and open architecture to our advantage: the new Department of Defense picture archiving and communications system.

    PubMed

    Cawthon, M A

    1999-05-01

    The Department of Defense (DoD) undertook a major systems specification, acquisition, and implementation project of multivendor picture archiving and communications system (PACS) and teleradiology systems during 1997, with deployment of the first systems in 1998. These systems differ from their DoD predecessor system in being multivendor in origin, specifying adherence to the developing Digital Imaging and Communications in Medicine (DICOM) 3.0 standard and all of its service classes, emphasizing open architecture, using personal computer (PC) and web-based image viewing access, having radiologic telepresence over large geographic areas as a primary focus of implementation, and requiring bidirectional interfacing with the DoD hospital information system (HIS). The benefits and advantages to the military health-care system accrue through the enabling of a seamless, virtual radiology operational environment throughout this vast healthcare organization, providing efficient general and subspecialty radiologic interpretive and consultative services for medical beneficiaries to any healthcare provider, anywhere and at any time of day or night.

  8. Standardizing the nomenclature of Martian impact crater ejecta morphologies

    USGS Publications Warehouse

    Barlow, Nadine G.; Boyce, Joseph M.; Costard, Francois M.; Craddock, Robert A.; Garvin, James B.; Sakimoto, Susan E.H.; Kuzmin, Ruslan O.; Roddy, David J.; Soderblom, Laurence A.

    2000-01-01

    The Mars Crater Morphology Consortium recommends the use of a standardized nomenclature system when discussing Martian impact crater ejecta morphologies. The system utilizes nongenetic descriptors to identify the various ejecta morphologies seen on Mars. This system is designed to facilitate communication and collaboration between researchers. Crater morphology databases will be archived through the U.S. Geological Survey in Flagstaff, where a comprehensive catalog of Martian crater morphologic information will be maintained.

  9. Approach to Managing MeaSURES Data at the GSFC Earth Science Data and Information Services Center (GES DISC)

    NASA Technical Reports Server (NTRS)

    Vollmer, Bruce; Kempler, Steven J.; Ramapriyan, Hampapuram K.

    2009-01-01

    A major need stated by the NASA Earth science research strategy is to develop long-term, consistent, and calibrated data and products that are valid across multiple missions and satellite sensors (NASA Solicitation for Making Earth System Data Records for Use in Research Environments (MEaSUREs) 2006-2010). Selected projects create long-term records of a given parameter, called Earth Science Data Records (ESDRs), based on mature algorithms that bring together continuous multi-sensor data. ESDRs and their associated algorithms, vetted by the appropriate community, are delivered to a NASA-affiliated data center for archiving, stewardship, and distribution. See http://measures-projects.gsfc.nasa.gov/ for more details. This presentation describes the NASA GSFC Earth Science Data and Information Services Center (GES DISC) approach to managing the MEaSUREs ESDR datasets assigned to the GES DISC (energy/water cycle related and atmospheric composition ESDRs). The GES DISC will utilize its experience to integrate existing and proven reusable data management components to accommodate the new ESDRs. Components include a data archive system (S4PA), a data discovery and access system (Mirador), and various web services for data access. In addition, if determined to be useful to the user community, the Giovanni data exploration tool will be made available for the ESDRs. The GES DISC data integration methodology to be used for the MEaSUREs datasets is presented. The goals of this presentation are to share an approach to ESDR integration and to initiate discussions among the data centers, data managers, and data providers for the purpose of gaining efficiencies in data management for MEaSUREs projects.

  10. The ISO Data Archive and Interoperability with Other Archives

    NASA Astrophysics Data System (ADS)

    Salama, Alberto; Arviset, Christophe; Hernández, José; Dowson, John; Osuna, Pedro

    The ESA's Infrared Space Observatory (ISO), an unprecedented observatory for infrared astronomy launched in November 1995, successfully made nearly 30,000 scientific observations in its 2.5-year mission. The ISO data can be retrieved from the ISO Data Archive (IDA), which is available online and comprises about 150,000 observations, including parallel and serendipity mode observations. A user-friendly Java interface permits queries to the database and data retrieval. The interface currently offers a wide variety of links to other archives, such as name resolution with NED and SIMBAD, access to electronic articles from ADS and CDS/VizieR, and access to IRAS data. In the past year development has been focused on improving the interoperability of the IDA with other astronomical archives, either by accessing other relevant archives or by providing direct access to the ISO data for external services. A mechanism of information transfer has been developed, allowing direct queries to the IDA via a Java Server Page that returns quick-look ISO images and relevant, observation-specific information embedded in an HTML page. This method has been used to link from the CDS/VizieR Data Centre and ADS, and work with IPAC to allow access to the ISO Archive from IRSA, including display capabilities of the observed sky regions onto other mission images, is in progress. Prospects for further links to and from other archives and databases are also addressed.

  11. Mission operations update for the restructured Earth Observing System (EOS) mission

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita Castro; Chang, Edward S.

    1993-01-01

    The National Aeronautics and Space Administration's (NASA) Earth Observing System (EOS) will provide a comprehensive, long-term set of observations of the Earth to the Earth science research community. The data will aid in determining global changes caused both by natural processes and by human activity. Understanding humanity's impact on the global environment will allow sound policy decisions to be made to protect our future. EOS is a major component of the Mission to Planet Earth program, which is NASA's contribution to the U.S. Global Change Research Program. EOS consists of numerous instruments on multiple spacecraft and a distributed ground system. The EOS Data and Information System (EOSDIS) is the major ground system developed to support EOS. EOSDIS will provide EOS spacecraft command and control, data processing, product generation, and data archival and distribution services for the EOS spacecraft. Data from EOS instruments on other Earth science missions (e.g., the Tropical Rainfall Measuring Mission (TRMM)) will also be processed, distributed, and archived in EOSDIS. The U.S. and various International Partners (IPs) (e.g., the European Space Agency (ESA), the Ministry of International Trade and Industry (MITI) of Japan, and the Canadian Space Agency (CSA)) participate in and contribute to the international EOS program. EOSDIS will also archive processed data from other designated NASA Earth science missions (e.g., UARS) that are under the broad umbrella of Mission to Planet Earth.

  12. Queuing Models of Tertiary Storage

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1996-01-01

    Large-scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observing System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher-level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSLs) for near-line access. A characteristic of RSLs is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSLs can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well-defined interface. The source code, accompanying documentation, and sample Java applets are available at: http://www.cis.ufl.edu/ted/
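
    As a toy illustration of the kind of queuing analysis involved (much simpler than the paper's RSL and network models), the sketch below treats the robot arm as a single M/M/1 server and reports its utilization and mean response time under made-up arrival and service rates.

```python
# Toy M/M/1 view of the robot arm in a robotic storage library (RSL).
# The arrival and service rates are invented for illustration; the paper's
# models cover the full RSL and a surrounding queuing network.

def mm1_mean_response_time(arrival_rate, service_rate):
    """Mean time in system (waiting + service) for an M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)

# Example: mount requests arrive at 20/hour, the arm completes 30 mounts/hour.
lam, mu = 20.0, 30.0
print(f"utilization        : {lam / mu:.2f}")
print(f"mean response time : {mm1_mean_response_time(lam, mu) * 60:.1f} minutes")
```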

  13. An optical disk archive for a data base management system

    NASA Technical Reports Server (NTRS)

    Thomas, Douglas T.

    1985-01-01

    An overview is given of a data base management system that can catalog and archive data at rates up to 50M bits/sec. Emphasis is on the laser disk system that is used for the archive. All key components in the system (three VAX-11/780s, a SEL 32/2750, a high-speed communication interface, and the optical disk) are interfaced to a 100M bits/sec, 16-port fiber optic bus to achieve the high data rates. The basic data unit is an autonomous data packet. Each packet contains a primary and a secondary header and can be up to a million bits in length. The data packets are recorded on the optical disk at the same time that the packet headers are being used by the relational data base management software ORACLE to create a directory, independent of the packet recording process. The user then interfaces to the VAX that contains the directory for a quick-look scan or retrieval of the packet(s). The total system functions are distributed between the VAX and the SEL. The optical disk unit records the data with an argon laser at 100M bits/sec from its buffer, which is interfaced to the fiber optic bus. The same laser is used in the read cycle by reducing the laser power. Additional information is given in the form of outlines, charts, and diagrams.
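
    The sketch below shows one plausible way to pack and unpack a packet's primary and secondary headers with Python's struct module, so the header fields can feed a directory while the payload is written elsewhere; the field layout is invented for illustration and is not the system's actual format.

```python
# Illustrative packing of a packet's primary and secondary headers. The
# field layout (IDs, payload length, timestamp, quality flag) is
# hypothetical, not the archive's actual packet format.
import struct
import time

PRIMARY_FMT = ">IHQ"    # packet id, source id, payload length in bits
SECONDARY_FMT = ">dB"   # acquisition timestamp, data-quality flag

def build_packet(packet_id, source_id, payload, quality=0):
    primary = struct.pack(PRIMARY_FMT, packet_id, source_id, len(payload) * 8)
    secondary = struct.pack(SECONDARY_FMT, time.time(), quality)
    return primary + secondary + payload

def parse_headers(packet):
    p_size = struct.calcsize(PRIMARY_FMT)
    s_size = struct.calcsize(SECONDARY_FMT)
    packet_id, source_id, bits = struct.unpack(PRIMARY_FMT, packet[:p_size])
    ts, quality = struct.unpack(SECONDARY_FMT, packet[p_size:p_size + s_size])
    return {"packet_id": packet_id, "source_id": source_id,
            "payload_bits": bits, "timestamp": ts, "quality": quality}

pkt = build_packet(42, 7, b"\xff" * 1024)
print(parse_headers(pkt))  # header fields that a directory could index
```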

  14. 78 FR 64254 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-28

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION [NARA-2014-002] Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice. SUMMARY: NARA is giving public notice that the agency has submitted to OMB for...

  15. 76 FR 7591 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-10

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... in a timely fashion the volume of requests received for these records and the need to obtain specific...

  16. 75 FR 66166 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-27

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... appropriate NARA research room or who request copies of records as a result of visiting a research room. NARA...

  17. A New Archive and Internet Search Engine May Change the Nature of On-Line Research.

    ERIC Educational Resources Information Center

    Selingo, Jeffrey

    1998-01-01

    In the process of trying to preserve Internet history by archiving it, a company has developed a powerful Internet search engine that provides information on Web site usage patterns, which can act as a relatively objective source of information about information sources and can link sources that a researcher might otherwise miss. However, issues…

  18. Advanced radiology information system.

    PubMed

    Kolovou, L; Vatousi, M; Lymperopoulos, D; Koukias, M

    2005-01-01

    The innovative features of an advanced Radiology Information System (RIS) are presented in this paper. The interoperability of the RIS with the other intra-hospital information systems it interacts with, addressing compatibility and open-architecture issues, is accomplished by two novel mechanisms [1]. The first is a message handling system that is applied for the exchange of information according to the Health Level Seven (HL7) protocol's specifications and serves the transfer of medical and administrative data among the RIS applications and the data store unit. The same mechanism also allows secure, HL7-compatible interactions with the Hospital Information System (HIS). The second implements the translation of information between the formats that the HL7 and Digital Imaging and Communications in Medicine (DICOM) protocols specify, providing the communication between the RIS and the Picture Archiving and Communication System (PACS). The whole structure ensures the automation of the everyday procedures that the 'medical protocol' specifies and provides its services through a friendly and easy-to-manage graphical user interface.
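
    To make the HL7-to-DICOM translation mechanism concrete, the sketch below parses the patient fields of a pipe-delimited HL7 PID segment and copies them into DICOM patient-module attributes with pydicom; the sample segment and the subset of mapped fields are illustrative, not the system's actual mapping tables.

```python
# Sketch of mapping an HL7 v2 PID segment onto DICOM patient attributes with
# pydicom. The sample segment and the fields chosen are illustrative; a real
# RIS/PACS gateway maps many more fields in both directions.
from pydicom.dataset import Dataset

hl7_pid = "PID|1||123456^^^HOSP||DOE^JOHN||19700101|M"

def pid_to_dicom(pid_segment):
    f = pid_segment.split("|")            # HL7 fields are pipe-delimited
    ds = Dataset()
    ds.PatientID = f[3].split("^")[0]     # PID-3: patient identifier
    family, given = (f[5].split("^") + [""])[:2]
    ds.PatientName = f"{family}^{given}"  # DICOM PN also uses ^ separators
    ds.PatientBirthDate = f[7]            # PID-7: YYYYMMDD matches DICOM DA
    ds.PatientSex = f[8]                  # PID-8: M / F / O
    return ds

print(pid_to_dicom(hl7_pid))
```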

  19. [Development and clinical evaluation of an anesthesia information management system].

    PubMed

    Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei

    2010-09-21

    To study the design, implementation and clinical evaluation of an anesthesia information management system. To record, process and store peri-operative patient data automatically, all kinds of bedside monitoring equipment are connected into the system based on information integration technology; after statistical analysis of the patient data by data mining technology, patient status can be evaluated automatically based on a risk prediction standard and a decision support system, and the anesthetist can then carry out reasonable and safe clinical processes; with clinical processes recorded electronically, standard record tables can be generated and the clinical workflow is optimized as well. With the system, various kinds of patient data can be collected, stored, analyzed and archived, various anesthesia documents can be generated, and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk to patients and clinicians, and helping to provide clinical evidence.

  20. No Longer Have to Choose

    NASA Astrophysics Data System (ADS)

    Brown, H.; Ritchey, N. A.

    2017-12-01

    NOAA's National Centers for Environmental Information (NCEI) was once three separate data centers (NGDC, NODC, and NCDC); in 2015 the three centers merged into NCEI. NCEI has refined the art of long-term preservation and stewardship practices throughout the life cycle of various types of data, and can help data providers navigate the complicated world of data preservation and make it user-friendly. Using tools at NCEI, data providers can request that data be archived, submit data for archival, and create complete International Organization for Standardization (ISO) metadata records with ease. To ensure traceability, Digital Object Identifiers (DOIs) are minted for published data sets. The services offered at NCEI follow standards and NOAA directives such as the Open Archival Information System (OAIS) Reference Model (ISO 14721) to ensure consistent long-term preservation of the Nation's resource of global environmental data for a broad spectrum of users. The implementation of these standards helps keep the data accessible, independently understandable and reproducible in an easy-to-understand format for all types of users. Insights from the combined knowledge of 100+ years of domain expertise, data management and preservation, and the tools supporting these functions will be shared.

  1. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Archiving and Quality Control

    NASA Astrophysics Data System (ADS)

    He, B.; Cui, C.; Fan, D.; Li, C.; Xiao, J.; Yu, C.; Wang, C.; Cao, Z.; Chen, J.; Yi, W.; Li, S.; Mi, L.; Yang, S.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences) (Cui et al. 2014). To archive the astronomical data in China, we present the implementation of the astronomical data archiving system (ADAS). Data archiving and quality control are the infrastructure of AstroCloud. Throughout the entire data life cycle, the data archiving system standardizes data, transfers data, logs observational data, archives ambient data, and stores these data and metadata in a database. Quality control covers the whole process and all aspects of data archiving.

  2. Building a COTS archive for satellite data

    NASA Technical Reports Server (NTRS)

    Singer, Ken; Terril, Dave; Kelly, Jack; Nichols, Cathy

    1994-01-01

    The goal of the NOAA/NESDIS Active Archive was to provide a method of access to an online archive of satellite data. The archive had to manage and store the data, let users interrogate the archive, and allow users to retrieve data from the archive. Practical issues of the system design, such as implementation time, cost and operational support, were examined in addition to the technical issues. There was a fixed window of opportunity to create an operational system, along with budget and staffing constraints. Therefore, the technical solution had to be designed and implemented subject to the constraints imposed by the practical issues. The NOAA/NESDIS Active Archive came online in July of 1994, meeting all of its original objectives.

  3. Automating Data Submission to a National Archive

    NASA Astrophysics Data System (ADS)

    Work, T. T.; Chandler, C. L.; Groman, R. C.; Allison, M. D.; Gegg, S. R.; Biological and Chemical Oceanography Data Management Office

    2010-12-01

    In late 2006, the U.S. National Science Foundation (NSF) funded the Biological and Chemical Oceanography Data Management Office (BCO-DMO) at Woods Hole Oceanographic Institution (WHOI) to work closely with investigators to manage oceanographic data generated from their research projects. One of the final data management tasks is to ensure that the data are permanently archived at the U.S. National Oceanographic Data Center (NODC) or another appropriate national archiving facility. In the past, BCO-DMO submitted data to NODC as an email with attachments, including a PDF file (a manually completed metadata record) and one or more data files. This method is no longer feasible given the rate at which data sets are contributed to BCO-DMO. Working with collaborators at NODC, a more streamlined and automated workflow was developed to keep up with the increased volume of data that must be archived at NODC. We will describe our new workflow: a semi-automated approach for contributing data to NODC that includes a Federal Geographic Data Committee (FGDC) compliant Extensible Markup Language (XML) metadata file accompanied by comma-delimited data files. The FGDC XML file is populated from information stored in a MySQL database. A crosswalk described by an Extensible Stylesheet Language Transformation (XSLT) is used to transform the XML-formatted MySQL result set into a FGDC-compliant XML metadata file. To ensure data integrity, the MD5 algorithm is used to generate a checksum and manifest of the files submitted to NODC for permanent archive. The revised system supports preparation of detailed, standards-compliant metadata that facilitate data sharing and enable accurate reuse of multidisciplinary information. The approach is generic enough to be adapted for use by other data management groups.
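
    The integrity step of the workflow, an MD5 checksum and manifest for the files sent to the archive, can be sketched as follows; the directory layout and manifest format are assumptions rather than BCO-DMO's actual conventions.

```python
# Sketch of the data-integrity step: compute an MD5 checksum for every file
# in a submission package and write a manifest. Directory layout and
# manifest format are illustrative only.
import hashlib
from pathlib import Path

def md5sum(path, chunk_size=1 << 20):
    h = hashlib.md5()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def write_manifest(package_dir, manifest_name="manifest-md5.txt"):
    package = Path(package_dir)
    lines = [f"{md5sum(p)}  {p.relative_to(package)}"
             for p in sorted(package.rglob("*"))
             if p.is_file() and p.name != manifest_name]
    (package / manifest_name).write_text("\n".join(lines) + "\n")

# write_manifest("submission_2010_001")  # hypothetical package directory
```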

  4. Determining the Completeness of the Nimbus Meteorological Data Archive

    NASA Technical Reports Server (NTRS)

    Johnson, James; Moses, John; Kempler, Steven; Zamkoff, Emily; Al-Jazrawi, Atheer; Gerasimov, Irina; Trivedi, Bhagirath

    2011-01-01

    NASA launched the Nimbus series of meteorological satellites in the 1960s and 70s. These satellites carried instruments for making observations of the Earth at visible, infrared, ultraviolet, and microwave wavelengths. The original data archive consisted of a combination of digital data written to 7-track computer tapes and imagery recorded on various film media. Many of these data sets are now being migrated from the old media to the GES DISC modern online archive. The process involves recovering the digital data files from tape as well as scanning images of the data from film strips. One of the challenges of archiving the Nimbus data is the lack of any metadata for these old data sets. Metadata standards and self-describing data files did not exist at that time, and files were written on now-obsolete hardware systems and in outdated file formats. This requires creating metadata by reading the contents of the old data files. Some digital data files were corrupted over time, or were possibly improperly copied at the time of creation; thus there are data gaps in the collections. The film strips were stored in boxes and are now being scanned as JPEG-2000 images. The only information describing these images is what was written on them when they were originally created, and sometimes this information is incomplete or missing. We have the ability to cross-reference the scanned images against the digital data files to determine which of these best represents the data set from the various missions, or to see how complete the data sets are. In this presentation we compare data files and scanned images from the Nimbus-2 High-Resolution Infrared Radiometer (HRIR) for September 1966 to determine whether the data and images are properly archived with correct metadata.

  5. NASA SNPP SIPS - Following in the Path of EOS

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Hall, Alfreda; Ho, Evelyn

    2016-01-01

    NASA's Earth Science Data and Information System (ESDIS) Project has been operating NASA's Suomi National Polar-Orbiting Partnership (SNPP) Science Data Segment (SDS) since the launch in October 2011. At launch, the SDS focused primarily on the evaluation of Sensor Data Records (SDRs) and Environmental Data Records (EDRs) produced by the Joint Polar Satellite System (JPSS), a National Oceanic and Atmospheric Administration (NOAA) program, as to their suitability for Earth system science. During the summer of 2014, NASA transitioned to the production of standard Earth Observing System (EOS)-like science products for all instruments aboard Suomi NPP. The five Science Investigator-led Processing Systems (SIPS): Land, Ocean, Atmosphere, Ozone, and Sounder were established to produce the NASA SNPP standard Level 1, Level 2, and global Level 3 products developed by the SNPP Science Teams and to provide the products to NASA's Distributed Active Archive Centers (DAACs) for archive and distribution to the user community. The processing, archiving and distribution of data from NASA's Clouds and the Earth's Radiant Energy System (CERES) and Ozone Mapper/Profiler Suite (OMPS) Limb instruments will continue. With the implementation of the JPSS Block 2 architecture and the launch of JPSS-1, the SDS will receive SNPP data in near real time via the JPSS Stored Mission Data Hub (JSH), as well as JPSS-1 and future JPSS-2 data. The SNPP SIPS will ingest EOS-compatible Level 0 data from the EOS Data Operations System (EDOS) element for their data processing, enabling the continuous EOS-SNPP-JPSS Satellite Data Record.

  6. Archived Data User Service self evaluation report : FAST

    DOT National Transportation Integrated Search

    2000-11-01

    The Archived Data User Service (ADUS) is a recent addition to the National Intelligent Transportation System (ITS) Architecture. This user service requires ITS systems to have the capability to receive, collect and archive ITS-generated operational...

  7. The Archival Appraisal of Photographs: A RAMP Study with Guidelines.

    ERIC Educational Resources Information Center

    Leary, William H.

    Prepared for Unesco's Records and Archives Management Programme (RAMP), this study is designed to provide archivists, manuscript and museum curators, and other interested information professionals in both industrialized and developing countries with an understanding of the archival character of photographs, and a set of guidelines for the…

  8. Finding and Addressing the Gaps: Two Evaluations of Archival Reference Services

    ERIC Educational Resources Information Center

    Battley, Belinda; Wright, Alicia

    2012-01-01

    Regular evaluation of archival reference services is essential to ensure that users have appropriate access to the information they need. Archives New Zealand has been measuring customer satisfaction for many years using self-completion questionnaires but recently trialed two new methods of evaluation, using external research companies. One…

  9. Archive, Access, and Supply of Scientifically Derived Data: A Data Model for Multi-Parameterized Querying Where Spectral Data Base Meets GIS-Based Mapping Archive

    NASA Astrophysics Data System (ADS)

    Nass, A.; D'Amore, M.; Helbert, J.

    2018-04-01

    Based on recent discussions within information science and management, an archiving structure and reference level for derived and already published data significantly supports the scientific community through a constant rise in knowledge and understanding.

  10. 36 CFR 1200.7 - What are NARA logos and how are they used?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Originals; ER11MY04.002 (4) Electronic Records Archives; ER11MY04.003 (5) The Archival Research Catalog; ER11MY04.004 (6) The Archives Library Information Center; ER11MY04.005 (7) Presidential Libraries; ER11MY04...

  11. 36 CFR 1200.7 - What are NARA logos and how are they used?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Originals; ER11MY04.002 (4) Electronic Records Archives; ER11MY04.003 (5) The Archival Research Catalog; ER11MY04.004 (6) The Archives Library Information Center; ER11MY04.005 (7) Presidential Libraries; ER11MY04...

  12. Archiving Mars Mission Data Sets with the Planetary Data System

    NASA Technical Reports Server (NTRS)

    Guinness, Edward A.

    2006-01-01

    This viewgraph presentation reviews the use of the Planetary Data System (PDS) to archive the datasets received from Mars missions. It presents an overview of the archiving process and reviews the lessons learned from the perspectives of the projects, the data producers and the data users.

  13. 76 FR 41826 - State, Local, Tribal, and Private Sector Policy Advisory Committee (SLTPS-PAC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Information Security Oversight Office State, Local, Tribal, and Private Sector Policy Advisory Committee (SLTPS-PAC) AGENCY: National Archives and Records... Information Program for State, Local, Tribal, and Private Sector Entities. DATES: The meeting will be held on...

  14. Toward a National Computerized Database for Moving Image Materials.

    ERIC Educational Resources Information Center

    Gartenberg, Jon

    This report summarizes a project conducted by a group of catalogers from film archives devoted to nitrate preservation, which explored ways of developing a database to provide a complete film and television information service that would be available nationwide and could contain filmographic data, information on holdings in archives and…

  15. NASA Remote Sensing Data in Earth Sciences: Processing, Archiving, Distribution, Applications at the GES DISC

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory G.

    2005-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is one of the major Distributed Active Archive Centers (DAACs) archiving and distributing remote sensing data from NASA's Earth Observing System. In addition to providing the data themselves, the GES DISC/DAAC has developed various value-adding processing services. A particularly useful service is data processing at the DISC (i.e., close to the input data) with the users' algorithms. This can take a number of different forms: as a configuration-managed algorithm within the main processing stream; as a stand-alone program next to the on-line data storage; as build-it-yourself code within the Near-Archive Data Mining (NADM) system; or as an on-the-fly analysis with simple algorithms embedded into web-based tools (to avoid unnecessarily downloading all the data). The existing data management infrastructure at the GES DISC supports a wide spectrum of options, from subsetting data spatially and/or by parameter to sophisticated on-line analysis tools, producing economies of scale and rapid time-to-deploy. Shifting the processing and data management burden from users to the GES DISC allows scientists to concentrate on science, while the GES DISC handles the data management and data processing at a lower cost. Several examples of successful partnerships with scientists in the area of data processing and mining are presented.
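
    As a hedged illustration of the simpler end of these services, subsetting a product by parameter and by region close to the data, the sketch below selects one variable over a latitude/longitude box with xarray; the file, variable, and coordinate names are placeholders for an actual archived granule.

```python
# Illustrative parameter and spatial subsetting of a gridded product, roughly
# the kind of service described above. File, variable, and coordinate names
# are placeholders.
import xarray as xr

ds = xr.open_dataset("example_granule.nc")

# Keep a single parameter over a latitude/longitude box. The slice direction
# must match the order in which the coordinates are stored in the file.
subset = ds["precipitation"].sel(lat=slice(20, 50), lon=slice(-130, -60))

subset.to_netcdf("precipitation_subset.nc")
print(subset.sizes)
```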

  16. The archiving of meteor research information

    NASA Technical Reports Server (NTRS)

    Nechitailenko, V. A.

    1987-01-01

    The results obtained over the past years under GLOBMET are not reviewed here; instead, some of the problems whose solution will guide further development of meteor investigation and international cooperation in this field for the near term are discussed. The main attention is paid to problems which the meteor community itself can solve, or at least expedite. Most of them are more or less connected with the problem of information archiving. Information archiving deals with methods and techniques for solving two closely connected groups of problems. The first is the analysis of data and information as an integral part of meteor research, and deals with the solution of certain methodological problems. The second deals with gathering data and information for the design of models of the atmosphere and/or the meteor complex and its utilization. Solutions to these problems are discussed.

  17. Fingerprint verification on medical image reporting system.

    PubMed

    Chen, Yen-Cheng; Chen, Liang-Kuang; Tsai, Ming-Dar; Chiu, Hou-Chang; Chiu, Jainn-Shiun; Chong, Chee-Fah

    2008-03-01

    The healthcare industry is currently going through extensive changes through the adoption of robust, interoperable healthcare information technology by means of electronic medical records (EMR). However, a major concern with EMR is adequate confidentiality of the individual records being managed electronically. Multiple access points over an open network like the Internet increase the possibility of patient data interception. The obligation is on healthcare providers to procure information security solutions that do not hamper patient care while still providing the confidentiality of patient information. Medical images are also part of the EMR and need to be protected from unauthorized users. This study integrates the techniques of fingerprint verification, DICOM objects, digital signatures and digital envelopes in order to ensure that access to the hospital Picture Archiving and Communication System (PACS) or radiology information system (RIS) is only by certified parties.
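
    The digital-signature idea mentioned above can be sketched with the Python cryptography package: hash the image object's bytes and sign the digest with a private key so the receiving PACS or RIS can verify the sender and detect tampering. This is a generic illustration, not the study's implementation, and the key handling and file name are placeholders.

```python
# Generic illustration of signing a DICOM object's bytes so a receiver can
# verify origin and integrity; not the study's implementation. In practice
# the key pair would come from the hospital's PKI rather than being generated
# on the fly.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

with open("image.dcm", "rb") as fh:       # placeholder file name
    dicom_bytes = fh.read()

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

signature = private_key.sign(dicom_bytes, pss, hashes.SHA256())

# The receiver verifies with the sender's public key; verify() raises an
# exception if the object or the signature has been altered.
public_key.verify(signature, dicom_bytes, pss, hashes.SHA256())
print("signature verified")
```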

  18. An Intelligent Archive Testbed Incorporating Data Mining

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H.; Isaac, D.; Yang, W.; Bonnlander, B.; Danks, D.

    2009-01-01

    Many significant advances have occurred during the last two decades in remote sensing instrumentation, computation, storage, and communication technology. A series of Earth observing satellites have been launched by U.S. and international agencies and have been operating and collecting global data on a regular basis. These advances have created a data-rich environment for scientific research and applications. NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) has been operational since August 1994, with support for pre-EOS data. Currently, EOSDIS supports all the EOS missions including Terra (1999), Aqua (2002), ICESat (2002) and Aura (2004). EOSDIS has been effectively capturing, processing and archiving several terabytes of standard data products each day. It has also been distributing these data products at a rate of several terabytes per day to a diverse and globally distributed user community (Ramapriyan et al. 2009). There are other NASA-sponsored data system activities, including measurement-based systems such as the Ocean Data Processing System and the Precipitation Processing System, and several projects under the Research, Education and Applications Solutions Network (REASoN), Making Earth Science Data Records for Use in Research Environments (MEaSUREs), and Advancing Collaborative Connections for Earth-Sun System Science (ACCESS) programs. Together, these activities provide a rich set of resources constituting a value chain for users to obtain data at various levels, ranging from raw radiances to interdisciplinary model outputs. The result has been a significant leap in our understanding of the Earth systems that all humans depend on for their enjoyment, livelihood, and survival. The trend in the community today is towards many distributed sets of providers of data and services. Despite this, visions for the future include users being able to locate, fuse and utilize data with location transparency and a high degree of interoperability, and being able to convert data to information and usable knowledge in an efficient, convenient manner, aided significantly by automation (Ramapriyan et al. 2004; NASA 2005). We can look upon the distributed provider environment, with capabilities to convert data to information and to knowledge, as an Intelligent Archive in the Context of a Knowledge Building System (IA-KBS). Some of the key capabilities of an IA-KBS are: Virtual Product Generation, Significant Event Detection, Automated Data Quality Assessment, Large-Scale Data Mining, Dynamic Feedback Loop, and Data Discovery and Efficient Requesting (Ramapriyan et al. 2004).

  19. The Environmental Data Initiative: A broad-use data repository for environmental and ecological data that strives to balance data quality and ease of submission

    NASA Astrophysics Data System (ADS)

    Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.

    2017-12-01

    In the world of data repositories, there seems to be a never-ending struggle between the generation of high-quality data documentation and the ease of archiving a data product in a repository: the higher the documentation standards, the greater the effort required by the scientist, and the less likely the data will be archived. The Environmental Data Initiative (EDI) attempts to balance the rigor of data documentation against the amount of effort required by a scientist to upload and archive data. As an outgrowth of the LTER Network Information System, the EDI is funded by the US NSF Division of Environmental Biology to support the LTER, LTREB, OBFS, and MSB programs, in addition to providing an open data archive for environmental scientists without a viable archive. EDI uses the PASTA repository software, developed originally by the LTER. PASTA is metadata driven and documents data with the Ecological Metadata Language (EML), a high-fidelity standard that can describe all types of data in great detail. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML, in a process that is termed "metadata and data congruence", and incongruent data packages are not allowed into the repository. EDI reduces the burden of data documentation on scientists in two ways: first, EDI provides hands-on assistance and data documentation best-practice tools for generating EML, written in R and currently being developed in Python. These tools hide the details of EML generation and syntax by providing a more natural and contextual setting for describing data. Second, EDI works closely with community information managers in defining the rules used in PASTA quality tests. Rules deemed too strict can be turned off completely or set to only issue a warning while the community learns how best to handle the situation and improve its documentation practices. Rules can also be added or refined over time to improve the overall quality of archived data. The outcomes of the quality tests are stored as part of the data archive in PASTA and are accessible to all users of the EDI data repository. In summary, EDI's metadata support to scientists and the comprehensive set of data quality tests for metadata and data congruency provide an ideal archive for environmental and ecological data.
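
    A toy version of the metadata-and-data congruence idea is sketched below: compare the column names of a submitted CSV table against the attribute names declared in its EML document. The file names are placeholders, and this is only one of the many checks a repository such as PASTA performs.

```python
# Toy congruence check between a CSV data table and the attributeName
# elements declared in its EML metadata. File names are placeholders; real
# repositories run a much larger battery of checks.
import csv
import xml.etree.ElementTree as ET

def declared_attributes(eml_path):
    root = ET.parse(eml_path).getroot()
    # attributeName elements live under dataTable/attributeList/attribute;
    # match on the local name to stay independent of the EML namespace.
    return [el.text.strip() for el in root.iter()
            if el.tag.rsplit("}", 1)[-1] == "attributeName" and el.text]

def csv_header(csv_path):
    with open(csv_path, newline="") as fh:
        return next(csv.reader(fh))

def congruent(eml_path, csv_path):
    meta, data = declared_attributes(eml_path), csv_header(csv_path)
    if meta != data:
        print("mismatched columns:", set(meta) ^ set(data))
    return meta == data

# congruent("package_metadata.xml", "package_table.csv")
```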

  20. Asan medical information system for healthcare quality improvement.

    PubMed

    Ryu, Hyeon Jeong; Kim, Woo Sung; Lee, Jae Ho; Min, Sung Woo; Kim, Sun Ja; Lee, Yong Su; Lee, Young Ha; Nam, Sang Woo; Eo, Gi Seung; Seo, Sook Gyoung; Nam, Mi Hyun

    2010-09-01

    The purpose of this paper is to introduce the status of the Asan Medical Center (AMC) medical information system with respect to healthcare quality improvement. The Asan Medical Information System (AMIS) is projected to become a completely electronic and digital information hospital. AMIS has played a role in improving health care quality based on the following measures: safety, effectiveness, patient-centeredness, timeliness, efficiency, privacy, and security. AMIS consists of several distinctive systems: an order communication system, electronic medical record, picture archiving and communication system, clinical research information system, data warehouse, enterprise resource planning, IT service management system, and disaster recovery system. The most distinctive features of AMIS are the high-alert medication recognition and management system, the integrated and severity-stratified alert system, the integrated patient monitoring system, the perioperative diabetic care monitoring and support system, and the clinical indicator management system. AMIS provides IT services for AMC, 7 affiliated hospitals and over 5,000 partner clinics, and was developed to improve healthcare services. The current challenges for AMIS are standardization and interoperability. A global health IT strategy is needed to get through the current challenges and to provide new services as needed.

  1. Archive of Digital Chirp Subbottom Profile Data Collected During USGS Cruise 14BIM05 Offshore of Breton Island, Louisiana, August 2014

    USGS Publications Warehouse

    Forde, Arnell S.; Flocks, James G.; Wiese, Dana S.; Fredericks, Jake J.

    2016-03-29

    The archived trace data are in standard SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in American Standard Code for Information Interchange (ASCII) format instead of Extended Binary Coded Decimal Interchange Code (EBCDIC) format. The SEG Y files are available on the DVD version of this report or online, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. The Web version of this archive does not contain the SEG Y trace files. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The printable profiles are provided as Graphics Interchange Format (GIF) images processed and gained using SU software and can be viewed from the Profiles page or by using the links located on the trackline maps; refer to the Software page for links to example SU processing scripts.
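
    Because the report notes that the first 3,200 bytes of each SEG Y card-image header are ASCII rather than EBCDIC, a quick way to inspect that header is sketched below; the file name is a placeholder for one of the archived trace files.

```python
# Read and print the 3,200-byte textual (card image) header of one archived
# SEG Y file. Per this report the header is ASCII; standard SEG Y rev. 0
# headers are normally EBCDIC, so decode with codec "cp037" if ASCII fails.
with open("14BIM05_line01.sgy", "rb") as fh:   # placeholder file name
    text_header = fh.read(3200).decode("ascii", errors="replace")

# The textual header is laid out as 40 "card images" of 80 characters each.
for i in range(40):
    print(text_header[i * 80:(i + 1) * 80].rstrip())
```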

  2. The Role of Data Archives in Synoptic Solar Physics

    NASA Astrophysics Data System (ADS)

    Reardon, Kevin

    The detailed study of solar cycle variations requires analysis of recorded datasets spanning many years of observations, that is, a data archive. The use of digital data, combined with powerful database server software, gives such archives new capabilities to provide, quickly and flexibly, selected pieces of information to scientists. Use of standardized protocols will allow multiple databases, independently maintained, to be seamlessly joined, allowing complex searches spanning multiple archives. These data archives also benefit from being developed in parallel with the telescope itself, which helps to assure data integrity and to provide close integration between the telescope and archive. Development of archives that can guarantee long-term data availability and strong compatibility with other projects makes solar-cycle studies easier to plan and realize.

  3. Information Facilities. First Year Report on Contract NIRT 1, Assistance in Telecommunications Planning. Volume 4, Communication Satellite Planning Center, Technical Report No. 2.

    ERIC Educational Resources Information Center

    Lusignan, Bruce B.

    In response to a request by the National Iranian Radio and Television (NIRT), a study was conducted to see what types of film and video archives presently exist. Archives can be classified into three types: (1) production archives used as resources in the making of other films and tapes, (2) film studio archives used to store prints for future…

  4. The Cortex project A quasi-real-time information system to build control systems for high energy physics experiments

    NASA Astrophysics Data System (ADS)

    Barillere, R.; Cabel, H.; Chan, B.; Goulas, I.; Le Goff, J. M.; Vinot, L.; Willmott, C.; Milcent, H.; Huuskonen, P.

    1994-12-01

    The Cortex control information system framework is being developed at CERN. It offers basic functions to allow the sharing of information, control and analysis functions; it presents a uniform human interface for such information and functions; it permits upgrades and additions without code modification and it is sufficiently generic to allow its use by most of the existing or future control systems at CERN. Services will include standard interfaces to user-supplied functions, analysis, archive and event management. Cortex does not attempt to carry out the direct data acquisition or control of the devices; these are activities which are highly specific to the application and are best done by commercial systems or user-written programs. Instead, Cortex integrates these application-specific pieces and supports them by supplying other commonly needed facilities such as collaboration, analysis, diagnosis and user assistance.

  5. In Brief: Online database for instantaneous streamflow data

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2007-11-01

    Access to U.S. Geological Survey (USGS) historical instantaneous streamflow discharge data, dating from around 1990, is now available online through the Instantaneous Data Archive (IDA), the USGS announced on 14 November. In this new system, users can find streamflow information reported at the time intervals at which it is collected, typically 15-minute to hourly intervals. Although instantaneous data have been available for many years, they were not accessible through the Internet. Robert Hirsch, USGS Associate Director of Water, said, "A user-friendly archive of historical instantaneous streamflow data is important to many different users for such things as floodplain mapping, flood modeling, and estimating pollutant transport." The site currently has about 1.5 billion instantaneous data values from 5500 stream gages in 26 states. The number of states and stream gages with data will continue to increase, according to the USGS. For more information, visit the Web site: http://ida.water.usgs.gov/ida/.

  6. Cassini Archive Tracking System

    NASA Technical Reports Server (NTRS)

    Conner, Diane; Sayfi, Elias; Tinio, Adrian

    2006-01-01

    The Cassini Archive Tracking System (CATS) is a computer program that enables tracking of scientific data transfers from originators to the Planetary Data System (PDS) archives. Without CATS, there is no systematic means of locating products in the archive process or ensuring their completeness. By keeping a database of transfer communications and status, CATS enables the Cassini Project and the PDS to efficiently and accurately report on archive status. More importantly, problem areas are easily identified through customized reports that can be generated on the fly from any Web-enabled computer. A Web-browser interface and clearly defined authorization scheme provide safe distributed access to the system, where users can perform functions such as create customized reports, record a transfer, and respond to a transfer. CATS ensures that Cassini provides complete science archives to the PDS on schedule and that those archives are available to the science community by the PDS. The three-tier architecture is loosely coupled and designed for simple adaptation to multimission use. Written in the Java programming language, it is portable and can be run on any Java-enabled Web server.
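
    A hypothetical miniature of the kind of transfer-status record and report described above is sketched below; the record fields, product names, and the 30-day threshold are invented for illustration and are not the CATS schema.

        from dataclasses import dataclass, field
        from datetime import date

        @dataclass
        class Transfer:
            """One archive delivery being tracked from an originator to the PDS."""
            product: str
            originator: str
            sent: date
            status: str = "awaiting response"        # e.g. accepted / rejected / resubmitted
            notes: list = field(default_factory=list)

        def overdue(transfers, today, max_days=30):
            """Report-style query: transfers still awaiting a response after max_days."""
            return [t for t in transfers
                    if t.status == "awaiting response" and (today - t.sent).days > max_days]

        log = [Transfer("cruise_imaging_volume", "Imaging team", date(2006, 1, 5)),
               Transfer("radio_science_raw", "Radio Science team", date(2006, 2, 20), "accepted")]
        print([t.product for t in overdue(log, date(2006, 3, 15))])   # ['cruise_imaging_volume']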

  7. How to Get Data from NOAA Environmental Satellites: An Overview of Operations, Products, Access and Archive

    NASA Astrophysics Data System (ADS)

    Donoho, N.; Graumann, A.; McNamara, D. P.

    2015-12-01

    In this presentation we will highlight access and availability of NOAA satellite data for near real time (NRT) and retrospective product users. The presentation includes an overview of the current fleet of NOAA satellites and methods of data distribution and access to hundreds of imagery and products offered by the Environmental Satellite Processing Center (ESPC) and the Comprehensive Large Array-data Stewardship System (CLASS). In particular, emphasis on the various levels of services for current and past observations will be presented. The National Environmental Satellite, Data, and Information Service (NESDIS) is dedicated to providing timely access to global environmental data from satellites and other sources. In special cases, users are authorized direct access to NESDIS data distribution systems for environmental satellite data and products. Other means of access include publicly available distribution services such as the Global Telecommunication System (GTS), NOAA satellite direct broadcast services and various NOAA websites and ftp servers, including CLASS. CLASS is NOAA's information technology system designed to support long-term, secure preservation and standards-based access to environmental data collections and information. The National Centers for Environmental Information (NCEI) is responsible for the ingest, quality control, stewardship, archival and access to data and science information. This work will also show the latest technology improvements, enterprise approach and future plans for distribution of exponentially increasing data volumes from future NOAA missions. A primer on access to NOAA operational satellite products and services is available at http://www.ospo.noaa.gov/Organization/About/access.html. Access to post-operational satellite data and assorted products is available at http://www.class.noaa.gov

  8. The Protein Data Bank

    PubMed Central

    Berman, Helen M.; Westbrook, John; Feng, Zukang; Gilliland, Gary; Bhat, T. N.; Weissig, Helge; Shindyalov, Ilya N.; Bourne, Philip E.

    2000-01-01

    The Protein Data Bank (PDB; http://www.rcsb.org/pdb/ ) is the single worldwide archive of structural data of biological macromolecules. This paper describes the goals of the PDB, the systems in place for data deposition and access, how to obtain further information, and near-term plans for the future development of the resource. PMID:10592235

  9. A Description of Sexual Offending Committed by Canadian Teachers

    ERIC Educational Resources Information Center

    Moulden, Heather M.; Firestone, Philip; Kingston, Drew A.; Wexler, Audrey F.

    2010-01-01

    The aim of this investigation was to describe teachers who sexually offend against youth and the circumstances related to these offenses. Archival Violent Crime Linkage Analysis System reports were obtained from the Royal Canadian Mounted Police, and demographic and criminal characteristics for the offender, as well as information about the victim…

  10. Archival Theory and the Shaping of Educational History: Utilizing New Sources and Reinterpreting Traditional Ones

    ERIC Educational Resources Information Center

    Glotzer, Richard

    2013-01-01

    Information technology has spawned new evidentiary sources, better retrieval systems for existing ones, and new tools for interpreting traditional source materials. These advances have contributed to a broadening of public participation in civil society (Blouin and Rosenberg 2006). In these culturally unsettled and economically fragile times…

  11. 33 CFR 164.03 - Incorporation by reference.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... inspection at the Navigation Systems Division (CG-5413), Coast Guard Headquarters, 2100 2nd St. SW., Stop... information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov... Testing Fiber Ropes 164.74 Cordage Institute, 350 Lincoln Street, Hingham, MA 02043 CIA-3, Standard Test...

  12. Development, Implementation, and Analysis of an Environmental Simulation Information Reference Library and Archive (ESIRLA)

    DTIC Science & Technology

    1997-12-01

    of the DoD environmental science community to identify cloud modeling and other environmental capabilities that support or could potentially support...benefit of the DoD environmental science community. STC determined the detailed requirements for weather effects products and decision aids for specific Air Force operational electro-optical systems.

  13. Guided Resource Inquiries: Integrating Archives into Course Learning and Information Literacy Objectives

    ERIC Educational Resources Information Center

    Jarosz, Ellen E.; Kutay, Stephen

    2017-01-01

    At California State University, Northridge (CSUN), many students lack the skills needed to locate, analyze, and apply essential contexts associated with primary sources. Using these sources requires critical inquiry, which is a fundamental theme in pedagogy, the California State University system's Core Competencies, and the Association of College…

  14. Improve wildlife species tracking—Implementing an enhanced global positioning system data management system for California condors

    USGS Publications Warehouse

    Waltermire, Robert G.; Emmerich, Christopher U.; Mendenhall, Laura C.; Bohrer, Gil; Weinzierl, Rolf P.; McGann, Andrew J.; Lineback, Pat K.; Kern, Tim J.; Douglas, David C.

    2016-05-03

    U.S. Fish and Wildlife Service (USFWS) staff in the Pacific Southwest Region and at the Hopper Mountain National Wildlife Refuge Complex requested technical assistance to improve their global positioning system (GPS) data acquisition, management, and archive in support of the California Condor Recovery Program. The USFWS deployed and maintained GPS units on individual Gymnogyps californianus (California condor) in support of long-term research and daily operational monitoring and management of California condors. The U.S. Geological Survey (USGS) obtained funding through the Science Support Program to provide coordination among project participants, provide GPS Global System for Mobile Communication (GSM) transmitters for testing, and compare GSM/GPS with existing Argos satellite GPS technology. The USFWS staff worked with private companies to design, develop, and fit condors with GSM/GPS transmitters. The Movebank organization, an online database of animal tracking data, coordinated with each of these companies to automatically stream their GPS data into Movebank servers and coordinated with USFWS to improve Movebank software for managing transmitter data, including proofing/error checking of incoming GPS data. The USGS arranged to pull raw GPS data from Movebank into the USGS California Condor Management and Analysis Portal (CCMAP) (https://my.usgs.gov/ccmap) for production and dissemination of a daily map of condor movements including various automated alerts. Further, the USGS developed an automatic archiving system for pulling raw and proofed Movebank data into USGS ScienceBase to comply with the Federal Information Security Management Act of 2002. This improved data management system requires minimal manual intervention resulting in more efficient data flow from GPS data capture to archive status. As a result of the project’s success, Pinnacles National Park and the Ventana Wildlife Society California condor programs became partners and adopted the same workflow, tracking, and data archive system. This GPS tracking data management model and workflow should be applicable and beneficial to other wildlife tracking programs.
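
    The automated flow described above (streamed GPS fixes proofed and then pushed to an archive with minimal manual intervention) can be sketched schematically as follows; fetch_new_fixes and archive_fixes are hypothetical stand-ins for the Movebank and ScienceBase steps, and only the coarse proofing logic is shown concretely.

        from datetime import datetime, timezone

        def plausible(fix):
            """Very coarse proofing: keep fixes with valid coordinates and a timestamp not in the future."""
            ok_lat = -90.0 <= fix["lat"] <= 90.0
            ok_lon = -180.0 <= fix["lon"] <= 180.0
            ok_time = fix["timestamp"] <= datetime.now(timezone.utc)
            return ok_lat and ok_lon and ok_time

        def nightly_archive_run(fetch_new_fixes, archive_fixes):
            """One automated pass: pull new fixes, screen them, archive raw and proofed copies."""
            raw = fetch_new_fixes()                       # e.g. all fixes since the last run
            proofed = [f for f in raw if plausible(f)]
            archive_fixes(raw, proofed)
            return len(raw), len(proofed)

        # Tiny usage example with stand-in callables:
        demo_fixes = [{"lat": 34.7, "lon": -118.9,
                       "timestamp": datetime(2016, 5, 1, tzinfo=timezone.utc)}]
        print(nightly_archive_run(lambda: demo_fixes, lambda raw, ok: None))   # (1, 1)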

  15. Building Community Around Hydrologic Data Models Within CUAHSI

    NASA Astrophysics Data System (ADS)

    Maidment, D.

    2007-12-01

    The Consortium of Universities for the Advancement of Hydrologic Science, Inc (CUAHSI) has a Hydrologic Information Systems project which aims to provide better data access and capacity for data synthesis for the nation's water information, both that collected by academic investigators and that collected by water agencies. These data include observations of streamflow, water quality, groundwater levels, weather and climate and aquatic biology. Each water agency or research investigator has a unique method of formatting their data (syntactic heterogeneity) and describing their variables (semantic heterogeneity). The result is a large agglomeration of data in many formats and descriptions whose full content is hard to interpret and analyze. CUAHSI is helping to resolve syntactic heterogeneity through the development of WaterML, a standard XML markup language for communicating water observations data through web services, and a standard relational database structure for archiving data called the Observations Data Model. Variables in these data archiving and communicating systems are indexed against a controlled vocabulary of descriptive terms to provide the capacity to synthesize common data types from disparate data sources.
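
    As a toy illustration of the indexing idea described above, the sketch below builds a stripped-down, hypothetical observations store in SQLite in which every stored value is tied to a site and a controlled-vocabulary variable; the table and column names are simplified assumptions, not the published Observations Data Model schema.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE sites     (site_id INTEGER PRIMARY KEY, site_code TEXT, latitude REAL, longitude REAL);
        CREATE TABLE variables (variable_id INTEGER PRIMARY KEY, variable_name TEXT, unit TEXT);
        CREATE TABLE values_   (value_id INTEGER PRIMARY KEY, site_id INTEGER, variable_id INTEGER,
                                local_datetime TEXT, data_value REAL);
        """)
        con.execute("INSERT INTO sites VALUES (1, 'EXAMPLE-GAGE-01', 30.24, -97.69)")
        con.execute("INSERT INTO variables VALUES (1, 'Discharge', 'cubic feet per second')")  # controlled-vocabulary term
        con.execute("INSERT INTO values_ VALUES (1, 1, 1, '2007-06-01T00:15:00', 245.0)")

        # Every value is indexed against a site and a controlled-vocabulary variable,
        # which is what allows disparate sources to be queried with common semantics.
        for row in con.execute("""
            SELECT s.site_code, v.variable_name, x.local_datetime, x.data_value, v.unit
            FROM values_ x JOIN sites s USING (site_id) JOIN variables v USING (variable_id)
        """):
            print(row)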

  16. NASA's EOSDIS: options for data providers

    NASA Astrophysics Data System (ADS)

    Khalsa, Siri J.; Ujhazy, John E.

    1995-12-01

    EOSDIS, the data and information system being developed by NASA to support interdisciplinary earth science research into the 21st century, will do more than manage and distribute data from EOS-era satellites. It will also promote the exchange of data, tools, and research results across disciplinary, agency, and national boundaries. This paper describes the options that data providers will have for interacting with the EOSDIS Core System (ECS), the infrastructure of EOSDIS. The options include: using the ECS advertising service to announce the availability of data at the provider's site; submitting a candidate data set to one of the Distributed Active Archive Centers (DAACs); establishing a data server that will make the data accessible via ECS and establishing Local Information Manager (LIM) which would make the data available for multi-site searches. One additional option is through custom gateway interfaces which would provide access to existing data archives. The gateway, data server, and LIM options require the implementation of ECS code at the provider site to insure proper protocols. The advertisement and ingest options require no part of ECS design to reside at the provider site.

  17. MRMS Experimental Testbed for Operational Products (METOP)

    NASA Astrophysics Data System (ADS)

    Zhang, J.

    2016-12-01

    Accurate high-resolution quantitative precipitation estimation (QPE) at the continental scale is of critical importance to the nation's weather, water and climate services. To address this need, a Multi-Radar Multi-Sensor (MRMS) system was developed at the National Severe Storms Laboratory of the National Oceanic and Atmospheric Administration that integrates radar, gauge, model and satellite data and provides a suite of QPE products at 1-km and 2-min resolution. The MRMS system consists of three components: 1) an operational system; 2) a real-time research system; 3) an archive testbed. The operational system currently provides instantaneous precipitation rate, type and 1- to 72-hr accumulations for the conterminous United States and southern Canada. The research system has a similar hardware infrastructure and data environment to the operational system, but runs newer and more advanced algorithms. The newer algorithms are tested on the research system for robustness and computational efficiency in a pseudo operational environment before they are transitioned into operations. The archive testbed, also called the MRMS Experimental Testbed for Operational Products (METOP), consists of a large database that encompasses a wide range of hydroclimatological and geographical regimes. METOP is for the testing and refinement of the most advanced radar QPE techniques, which are often developed on specific data from limited times and locations. The archive data include quality-controlled in-situ observations for the validation of the new radar QPE across all seasons and geographic regions. A number of operational QPE products derived from different sensors/models are also included in METOP for the fusion of multiple sources of complementary precipitation information. This paper is an introduction to the METOP system.
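
    As a toy example of the kind of aggregation implied by the product suite above (instantaneous rates on a roughly 2-minute cycle rolled up into hourly accumulations), the sketch below sums synthetic rate samples into a 1-hour total; the numbers are invented and the real MRMS algorithms are far more involved.

        rates_mm_per_hr = [3.0] * 15 + [6.0] * 15   # thirty synthetic 2-minute rate samples = 1 hour

        def hourly_accumulation(rates, step_minutes=2.0):
            # Each sample is a rate (mm/hr) assumed constant over its 2-minute interval.
            return sum(r * step_minutes / 60.0 for r in rates)

        print(f"1-hr QPE: {hourly_accumulation(rates_mm_per_hr):.2f} mm")   # 4.50 mm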

  18. Phase fluctuation spectra: New radio science information to become available in the DSN tracking system Mark III-77

    NASA Technical Reports Server (NTRS)

    Berman, A. L.

    1977-01-01

    An algorithm was developed for the continuous and automatic computation of Doppler noise concurrently at four sample rate intervals, evenly spanning three orders of magnitude. Average temporal Doppler phase fluctuation spectra will be routinely available in the DSN tracking system Mark III-77 and require little additional processing. The basic (noise) data will be extracted from the archival tracking data file (ATDF) of the tracking data management system.
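
    A rough illustration of computing a noise statistic concurrently at several sample-rate intervals spanning three orders of magnitude is sketched below; the synthetic residuals and the choice of statistic (standard deviation of interval means) are assumptions for illustration, not the DSN algorithm itself.

        import random, statistics

        random.seed(1)
        samples = [random.gauss(0.0, 1.0) for _ in range(10_000)]   # synthetic 1-s Doppler residuals

        def noise_at_interval(data, interval_s, dt_s=1.0):
            """Standard deviation of the residuals averaged over consecutive intervals."""
            n = int(interval_s / dt_s)
            means = [statistics.fmean(data[i:i + n]) for i in range(0, len(data) - n + 1, n)]
            return statistics.stdev(means)

        for interval in (1, 10, 100, 1000):          # four intervals spanning three orders of magnitude
            print(f"{interval:5d} s : {noise_at_interval(samples, interval):.4f}")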

  19. Archive of digital chirp subbottom profile data collected during USGS Cruise 13GFP01, Brownlee Dam and Hells Canyon Reservoir, Idaho and Oregon, 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Fosness, Ryan L.; Welcker, Chris; Kelso, Kyle W.

    2014-01-01

    From March 16 - 31, 2013, the U.S. Geological Survey in cooperation with the Idaho Power Company conducted a geophysical survey to investigate sediment deposits and long-term sediment transport within the Snake River from Brownlee Dam to Hells Canyon Reservoir, along the Idaho and Oregon border; this effort will help the USGS to better understand geologic processes. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report.

  20. Archive of digital chirp subbottom profile data collected during USGS cruise 11BIM01 Offshore of the Chandeleur Islands, Louisiana, June 2011

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Miselis, Jennifer L.; Flocks, James G.; Wiese, Dana S.

    2013-01-01

    From June 3 to 13, 2011, the U.S. Geological Survey conducted a geophysical survey to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, LA. This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided.

  1. Computer Software Management and Information Center

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Computer programs for passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, transportable applications executive, plastic and failure analysis of composites, velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PRON for the Motorola exorcisor, aerodynamic shock-layer shape, kinematic modeling, hardware library for a graphics computer, and a file archival system are documented.

  2. [The Hospital Information System of the Brazilian National Unified Health System: a preliminary evaluation of performance in monitoring RhD hemolytic disease of the newborn].

    PubMed

    Lobato, Gustavo; Reichenheim, Michael Eduardo; Coeli, Claudia Medina

    2008-03-01

    This study aimed to evaluate the adequacy of the Hospital Information System of the National Unified Health System (SIH-SUS) in identifying cases of RhD hemolytic disease of the newborn (HDN) at the Fernandes Figueira Institute (IFF/FIOCRUZ) from 1998 to 2003. Neonatal records, data from the Medical Archives, and AIH (Hospital Admissions Authorization Form) data consolidated in the SIH-SUS were analyzed. Cases were identified according to the following fields: principal diagnosis, secondary diagnosis, and procedure performed. During the period studied, 194 cases of HDN were diagnosed. The Medical Archives registered 148 newborns with HDN; however, only 147 AIHs were issued and 145 were consolidated in the SIH-SUS. Among these 145 cases, 84 cited HDN as the principal diagnosis, while the secondary diagnosis identified 38 additional cases and the procedures performed failed to identify any further cases. Thus, the SIH-SUS identified only 122 (62.9%) of the 194 cases of HDN treated at the IFF/FIOCRUZ. Although it is necessary to evaluate other units, the SIH-SUS does not appear to be reliable for monitoring HDN. Additional studies are essential for employing secondary administrative data in the context of epidemiological surveillance.
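
    The case-finding step described above can be illustrated with a small sketch that flags an admission record when either diagnosis field carries a hemolytic-disease code; the ICD-10 prefix P55 is used here only as an assumption, since the abstract does not give the study's exact code list, and the sample records are invented.

        def is_hdn_case(aih_record, code_prefix="P55"):
            """Flag an AIH record when the principal or secondary diagnosis starts with the code prefix."""
            return any(
                str(aih_record.get(field, "")).startswith(code_prefix)
                for field in ("principal_diagnosis", "secondary_diagnosis")
            )

        records = [
            {"principal_diagnosis": "P550", "secondary_diagnosis": ""},
            {"principal_diagnosis": "P599", "secondary_diagnosis": "P551"},
            {"principal_diagnosis": "P599", "secondary_diagnosis": ""},
        ]
        flagged = sum(is_hdn_case(r) for r in records)
        print(f"{flagged} of {len(records)} admissions flagged as HDN")   # 2 of 3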

  3. The Hydrologic Cycle Distributed Active Archive Center

    NASA Technical Reports Server (NTRS)

    Hardin, Danny M.; Goodman, H. Michael

    1995-01-01

    The Marshall Space Flight Center Distributed Active Archive Center in Huntsville, Alabama, supports the acquisition, production, archival and dissemination of data relevant to the study of the global hydrologic cycle. This paper describes the Hydrologic Cycle DAAC, surveys its principal data holdings, addresses future growth, and gives information for accessing the data sets.

  4. The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository.

    PubMed

    Clark, Kenneth; Vendt, Bruce; Smith, Kirk; Freymann, John; Kirby, Justin; Koppel, Paul; Moore, Stephen; Phillips, Stanley; Maffitt, David; Pringle, Michael; Tarbox, Lawrence; Prior, Fred

    2013-12-01

    The National Institutes of Health have placed significant emphasis on sharing of research data to support secondary research. Investigators have been encouraged to publish their clinical and imaging data as part of fulfilling their grant obligations. Realizing it was not sufficient to merely ask investigators to publish their collection of imaging and clinical data, the National Cancer Institute (NCI) created the open source National Biomedical Image Archive software package as a mechanism for centralized hosting of cancer related imaging. NCI has contracted with Washington University in Saint Louis to create The Cancer Imaging Archive (TCIA)-an open-source, open-access information resource to support research, development, and educational initiatives utilizing advanced medical imaging of cancer. In its first year of operation, TCIA accumulated 23 collections (3.3 million images). Operating and maintaining a high-availability image archive is a complex challenge involving varied archive-specific resources and driven by the needs of both image submitters and image consumers. Quality archives of any type (traditional library, PubMed, refereed journals) require management and customer service. This paper describes the management tasks and user support model for TCIA.

  5. The Planetary Archive

    NASA Astrophysics Data System (ADS)

    Penteado, Paulo F.; Trilling, David; Szalay, Alexander; Budavári, Tamás; Fuentes, César

    2014-11-01

    We are building the first system that will allow efficient data mining in the astronomical archives for observations of Solar System bodies. While the Virtual Observatory has enabled data-intensive research making use of large collections of observations across multiple archives, Planetary Science has largely been denied this opportunity: most astronomical data services are built based on sky positions, and moving objects are often filtered out. To identify serendipitous observations of Solar System objects, we ingest the archive metadata. The coverage of each image in an archive is a volume in a 3D space (RA, Dec, time), which we can represent efficiently through a hierarchical triangular mesh (HTM) for the spatial dimensions, plus a contiguous time interval. In this space, an asteroid occupies a curve, which we determine by integrating its orbit into the past. Thus, when an asteroid trajectory intercepts the volume of an archived image, we have a possible observation of that body. Our pipeline then looks in the archive's catalog for a source with the corresponding coordinates, to retrieve its photometry. All these matches are stored in a database, which can be queried by object identifier. This database consists of archived observations of known Solar System objects. This means that it grows not only from the ingestion of new images, but also from the growth in the number of known objects. As new bodies are discovered, our pipeline can find archived observations where they could have been recorded, providing colors for these newly-found objects. This growth becomes more relevant with the new generation of wide-field surveys, particularly LSST. We also present one use case of our prototype archive: after ingesting the metadata for SDSS, 2MASS and GALEX, we were able to identify serendipitous observations of Solar System bodies in these 3 archives. Cross-matching these occurrences provided us with colors from the UV to the IR, a much wider spectral range than that commonly used for asteroid taxonomy. We present here archive-derived spectrophotometry from searching for 440 thousand asteroids, from 0.3 to 3 µm. In the future we will expand to other archives, including HST, Spitzer, WISE and Pan-STARRS.
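
    A simplified sketch of the matching idea follows, with the hierarchical triangular mesh replaced by a plain RA/Dec bounding box and with the footprint, times, and ephemeris samples all invented test data.

        from dataclasses import dataclass

        @dataclass
        class Footprint:
            """Space-time volume of one archived image: an RA/Dec box plus a time interval."""
            ra_min: float
            ra_max: float
            dec_min: float
            dec_max: float
            t_start: float
            t_end: float

        def crosses(footprint, ephemeris):
            """Return ephemeris samples (t, ra, dec) inside the image's space-time volume,
            i.e. candidate serendipitous observations of the moving body."""
            return [
                (t, ra, dec) for t, ra, dec in ephemeris
                if footprint.t_start <= t <= footprint.t_end
                and footprint.ra_min <= ra <= footprint.ra_max
                and footprint.dec_min <= dec <= footprint.dec_max
            ]

        image = Footprint(150.0, 150.2, 2.0, 2.2, 2455000.0, 2455000.01)   # degrees, Julian dates
        asteroid_track = [(2455000.005, 150.1, 2.1), (2455000.5, 151.0, 2.5)]
        print(crosses(image, asteroid_track))   # the first sample matches, the second does not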

  6. [Development of a medical equipment support information system based on PDF portable document].

    PubMed

    Cheng, Jiangbo; Wang, Weidong

    2010-07-01

    In line with the organizational structure and management system of hospital medical engineering support, the medical engineering support workflow was integrated to ensure that medical engineering data are collected effectively, accurately, and comprehensively and kept in electronic archives. The workflow of medical equipment support work was analyzed, and all work processes were recorded in portable electronic documents. Using XML middleware technology and a SQL Server database, the system performs process management, data calculation, submission, storage, and other functions. Practical application shows that the medical equipment support information system optimizes the existing work process, making it standardized, digital, automatic, efficient, orderly, and controllable. The medical equipment support information system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide complete and accurate digital data.

  7. EOS Data Products Latency and Reprocessing Evaluation

    NASA Astrophysics Data System (ADS)

    Ramapriyan, H. K.; Wanchoo, L.

    2012-12-01

    NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) program has been processing, archiving, and distributing EOS data since the launch of the Terra platform in 1999. The EOSDIS Distributed Active Archive Centers (DAACs) and Science-Investigator-led Processing Systems (SIPSs) are generating over 5000 unique products with a daily average volume of 1.7 Petabytes. Initially, EOSDIS had requirements to process data products within 24 hours of receiving all inputs needed for generating them. Thus, generally, the latency would be slightly over 24 and 48 hours after satellite data acquisition, respectively, for Level 1 and Level 2 products. Due to budgetary constraints, these requirements were relaxed, with the requirement being to avoid a growing backlog of unprocessed data. However, the data providers have been generating these products in as timely a manner as possible. The reduction in costs of computing hardware has helped considerably. It is of interest to analyze the actual latencies achieved over the past several years in processing and inserting the data products into the EOSDIS archives for the users to support various scientific studies such as land processes, oceanography, hydrology, atmospheric science, cryospheric science, etc. The instrument science teams have continuously evaluated the data products since the launches of EOS satellites and improved the science algorithms to provide high quality products. Data providers have periodically reprocessed the previously acquired data with these improved algorithms. The reprocessing campaigns run for an extended time period in parallel with forward processing, since all data starting from the beginning of the mission need to be reprocessed. Each reprocessing activity involves more data than the previous reprocessing. The historical record of the reprocessing times would be of interest to future missions, especially those involving large volumes of data and/or computational loads due to complexity of algorithms. Evaluation of latency and reprocessing times requires some of the product metadata information, such as the beginning and ending time of data acquisition, processing date, and version number. This information for each product is made available by data providers to the ESDIS Metrics System (EMS). The EMS replaced the earlier ESDIS Data Gathering and Reporting System (EDGRS) in FY2005. Since then it has collected information about data products' ingest, archive, and distribution. The analysis of latencies and reprocessing times will provide insight into the data provider process and identify potential areas of weakness in providing timely data to the user community. Delays may be caused by events such as system unavailability, disk failures, delay in level 0 data delivery, availability of input data, network problems, and power failures. Analysis of metrics will highlight areas for focused examination of root causes for delays. The purposes of this study are to: 1) perform a detailed analysis of latency of selected instrument products for the last 6 years; 2) analyze the reprocessed data from various data providers to determine the times taken for reprocessing campaigns; 3) identify potential reasons for any anomalies in these metrics.
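
    The latency metric discussed above reduces to the elapsed time between the end of data acquisition and the product's processing or archive date; the sketch below computes it from a hypothetical metadata record whose field names and values are assumptions for illustration, not the actual EMS schema.

        from datetime import datetime

        def product_latency_hours(metadata):
            """Hours from the end of data acquisition to the product's processing date."""
            acquired_end = datetime.fromisoformat(metadata["acquisition_end"])
            processed = datetime.fromisoformat(metadata["processing_date"])
            return (processed - acquired_end).total_seconds() / 3600.0

        record = {
            "acquisition_end": "2012-07-01T12:00:00",
            "processing_date": "2012-07-02T15:30:00",
            "version": "005",
        }
        print(f"Latency: {product_latency_hours(record):.1f} hours")   # 27.5 hours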

  8. Oral History and American Advertising: How the "Pepsi Generation" Came Alive.

    ERIC Educational Resources Information Center

    Dreyfus, Carol; Connors, Thomas

    1985-01-01

    Described is a project in which the Archives Center of the National Museum of American History and the George Meany Memorial Archives analyzed a collection of advertising materials of the Pepsi-Cola USA company and conducted interviews to gather historically valuable information concerning the company. Valuable social history information was…

  9. Student Press in American Archives, Fall/Winter 1973-74.

    ERIC Educational Resources Information Center

    National Council of Coll. Publications Advisers.

    This issue of the "Student Press in America Archives List" contains 100 entries on current issues and information, as well as cases involving student press editors, advisers, student media, and the generic subject of the campus press, emphasizing censorship practices and principles. Information concerning how and where to obtain documents of…

  10. 77 FR 20104 - Privacy Act of 1974, as Amended; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-03

    ... Archives and Records Administration regulations. System Manager and Address: Director, Enforcement and... retained in accordance with the OCC's records management policies and National Archives and Records... accordance with the OCC's records management policies and National Archives and Records Administration...

  11. Image standards in tissue-based diagnosis (diagnostic surgical pathology).

    PubMed

    Kayser, Klaus; Görtler, Jürgen; Goldmann, Torsten; Vollmer, Ekkehard; Hufnagl, Peter; Kayser, Gian

    2008-04-18

    Progress in automated image analysis, virtual microscopy, hospital information systems, and interdisciplinary data exchange require image standards to be applied in tissue-based diagnosis. To describe the theoretical background, practical experiences and comparable solutions in other medical fields to promote image standards applicable for diagnostic pathology. THEORY AND EXPERIENCES: Images used in tissue-based diagnosis present with pathology-specific characteristics. It seems appropriate to discuss their characteristics and potential standardization in relation to the levels of hierarchy in which they appear. All levels can be divided into legal, medical, and technological properties. Standards applied to the first level include regulations or aims to be fulfilled. In legal properties, they have to regulate features of privacy, image documentation, transmission, and presentation; in medical properties, features of disease-image combination, human-diagnostics, automated information extraction, archive retrieval and access; and in technological properties features of image acquisition, display, formats, transfer speed, safety, and system dynamics. The next lower second level has to implement the prescriptions of the upper one, i.e. describe how they are implemented. Legal aspects should demand secure encryption for privacy of all patient related data, image archives that include all images used for diagnostics for a period of 10 years at minimum, accurate annotations of dates and viewing, and precise hardware and software information. Medical aspects should demand standardized patients' files such as DICOM 3 or HL 7 including history and previous examinations, information of image display hardware and software, of image resolution and fields of view, of relation between sizes of biological objects and image sizes, and of access to archives and retrieval. Technological aspects should deal with image acquisition systems (resolution, colour temperature, focus, brightness, and quality evaluation procedures), display resolution data, implemented image formats, storage, cycle frequency, backup procedures, operation system, and external system accessibility. The lowest third level describes the permitted limits and threshold in detail. At present, an applicable standard including all mentioned features does not exist to our knowledge; some aspects can be taken from radiological standards (PACS, DICOM 3); others require specific solutions or are not covered yet. The progress in virtual microscopy and application of artificial intelligence (AI) in tissue-based diagnosis demands fast preparation and implementation of an internationally acceptable standard. The described hierarchic order as well as analytic investigation in all potentially necessary aspects and details offers an appropriate tool to specifically determine standardized requirements.

  12. Programmed database system at the Chang Gung Craniofacial Center: part II--digitizing photographs.

    PubMed

    Chuang, Shiow-Shuh; Hung, Kai-Fong; de Villa, Glenda H; Chen, Philip K T; Lo, Lun-Jou; Chang, Sophia C N; Yu, Chung-Chih; Chen, Yu-Ray

    2003-07-01

    Archival tools for digital images, developed mainly for advertising, do not fulfill clinical requirements and are only beginning to mature. Storing large numbers of conventional photographic slides requires considerable space and special conditions, and despite such precautions the slides still degrade, most commonly through the appearance of fungus flecks. With recent advances in digital technology, it is now possible to store voluminous numbers of photographs on a computer hard drive and keep them for a long time. A self-programmed interface has been developed that integrates a database with an image browser system and can build and locate needed file archives in a matter of seconds with the click of a button. The hardware and software required by this system are commercially available. There are 25,200 patients recorded in the database, involving 24,331 procedures; the image files cover 6,384 patients with 88,366 digital picture files. From 1999 through 2002, NT$400,000 was saved using the new system. Photographs can be managed with the integrated database and browser software for archiving, which allows individual photographs to be labeled with demographic information and browsed. Digitized images are not only more efficient and economical than conventional slide images, but they also facilitate clinical studies.

  13. Ancillary Data Services of NASA's Planetary Data System

    NASA Technical Reports Server (NTRS)

    Acton, C.

    1994-01-01

    JPL's Navigation and Ancillary Information Facility (NAIF) has primary responsibility for design and implementation of the SPICE ancillary information system, supporting a wide range of space science mission design, observation planning and data analysis functions/activities. NAIF also serves as the geometry and ancillary data node of the Planetary Data System (PDS). As part of the PDS, NAIF archives SPICE and other ancillary data produced by flight projects. NAIF then distributes these data, and associated data access software and high-level tools, to researchers funded by NASA's Office of Space Science. Support for a broader user community is also offered to the extent resources permit. This paper describes the SPICE system and customer support offered by NAIF.

  14. BOREAS AFM-6 Boundary Layer Height Data

    NASA Technical Reports Server (NTRS)

    Wilczak, James; Hall, Forrest G. (Editor); Newcomer, Jeffrey A. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The Boreal Ecosystem-Atmosphere Study (BOREAS) Airborne Fluxes and Meteorology (AFM)-6 team from the National Oceanic and Atmospheric Administration/Environmental Technology Laboratory (NOAA/ETL) operated a 915-MHz wind/Radio Acoustic Sounding System (RASS) profiler system in the Southern Study Area (SSA) near the Old Jack Pine (OJP) site. This data set provides boundary layer height information over the site. The data were collected from 21 May 1994 to 20 Sep 1994 and are stored in tabular ASCII files. The boundary layer height data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).

  15. Data Information for Global Change Studies: NASA's Distributed Active Archive Centers and Cooperating Data Centers

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The Earth Observing System (EOS) is an integral part of the National Aeronautics and Space Administration's (NASA's) Earth Science Enterprise (ESE). ESE is a long-term global change research program designed to improve our understanding of the Earth's interrelated processes involving the atmosphere, oceans, land surfaces, and polar regions. Data from EOS instruments and other Earth science measurement systems are useful in understanding the causes and processes of global climate change and the consequences of human activities. The EOS Data and Information System (EOSDIS) provides a structure for data management and user services for products derived from EOS satellite instruments and other NASA Earth science data. Within the EOSDIS framework, the Distributed Active Archive Centers (DAACs) have been established to provide expertise in one or more Earth science disciplines. The DAACs and cooperating data centers provide data and information services to support the global change research community. Much of the development of the DAACs has been in anticipation of the enormous amount of data expected from EOS instruments to be launched within the next two decades. Terra, the EOS flagship launched in December 1999, is the first of a series of EOS satellites to carry several instruments with multispectral capabilities. Some data products from these instruments are now available from several of the DAACs. These and other data products can be ordered through the EOS Data Gateway (EDG) and DAAC-specific online ordering systems.

  16. Collecting, Preserving & Sharing Information in Micronesia. Proceedings of the Annual Pacific Islands Association of Libraries and Archives Conference (3rd, Saipan, Northern Mariana Islands, October 13-15, 1993).

    ERIC Educational Resources Information Center

    Pacific Islands Association of Libraries and Archives, Guam.

    Participants from Washington, Hawaii, Majuro, Palau, Guam and other points in the Northern Mariana Islands came together to share information relating to the functions of libraries and archives as information banks and as preservers of the cultural heritage of Micronesia. Papers presented were: (1) "Reading Motivation in the Pacific"…

  17. Architecture and evolution of Goddard Space Flight Center Distributed Active Archive Center

    NASA Technical Reports Server (NTRS)

    Bedet, Jean-Jacques; Bodden, Lee; Rosen, Wayne; Sherman, Mark; Pease, Phil

    1994-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been developed to enhance Earth Science research by improved access to remote sensor earth science data. Building and operating an archive, even one of a moderate size (a few Terabytes), is a challenging task. One of the critical components of this system is Unitree, the Hierarchical File Storage Management System. Unitree, selected two years ago as the best available solution, requires constant system administrative support. It is not always suitable as an archive and distribution data center, and has moderate performance. The Data Archive and Distribution System (DADS) software developed to monitor, manage, and automate the ingestion, archive, and distribution functions turned out to be more challenging than anticipated. Having the software and tools is not sufficient to succeed. Human interaction within the system must be fully understood to improve efficiency and ensure that the right tools are developed. One of the lessons learned is that the operability, reliability, and performance aspects should be thoroughly addressed in the initial design. However, the GSFC DAAC has demonstrated that it is capable of distributing over 40 GB per day. A backup system to archive a second copy of all data ingested is under development. This backup system will be used not only for disaster recovery but will also replace the main archive when it is unavailable during maintenance or hardware replacement. The GSFC DAAC has put a strong emphasis on quality at all levels of its organization. A Quality team has also been formed to identify quality issues and to propose improvements. The DAAC has conducted numerous tests to benchmark the performance of the system. These tests proved to be extremely useful in identifying bottlenecks and deficiencies in operational procedures.

  18. The ``One Archive'' for JWST

    NASA Astrophysics Data System (ADS)

    Greene, G.; Kyprianou, M.; Levay, K.; Sienkewicz, M.; Donaldson, T.; Dower, T.; Swam, M.; Bushouse, H.; Greenfield, P.; Kidwell, R.; Wolfe, D.; Gardner, L.; Nieto-Santisteban, M.; Swade, D.; McLean, B.; Abney, F.; Alexov, A.; Binegar, S.; Aloisi, A.; Slowinski, S.; Gousoulin, J.

    2015-09-01

    The next generation for the Space Telescope Science Institute data management system is gearing up to provide a suite of archive system services supporting the operation of the James Webb Space Telescope. We are now completing the initial stage of integration and testing for the preliminary ground system builds of the JWST Science Operations Center which includes multiple components of the Data Management Subsystem (DMS). The vision for astronomical science and research with the JWST archive introduces both solutions to formal mission requirements and innovation derived from our existing mission systems along with the collective shared experience of our global user community. We are building upon the success of the Hubble Space Telescope archive systems, standards developed by the International Virtual Observatory Alliance, and collaborations with our archive data center partners. In proceeding forward, the “one archive” architectural model presented here is designed to balance the objectives for this new and exciting mission. The STScI JWST archive will deliver high quality calibrated science data products, support multi-mission data discovery and analysis, and provide an infrastructure which supports bridges to highly valued community tools and services.

  19. Use of film digitizers to assist radiology image management

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Frost, Meryll M.; Staab, Edward V.

    1996-05-01

    The purpose of this development effort was to evaluate the possibility of using digital technologies to solve image management problems in the Department of Radiology at the University of Florida. The three problem areas investigated were local interpretation of images produced in remote locations, distribution of images to areas outside of radiology, and film handling. In all cases the use of a laser film digitizer interfaced to an existing Picture Archiving and Communication System (PACS) was investigated as a solution to the problem. In each case the volume of studies involved were evaluated to estimate the impact of the solution on the network, archive, and workstations. Communications were stressed in the analysis of the needs for all image transmission. The operational aspects of the solution were examined to determine the needs for training, service, and maintenance. The remote sites requiring local interpretation included were a rural hospital needing coverage for after hours studies, the University of Florida student infirmary, and the emergency room. Distribution of images to the intensive care units was studied to improve image access and patient care. Handling of films originating from remote sites and those requiring urgent reporting were evaluated to improve management functions. The results of our analysis and the decisions that were made based on the analysis are described below. In the cases where systems were installed, a description of the system and its integration into the PACS system is included. For all three problem areas, although we could move images via a digitizer to the archive and a workstation, there was no way to inform the radiologist that a study needed attention. In the case of outside films, the patient did not always have a medical record number that matched one in our Radiology Information Systems (RIS). In order to incorporate all studies for a patient, we needed common locations for orders, reports, and images. RIS orders were generated for each outside study to be interpreted and a medical record number assigned if none existed. All digitized outside films were archived in the PACS archive for later review or comparison use. The request generated by the RIS requesting a diagnostic interpretation was placed at the PACS workstation to alert the radiologists that unread images had arrived and a box was added to the workstation user interface that could be checked by the radiologist to indicate that a report had been dictated. The digitizer system solved several problems, unavailable films in the emergency room, teleradiology, and archiving of outside studies that had been read by University of Florida radiologists. In addition to saving time for outside film management, we now store the studies for comparison purposes, no longer lose emergency room films, generate diagnostic reports on emergency room films in a timely manner (important for billing and reimbursement), and can handle the distributed nature of our business. As changes in health care drive management changes, existing tools can be used in new ways to help make the transition easier. In this case, adding digitizers to an existing PACS network helped solve several image management problems.

  20. Expansion of the On-line Archive "Statistically Downscaled WCRP CMIP3 Climate Projections"

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Das, T.; Duffy, P.; White, K.

    2009-12-01

    The presentation highlights status and plans for a public-access archive of downscaled CMIP3 climate projections. Incorporating climate projection information into long-term evaluations of water and energy resources requires analysts to have access to projections at "basin-relevant" resolution. Such projections would ideally be bias-corrected to account for climate model tendencies to systematically simulate historical conditions different from those observed. In 2007, the U.S. Bureau of Reclamation, Santa Clara University and Lawrence Livermore National Laboratory (LLNL) collaborated to develop an archive of 112 bias-corrected and spatially disaggregated (BCSD) CMIP3 temperature and precipitation projections. These projections were generated using 16 CMIP3 models to simulate three emissions pathways (A2, A1b, and B1) from one or more initializations (runs). Projections are specified on a monthly time step from 1950-2099 and at 0.125 degree spatial resolution within the North American Land Data Assimilation System domain (i.e. contiguous U.S., southern Canada and northern Mexico). Archive data are freely accessible at LLNL Green Data Oasis (url). Since being launched, the archive has served over 3500 data requests by nearly 500 users in support of a range of planning, research and educational activities. Archive developers continue to look for ways to improve the archive and respond to user needs. One request has been to serve the intermediate datasets generated during the BCSD procedure, helping users to interpret the relative influences of the bias-correction and spatial disaggregation on the transformed CMIP3 output. This request has been addressed with intermediate datasets now posted at the archive web-site. Another request relates closely to studying hydrologic and ecological impacts under climate change, where users are asking for projected diurnal temperature information (e.g., projected daily minimum and maximum temperature) and daily time step resolution. In response, archive developers are adding content in 2010, teaming with Scripps Institution of Oceanography (through their NOAA-RISA California-Nevada Applications Program and the California Climate Change Center) to apply a new daily downscaling technique to a sub-ensemble of the archive’s CMIP3 projections. The new technique, Bias-Corrected Constructed Analogs, combines the BC part of BCSD with a recently developed technique that preserves the daily sequencing structure of CMIP3 projections (Constructed Analogs, or CA). Such data will more easily serve hydrologic and ecological impacts assessments, and offer an opportunity to evaluate projection uncertainty associated with downscaling technique. Looking ahead to the arrival of CMIP5 projections, archive collaborators have plans to apply both BCSD and BCCA over the contiguous U.S. consistent with the CMIP3 applications above, and also to apply BCSD globally at a 0.5 degree spatial resolution. The latter effort involves collaboration with the U.S. Army Corps of Engineers (USACE) and Climate Central.
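
    A hedged sketch of the bias-correction idea behind BCSD (empirical quantile mapping) follows: a simulated value is replaced by the observed value at the same quantile of the two historical distributions. The numbers are synthetic and the implementation is a deliberate simplification of the published method.

        import bisect

        def quantile_map(value, model_hist_sorted, obs_hist_sorted):
            """Map a simulated value to the observed value at the same empirical quantile."""
            # Quantile of the value within the model's historical distribution...
            rank = bisect.bisect_left(model_hist_sorted, value)
            q = rank / max(len(model_hist_sorted) - 1, 1)
            # ...mapped to the same quantile of the observed historical distribution.
            idx = min(int(round(q * (len(obs_hist_sorted) - 1))), len(obs_hist_sorted) - 1)
            return obs_hist_sorted[idx]

        model_hist = sorted([10.0, 12.0, 14.0, 16.0, 18.0])   # model-simulated historical values
        obs_hist   = sorted([ 8.0, 11.0, 13.0, 15.0, 20.0])   # observed historical values
        print(quantile_map(14.0, model_hist, obs_hist))        # -> 13.0 (median maps to median)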
