Software for Managing an Archive of Images
NASA Technical Reports Server (NTRS)
Hallai, Charles; Jones, Helene; Callac, Chris
2003-01-01
This is a revised draft by Innovators of the report on Software for Managing an Archive of Images. The SSC Multimedia Archive is an automated electronic system to manage images, acquired both by film and digital cameras, for the Public Affairs Office (PAO) at Stennis Space Center (SSC). Previously, the image archive was based on film photography and utilized a manual system that, by today's standards, had become inefficient and expensive. Now, the SSC Multimedia Archive, based on a server at SSC, contains both catalogs and images for pictures taken both digitally and with a traditional film-based camera, along with metadata about each image.
An Image Archive With The ACR/NEMA Message Formats
NASA Astrophysics Data System (ADS)
Seshadri, Sridhar B.; Khalsa, Satjeet; Arenson, Ronald L.; Brikman, Inna; Davey, Michael J.
1988-06-01
An image archive has been designed to manage and store radiologic images received from within the main Hospital and from a suburban orthopedic clinic. Images are stored on both magnetic and optical media. Prior comparison examinations are combined with the current examination to generate a 'viewing folder' that is sent to the display station for primary diagnosis. An 'archive-manager' controls the database management, periodic optical disk backup and 'viewing-folder' generation. Images are converted into the ACR/NEMA message format before being written to the optical disk. The software design of the 'archive-manager' and its associated modules is presented. Enhancements to the system are discussed.
CD-based image archival and management on a hybrid radiology intranet.
Cox, R D; Henri, C J; Bret, P M
1997-08-01
This article describes the design and implementation of a low-cost image archival and management solution on a radiology network consisting of UNIX, IBM personal computer-compatible (IBM, Purchase, NY) and Macintosh (Apple Computer, Cupertino, CA) workstations. The picture archiving and communications system (PACS) is modular, scaleable and conforms to the Digital Imaging and Communications in Medicine (DICOM) 3.0 standard for image transfer, storage and retrieval. Image data is made available on soft-copy reporting workstations by a work-flow management scheme and on desktop computers through a World Wide Web (WWW) interface. Data archival is based on recordable compact disc (CD) technology and is automated. The project has allowed the radiology department to eliminate the use of film in magnetic resonance (MR) imaging, computed tomography (CT) and ultrasonography.
The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository.
Clark, Kenneth; Vendt, Bruce; Smith, Kirk; Freymann, John; Kirby, Justin; Koppel, Paul; Moore, Stephen; Phillips, Stanley; Maffitt, David; Pringle, Michael; Tarbox, Lawrence; Prior, Fred
2013-12-01
The National Institutes of Health have placed significant emphasis on sharing of research data to support secondary research. Investigators have been encouraged to publish their clinical and imaging data as part of fulfilling their grant obligations. Realizing it was not sufficient to merely ask investigators to publish their collection of imaging and clinical data, the National Cancer Institute (NCI) created the open source National Biomedical Image Archive software package as a mechanism for centralized hosting of cancer related imaging. NCI has contracted with Washington University in Saint Louis to create The Cancer Imaging Archive (TCIA)-an open-source, open-access information resource to support research, development, and educational initiatives utilizing advanced medical imaging of cancer. In its first year of operation, TCIA accumulated 23 collections (3.3 million images). Operating and maintaining a high-availability image archive is a complex challenge involving varied archive-specific resources and driven by the needs of both image submitters and image consumers. Quality archives of any type (traditional library, PubMed, refereed journals) require management and customer service. This paper describes the management tasks and user support model for TCIA.
NASA Technical Reports Server (NTRS)
Andres, Vince; Walter, David; Hallal, Charles; Jones, Helene; Callac, Chris
2004-01-01
The SSC Multimedia Archive is an automated electronic system to manage images, acquired both by film and digital cameras, for the Public Affairs Office (PAO) at Stennis Space Center (SSC). Previously, the image archive was based on film photography and utilized a manual system that, by today's standards, had become inefficient and expensive. Now, the SSC Multimedia Archive, based on a server at SSC, contains both catalogs and images for pictures taken both digitally and with a traditional, film-based camera, along with metadata about each image. After a "shoot," a photographer downloads the images into the database. Members of the PAO can use a Web-based application to search, view and retrieve images, approve images for publication, and view and edit metadata associated with the images. Approved images are archived and cross-referenced with appropriate descriptions and information. Security is provided by allowing administrators to explicitly grant access privileges so that personnel can access only the components of the system that they need (e.g., only photographers may upload images, and only designated PAO employees may approve images).
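The access rules described above (only photographers may upload, only designated PAO staff may approve) amount to a small role-to-permission mapping. The following Python sketch is a hypothetical illustration of such a mapping; the role and operation names are assumptions and do not come from the SSC system itself.

```python
# Hypothetical sketch of the role-based access rules described in the abstract.
# Role and operation names are assumptions, not the actual SSC implementation.

ROLE_PERMISSIONS = {
    "photographer": {"upload_image", "edit_metadata"},
    "pao_staff": {"search", "view", "retrieve", "approve_image", "edit_metadata"},
    "administrator": {"upload_image", "search", "view", "retrieve",
                      "approve_image", "edit_metadata", "grant_access"},
}


def is_allowed(role: str, operation: str) -> bool:
    """Return True if the given role may perform the operation."""
    return operation in ROLE_PERMISSIONS.get(role, set())


if __name__ == "__main__":
    # Only photographers (and administrators) may upload images.
    assert is_allowed("photographer", "upload_image")
    assert not is_allowed("pao_staff", "upload_image")
    # Only PAO staff (and administrators) may approve images for publication.
    assert is_allowed("pao_staff", "approve_image")
```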
NASA Astrophysics Data System (ADS)
Smith, Edward M.; Wright, Jeffrey; Fontaine, Marc T.; Robinson, Arvin E.
1998-07-01
The Medical Information, Communication and Archive System (MICAS) is a multi-vendor incremental approach to PACS. MICAS is a multi-modality integrated image management system that incorporates the radiology information system (RIS) and radiology image database (RID) with future 'hooks' to other hospital databases. Even though this approach to PACS is more risky than a single-vendor turn-key approach, it offers significant advantages. The vendors involved in the initial phase of MICAS are IDX Corp., ImageLabs, Inc. and Digital Equipment Corp (DEC). The network architecture operates at 100 MBits per sec except between the modalities and the stackable intelligent switch which is used to segment MICAS by modality. Each modality segment contains the acquisition engine for the modality, a temporary archive and one or more diagnostic workstations. All archived studies are available at all workstations, but there is no permanent archive at this time. At present, the RIS vendor is responsible for study acquisition and workflow as well as maintenance of the temporary archive. Management of study acquisition, workflow and the permanent archive will become the responsibility of the archive vendor when the archive is installed in the second quarter of 1998. The modalities currently interfaced to MICAS are MRI, CT and a Howtek film digitizer with Nuclear Medicine and computed radiography (CR) to be added when the permanent archive is installed. There are six dual-monitor diagnostic workstations which use ImageLabs Shared Vision viewer software located in MRI, CT, Nuclear Medicine, musculoskeletal reading areas and two in Radiology's main reading area. One of the major lessons learned to date is that the permanent archive should have been part of the initial MICAS installation and the archive vendor should have been responsible for image acquisition rather than the RIS vendor. Currently an archive vendor is being selected who will be responsible for the management of the archive plus the HIS/RIS interface, image acquisition, modality work list manager and interfacing to the current DICOM viewer software. The next phase of MICAS will include interfacing ultrasound, locating servers outside of the Radiology LAN to support the distribution of images and reports to the clinical floors and physician offices both within and outside of the University of Rochester Medical Center (URMC) campus and the teaching archive.
The Convergence of Information Technology, Data, and Management in a Library Imaging Program
ERIC Educational Resources Information Center
France, Fenella G.; Emery, Doug; Toth, Michael B.
2010-01-01
Integrating advanced imaging and processing capabilities in libraries, archives, and museums requires effective systems and information management to ensure that the large amounts of digital data about cultural artifacts can be readily acquired, stored, archived, accessed, processed, and linked to other data. The Library of Congress is developing…
Smith, E M; Wandtke, J; Robinson, A
1999-05-01
The Medical Information, Communication and Archive System (MICAS) is a multivendor incremental approach to a picture archiving and communications system (PACS). It is a multimodality integrated image management system that is seamlessly integrated with the radiology information system (RIS). Phase II enhancements of MICAS include a permanent archive, automated workflow, study caches, and Microsoft (Redmond, WA) Windows NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. MICAS is designed as an enterprise-wide PACS to provide images and reports throughout the Strong Health healthcare network. Phase II includes the addition of a Cemax-Icon (Fremont, CA) archive, a PACS broker (Mitra, Waterloo, Canada), and an interface (IDX PACSlink, Burlington, VT) to the RIS (IDXrad), plus the conversion of the UNIX-based redundant array of inexpensive disks (RAID) 5 temporary archives of phase I to NT-based RAID 0 DICOM modality-specific study caches (ImageLabs, Bedford, MA). The phase I acquisition engines and workflow management software were uninstalled, and the Cemax archive manager (AM) assumed these functions. The existing ImageLabs UNIX-based viewing software was enhanced and converted to an NT-based DICOM viewer. Installation of phase II hardware and software and integration with existing components began in July 1998. Phase II of MICAS demonstrates that a multivendor open-system incremental approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.
TCIA: An information resource to enable open science.
Prior, Fred W; Clark, Ken; Commean, Paul; Freymann, John; Jaffe, Carl; Kirby, Justin; Moore, Stephen; Smith, Kirk; Tarbox, Lawrence; Vendt, Bruce; Marquez, Guillermo
2013-01-01
Reusable, publicly available data is a pillar of open science. The Cancer Imaging Archive (TCIA) is an open image archive service supporting cancer research. TCIA collects, de-identifies, curates and manages rich collections of oncology image data. Image data sets have been contributed by 28 institutions and additional image collections are underway. Since June of 2011, more than 2,000 users have registered to search and access data from this freely available resource. TCIA encourages and supports cancer-related open science communities by hosting and managing the image archive, providing project wiki space and searchable metadata repositories. The success of TCIA is measured by the number of active research projects it enables (>40) and the number of scientific publications and presentations that are produced using data from TCIA collections (39).
Avrin, D E; Andriole, K P; Yin, L; Gould, R G; Arenson, R L
2001-03-01
A hierarchical storage management (HSM) scheme for cost-effective on-line archival of image data using lossy compression is described. This HSM scheme also provides an off-site tape backup mechanism and disaster recovery. The full-resolution image data are viewed originally for primary diagnosis, then losslessly compressed and sent off site to a tape backup archive. In addition, the original data are wavelet lossy compressed (at approximately 25:1 for computed radiography, 10:1 for computed tomography, and 5:1 for magnetic resonance) and stored on a large RAID device for maximum cost-effective, on-line storage and immediate retrieval of images for review and comparison. This HSM scheme provides a solution to 4 problems in image archiving, namely cost-effective on-line storage, disaster recovery of data, off-site tape backup for the legal record, and maximum intermediate storage and retrieval through the use of on-site lossy compression.
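The storage policy described above can be read as a simple per-modality routing rule: a lossless copy goes to off-site tape, and a wavelet lossy copy at a modality-specific ratio goes to the on-line RAID. The sketch below illustrates only that routing rule, using the ratios quoted in the abstract; the function names and defaults are assumptions, and no particular compression codec is implied.

```python
# Illustrative sketch of the hierarchical storage management (HSM) policy
# described in the abstract. The compression ratios come from the abstract;
# everything else (names, structure) is a hypothetical simplification.

# Approximate wavelet lossy-compression ratios for the on-line RAID copy.
LOSSY_RATIO = {
    "CR": 25,  # computed radiography ~25:1
    "CT": 10,  # computed tomography  ~10:1
    "MR": 5,   # magnetic resonance    ~5:1
}


def archive_study(modality: str, image_bytes: bytes) -> dict:
    """Return the two archival actions for a study: tape backup and on-line copy."""
    ratio = LOSSY_RATIO.get(modality, 10)  # assumed default for other modalities
    return {
        # Full-resolution data: lossless compression, off-site tape
        # (legal record, disaster recovery).
        "tape_backup": {"compression": "lossless", "size_estimate": len(image_bytes)},
        # Review/comparison copy: wavelet lossy compression on the RAID device.
        "online_raid": {"compression": f"lossy {ratio}:1",
                        "size_estimate": len(image_bytes) // ratio},
    }


if __name__ == "__main__":
    plan = archive_study("CR", b"\x00" * 10_000_000)  # a 10 MB computed radiograph
    print(plan["online_raid"])  # roughly 400 kB on-line at 25:1
```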
The Image and Data Archive at the Laboratory of Neuro Imaging.
Crawford, Karen L; Neu, Scott C; Toga, Arthur W
2016-01-01
The LONI Image and Data Archive (IDA)(1) is a repository for sharing and long-term preservation of neuroimaging and biomedical research data. Originally designed to archive strictly medical image files, the IDA has evolved over the last ten years and now encompasses the storage and dissemination of neuroimaging, clinical, biospecimen, and genetic data. In this article, we report upon the genesis of the IDA and how it currently securely manages data and protects data ownership.
Mass-storage management for distributed image/video archives
NASA Astrophysics Data System (ADS)
Franchi, Santina; Guarda, Roberto; Prampolini, Franco
1993-04-01
The realization of an image/video database requires a specific design for both the database structures and mass storage management. This issue was addressed in the design of the digital image/video database system developed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog image/video coding techniques with their related parameters, and to describe image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server. Because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the description of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management: they allow one to catalog devices and to modify device status and device network location. The medium level manages image/video files on a physical basis; it manages file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to meet delivery/visualization requirements and to reduce archiving costs.
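The three function levels described above (device cataloging, physical file migration, and logical query-driven archiving) can be pictured as three thin layers over a device catalog. The following sketch is a hypothetical outline of that layering rather than the IBM SEMEA implementation; all class and method names are assumptions.

```python
# Hypothetical outline of the three-level storage management described above.
# All names are assumptions; the sketch only mirrors the layering in the abstract.

class DeviceCatalog:
    """Lower level: catalog devices, track status and network location."""
    def __init__(self):
        self.devices = {}  # device_id -> {"status": ..., "tier": ..., "location": ...}

    def register(self, device_id, tier, location):
        self.devices[device_id] = {"status": "online", "tier": tier,
                                   "location": location}

    def set_status(self, device_id, status):
        self.devices[device_id]["status"] = status


class PhysicalFileManager:
    """Medium level: migrate files between high-capacity and low-latency media."""
    def __init__(self, catalog):
        self.catalog = catalog
        self.placement = {}  # file_id -> device_id

    def migrate(self, file_id, target_device):
        # A real system would copy the bytes; here we only move the pointer.
        self.placement[file_id] = target_device


class LogicalArchiveManager:
    """Upper level: archive/move/copy files selected by user-defined queries."""
    def __init__(self, physical):
        self.physical = physical

    def archive_matching(self, file_ids, predicate, archive_device):
        for fid in filter(predicate, file_ids):
            self.physical.migrate(fid, archive_device)


if __name__ == "__main__":
    catalog = DeviceCatalog()
    catalog.register("optical-1", tier="archive", location="server-room")
    phys = PhysicalFileManager(catalog)
    logical = LogicalArchiveManager(phys)
    logical.archive_matching(["video_001", "img_042"],
                             predicate=lambda f: f.startswith("video"),
                             archive_device="optical-1")
    print(phys.placement)  # {'video_001': 'optical-1'}
```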
Design and implementation of GRID-based PACS in a hospital with multiple imaging departments
NASA Astrophysics Data System (ADS)
Yang, Yuanyuan; Jin, Jin; Sun, Jianyong; Zhang, Jianguo
2008-03-01
In an enterprise healthcare environment there are usually multiple clinical departments providing imaging-enabled healthcare services, such as radiology, oncology, pathology, and cardiology. The picture archiving and communication system (PACS) is now required to support not only radiology-based image display, workflow and data flow management, but also more specialized image processing and management tools for other departments providing imaging-guided diagnosis and therapy, and there is an urgent demand to integrate the multiple PACSs together to provide patient-oriented imaging services for enterprise collaborative healthcare. In this paper, we give the design method and implementation strategy for developing a grid-based PACS (Grid-PACS) for a hospital with multiple imaging departments or centers. The Grid-PACS functions as middleware between the traditional PACS archiving servers and the workstations or image viewing clients, and provides DICOM image communication and WADO services to the end users. The images can be stored in multiple distributed archiving servers but managed in a centralized mode. The grid-based PACS has automatic image backup and disaster recovery services and can provide the best image retrieval path to image requesters based on optimal algorithms. The designed grid-based PACS has been implemented at Shanghai Huadong Hospital and has been running smoothly for two years.
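The grid layer described above exposes standard DICOM and WADO access while choosing, per request, the most suitable of several distributed replicas. The sketch below shows only the general shape of a classic WADO-URI request and a naive lowest-latency replica choice; the host names and the cost model are assumptions, not the Huadong Hospital deployment.

```python
# Hypothetical sketch of WADO-based retrieval over replicated archives, in the
# spirit of the grid-PACS described above. Hosts and the cost model are invented.
from urllib.parse import urlencode

ARCHIVES = [
    {"host": "pacs-a.example.org", "est_latency_ms": 12},
    {"host": "pacs-b.example.org", "est_latency_ms": 45},
]


def best_archive(archives):
    """Pick the replica with the lowest estimated retrieval cost (here: latency)."""
    return min(archives, key=lambda a: a["est_latency_ms"])


def wado_uri(host, study_uid, series_uid, object_uid):
    """Build a classic WADO-URI request for a single DICOM object."""
    params = urlencode({
        "requestType": "WADO",
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": "application/dicom",
    })
    return f"http://{host}/wado?{params}"


if __name__ == "__main__":
    target = best_archive(ARCHIVES)
    print(wado_uri(target["host"], "1.2.840.1", "1.2.840.1.1", "1.2.840.1.1.7"))
```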
NASA Astrophysics Data System (ADS)
Fletcher, Alex; Yoo, Terry S.
2004-04-01
Public databases today can be constructed with a wide variety of authoring and management structures. The widespread appeal of Internet search engines suggests that public information be made open and available to common search strategies, making accessible information that would otherwise be hidden by the infrastructure and software interfaces of a traditional database management system. We present the construction and organizational details for managing NOVA, the National Online Volumetric Archive. As an archival effort of the Visible Human Project for supporting medical visualization research, archiving 3D multimodal radiological teaching files, and enhancing medical education with volumetric data, our overall database structure is simplified; archives grow by accruing information, but seldom have to modify, delete, or overwrite stored records. NOVA is being constructed and populated so that it is transparent to the Internet; that is, much of its internal structure is mirrored in HTML allowing internet search engines to investigate, catalog, and link directly to the deep relational structure of the collection index. The key organizational concept for NOVA is the Image Content Group (ICG), an indexing strategy for cataloging incoming data as a set structure rather than by keyword management. These groups are managed through a series of XML files and authoring scripts. We cover the motivation for Image Content Groups, their overall construction, authorship, and management in XML, and the pilot results for creating public data repositories using this strategy.
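An Image Content Group, as described above, is essentially a named set of datasets maintained in XML rather than a keyword index. The sketch below builds a minimal hypothetical ICG file with Python's standard xml.etree.ElementTree; the element and attribute names are assumptions, not the actual NOVA schema.

```python
# Minimal hypothetical Image Content Group (ICG) file, written with the standard
# library. Element and attribute names are assumptions, not the NOVA schema.
import xml.etree.ElementTree as ET


def build_icg(group_id: str, title: str, members: list[dict]) -> ET.ElementTree:
    root = ET.Element("ImageContentGroup", id=group_id)
    ET.SubElement(root, "Title").text = title
    member_list = ET.SubElement(root, "Members")
    for m in members:
        item = ET.SubElement(member_list, "Dataset", uid=m["uid"])
        ET.SubElement(item, "Modality").text = m["modality"]
        ET.SubElement(item, "Description").text = m["description"]
    return ET.ElementTree(root)


if __name__ == "__main__":
    tree = build_icg(
        "icg-thorax-001",
        "Multimodal thorax teaching set",
        [{"uid": "vol-0001", "modality": "CT", "description": "Chest CT volume"},
         {"uid": "vol-0002", "modality": "MR", "description": "Cardiac MR volume"}],
    )
    ET.indent(tree)               # pretty-print (Python 3.9+)
    tree.write("icg-thorax-001.xml", encoding="utf-8", xml_declaration=True)
```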
NASA Astrophysics Data System (ADS)
Robbins, William L.; Conklin, James J.
1995-10-01
Medical images (angiography, CT, MRI, nuclear medicine, ultrasound, x ray) play an increasingly important role in the clinical development and regulatory review process for pharmaceuticals and medical devices. Since medical images are increasingly acquired and archived digitally, or are readily digitized from film, they can be visualized, processed and analyzed in a variety of ways using digital image processing and display technology. Moreover, with image-based data management and data visualization tools, medical images can be electronically organized and submitted to the U.S. Food and Drug Administration (FDA) for review. The collection, processing, analysis, archival, and submission of medical images in a digital format versus an analog (film-based) format presents both challenges and opportunities for the clinical and regulatory information management specialist. The medical imaging 'core laboratory' is an important resource for clinical trials and regulatory submissions involving medical imaging data. Use of digital imaging technology within a core laboratory can increase efficiency and decrease overall costs in the image data management and regulatory review process.
A Routing Mechanism for Cloud Outsourcing of Medical Imaging Repositories.
Godinho, Tiago Marques; Viana-Ferreira, Carlos; Bastião Silva, Luís A; Costa, Carlos
2016-01-01
Web-based technologies have been increasingly used in picture archive and communication systems (PACS), in services related to storage, distribution, and visualization of medical images. Nowadays, many healthcare institutions are outsourcing their repositories to the cloud. However, managing communications between multiple geo-distributed locations is still challenging due to the complexity of dealing with huge volumes of data and bandwidth requirements. Moreover, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. In order to improve the performance of distributed medical imaging networks, a smart routing mechanism was developed. This includes an innovative cache system based on splitting and dynamic management of Digital Imaging and Communications in Medicine (DICOM) objects. The proposed solution was successfully deployed in a regional PACS archive. The results obtained proved that it is better than conventional approaches, as it reduces remote access latency and also the required cache storage space.
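The cache described above splits DICOM studies so that frequently requested parts can be kept close to the clients while the rest stays in the outsourced archive. The sketch below conveys only the general idea with a series-level LRU cache; the keys, sizes and eviction policy are assumptions rather than the published design.

```python
# Hypothetical series-level LRU cache in front of a cloud archive, illustrating
# the idea of splitting studies and keeping only hot fragments locally.
from collections import OrderedDict


class SeriesCache:
    def __init__(self, capacity_bytes: int):
        self.capacity = capacity_bytes
        self.used = 0
        self._entries = OrderedDict()  # (study_uid, series_uid) -> size in bytes

    def get(self, study_uid: str, series_uid: str):
        key = (study_uid, series_uid)
        if key in self._entries:
            self._entries.move_to_end(key)    # mark as recently used
            return key                        # cache hit: serve locally
        return None                           # cache miss: fetch from the cloud

    def put(self, study_uid: str, series_uid: str, size: int):
        key = (study_uid, series_uid)
        if key in self._entries:              # replacing an entry: release its size
            self.used -= self._entries[key]
        self._entries[key] = size
        self._entries.move_to_end(key)
        self.used += size
        while self.used > self.capacity and self._entries:
            _, evicted_size = self._entries.popitem(last=False)  # evict LRU series
            self.used -= evicted_size


if __name__ == "__main__":
    cache = SeriesCache(capacity_bytes=500_000_000)    # ~500 MB of local cache
    cache.put("1.2.3", "1.2.3.1", 200_000_000)
    print(cache.get("1.2.3", "1.2.3.1") is not None)   # True: served from cache
```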
[A new concept for integration of image databanks into a comprehensive patient documentation].
Schöll, E; Holm, J; Eggli, S
2001-05-01
Image processing and archiving are of increasing importance in the practice of modern medicine. Particularly due to the introduction of computer-based investigation methods, physicians are dealing with a wide variety of analogue and digital picture archives. On the other hand, clinical information is stored in various text-based information systems without integration of image components. The link between such traditional medical databases and picture archives is a prerequisite for efficient data management as well as for continuous quality control and medical education. At the Department of Orthopedic Surgery, University of Berne, a software program was developed to create a complete multimedia electronic patient record. The client-server system contains all patients' data, questionnaire-based quality control, and a digital picture archive. Different interfaces guarantee the integration into the hospital's data network. This article describes our experiences in the development and introduction of a comprehensive image archiving system at a large orthopedic center.
DICOM-compliant PACS with CD-based image archival
NASA Astrophysics Data System (ADS)
Cox, Robert D.; Henri, Christopher J.; Rubin, Richard K.; Bret, Patrice M.
1998-07-01
This paper describes the design and implementation of a low-cost PACS conforming to the DICOM 3.0 standard. The goal was to provide an efficient image archival and management solution on a heterogeneous hospital network as a basis for filmless radiology. The system follows a distributed, client/server model and was implemented at a fraction of the cost of a commercial PACS. It provides reliable archiving on recordable CD and allows access to digital images throughout the hospital and on the Internet. Dedicated servers have been designed for short-term storage, CD-based archival, data retrieval and remote data access or teleradiology. The short-term storage devices provide DICOM storage and query/retrieve services to scanners and workstations and approximately twelve weeks of 'on-line' image data. The CD-based archival and data retrieval processes are fully automated with the exception of CD loading and unloading. The system employs lossless compression on both short- and long-term storage devices. All servers communicate via the DICOM protocol in conjunction with both local and 'master' SQL-patient databases. Records are transferred from the local to the master database independently, ensuring that storage devices will still function if the master database server cannot be reached. The system features rules-based work-flow management and WWW servers to provide multi-platform remote data access. The WWW server system is distributed on the storage, retrieval and teleradiology servers allowing viewing of locally stored image data directly in a WWW browser without the need for data transfer to a central WWW server. An independent system monitors disk usage, processes, network and CPU load on each server and reports errors to the image management team via email. The PACS was implemented using a combination of off-the-shelf hardware, freely available software and applications developed in-house. The system has enabled filmless operation in CT, MR and ultrasound within the radiology department and throughout the hospital. The use of WWW technology has enabled the development of an intuitive web-based teleradiology and image management solution that provides complete access to image data.
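A key design point above is that each storage server keeps a local SQL patient database and forwards records to the 'master' database independently, so archiving continues even when the master is unreachable. The sqlite3 sketch below illustrates that store-and-forward idea only; the table layout and column names are assumptions, not the schema of the system described.

```python
# Hypothetical store-and-forward patient index: records are committed locally
# first and pushed to the master database when it is reachable. Table layout
# and column names are assumptions, not the system described in the abstract.
import sqlite3

SCHEMA = """CREATE TABLE IF NOT EXISTS studies (
    study_uid TEXT PRIMARY KEY,
    patient_id TEXT NOT NULL,
    modality   TEXT NOT NULL,
    synced     INTEGER NOT NULL DEFAULT 0   -- 0 = not yet copied to master
)"""


def record_locally(local: sqlite3.Connection, study_uid, patient_id, modality):
    local.execute("INSERT OR IGNORE INTO studies VALUES (?, ?, ?, 0)",
                  (study_uid, patient_id, modality))
    local.commit()


def sync_to_master(local: sqlite3.Connection, master: sqlite3.Connection):
    """Copy unsynced records to the master; failures leave them marked unsynced."""
    rows = local.execute("SELECT study_uid, patient_id, modality FROM studies "
                         "WHERE synced = 0").fetchall()
    for row in rows:
        master.execute("INSERT OR IGNORE INTO studies VALUES (?, ?, ?, 1)", row)
        local.execute("UPDATE studies SET synced = 1 WHERE study_uid = ?", (row[0],))
    master.commit()
    local.commit()


if __name__ == "__main__":
    local = sqlite3.connect(":memory:")
    master = sqlite3.connect(":memory:")
    local.execute(SCHEMA)
    master.execute(SCHEMA)
    record_locally(local, "1.2.840.999.1", "PAT0042", "CT")
    sync_to_master(local, master)   # would be retried later if the master is down
    print(master.execute("SELECT COUNT(*) FROM studies").fetchone()[0])  # 1
```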
Clinical applications of an ATM/Ethernet network in departments of neuroradiology and radiotherapy.
Cimino, C; Pizzi, R; Fusca, M; Bruzzone, M G; Casolino, D; Sicurello, F
1997-01-01
An integrated system for the multimedia management of images and clinical information has been developed at the Istituto Nazionale Neurologico C. Besta in Milan. The Institute's physicians have a daily need to consult images coming from various modalities. The high volume of archived material and the need to retrieve and display new and past images and clinical information have motivated the development of a Picture Archiving and Communication System (PACS) for the automatic management of images and clinical data, related not only to the Radiology Department, but also to the Radiotherapy Department for 3D virtual simulation, to remote teleconsulting, and subsequently to all the wards, outpatient clinics and labs.
Detailed description of the Mayo/IBM PACS
NASA Astrophysics Data System (ADS)
Gehring, Dale G.; Persons, Kenneth R.; Rothman, Melvyn L.; Salutz, James R.; Morin, Richard L.
1991-07-01
The Mayo Clinic and IBM/Rochester have jointly developed a picture archiving system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. The system was developed to replace the imaging system's vendor-supplied magnetic tape archiving capability. The system consists of seven MR imagers and nine CT scanners, each interfaced to the PACS via IBM Personal System/2(tm) (PS/2) computers, which act as gateways from the imaging modality to the PACS network. The PAC system operates on the token-ring component of Mayo's city-wide local area network. Also on the PACS network are four optical storage subsystems used for image archival, three optical subsystems used for image retrieval, an IBM Application System/400(tm) (AS/400) computer used for database management and multiple PS/2-based image display systems and their image servers.
OASIS: A Data Fusion System Optimized for Access to Distributed Archives
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Kong, M.; Good, J. C.
2002-05-01
The On-Line Archive Science Information Services (OASIS) is accessible as a java applet through the NASA/IPAC Infrared Science Archive home page. It uses Geographical Information System (GIS) technology to provide data fusion and interaction services for astronomers. These services include the ability to process and display arbitrarily large image files, and user-controlled contouring, overlay regeneration and multi-table/image interactions. OASIS has been optimized for access to distributed archives and data sets. Its second release (June 2002) provides a mechanism that enables access to OASIS from "third-party" services and data providers. That is, any data provider who creates a query form to an archive containing a collection of data (images, catalogs, spectra) can direct the result files from the query into OASIS. Similarly, data providers who serve links to datasets or remote services on a web page can access all of these data with one instance of OASIS. In this way, any data or service provider is given access to the full suite of capabilities of OASIS. We illustrate the "third-party" access feature with two examples: queries to the high-energy image datasets accessible from GSFC SkyView, and links to data that are returned from a target-based query to the NASA Extragalactic Database (NED). The second release of OASIS also includes a file-transfer manager that reports the status of multiple data downloads from remote sources to the client machine. It is a prototype for a request management system that will ultimately control and manage compute-intensive jobs submitted through OASIS to computing grids, such as requests for large-scale image mosaics and bulk statistical analysis.
The PDS-based Data Processing, Archiving and Management Procedures in Chang'e Mission
NASA Astrophysics Data System (ADS)
Zhang, Z. B.; Li, C.; Zhang, H.; Zhang, P.; Chen, W.
2017-12-01
PDS is adopted as the standard format for scientific data and as the foundation of all data-related procedures in the Chang'e mission. Unlike the geographically distributed nature of the Planetary Data System, all procedures of data processing, archiving, management and distribution are carried out at the headquarters of the Ground Research and Application System of the Chang'e mission in a centralized manner. The raw data acquired by the ground stations are transmitted to and processed by the data preprocessing subsystem (DPS) for the production of PDS-compliant Level 0 to Level 2 data products using established algorithms, with each product file described by an attached label; all products with the same orbit number are then put together into a scheduled archiving task along with an XML archive list file recording each product file's properties, such as file name and file size. After receiving the archive request from the DPS, the data management subsystem (DMS) is invoked to parse the XML list file, to validate all the claimed files and their compliance with PDS using a prebuilt data dictionary, and to extract metadata for each data product file from its PDS label and the fields of its normalized filename. Various requirements of data management, retrieval, distribution and application can be met using flexible combinations of the rich metadata enabled by PDS. In the forthcoming CE-5 mission, the design of data structures and procedures will be updated from PDS version 3, used in the previous CE-1, CE-2 and CE-3 missions, to the new version 4. The main changes are: 1) a dedicated detached XML label will be used to describe the corresponding scientific data acquired by the 4 instruments carried, and the XML parsing framework used in archive list validation will be reused for the label after some necessary adjustments; 2) all image data acquired by the panorama camera, landing camera and lunar mineralogical spectrometer will use an Array_2D_Image/Array_3D_Image object to store image data and a Table_Character object to store the image frame header, while the tabulated data acquired by the lunar regolith penetrating radar will use a Table_Binary object to store measurements.
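The archive hand-off described above hinges on an XML list that names every product file and its properties, which the DMS validates before extracting metadata. The sketch below shows how such a list might be parsed and checked with the Python standard library; the tag names and the filename convention are assumptions, not the actual Chang'e archive format.

```python
# Hypothetical parse-and-validate step for an XML archive list, in the spirit of
# the Chang'e workflow described above. Tag names and the filename convention
# are assumptions, not the mission's actual format.
import re
import xml.etree.ElementTree as ET

ARCHIVE_LIST = """<archive orbit="1234">
  <product file="CE2_GRAS_CCD_SCI_N_1234_A.2C" size="1048576"/>
  <product file="CE2_GRAS_IIM_SCI_N_1234_B.2C" size="524288"/>
</archive>"""

# Assumed normalized filename pattern: mission_instrument_..._orbit_version.level
FILENAME_RE = re.compile(r"^CE\d_[A-Z]+_[A-Z]+_SCI_[A-Z]_(\d{4})_[A-Z]\.\w+$")


def validate_archive_list(xml_text: str) -> list[dict]:
    """Return metadata for every product whose declared name passes validation."""
    root = ET.fromstring(xml_text)
    orbit = root.get("orbit")
    products = []
    for prod in root.findall("product"):
        name = prod.get("file")
        match = FILENAME_RE.match(name)
        if match is None or match.group(1) != orbit:
            raise ValueError(f"archive list entry fails validation: {name}")
        products.append({"file": name, "orbit": orbit,
                         "size": int(prod.get("size"))})
    return products


if __name__ == "__main__":
    for meta in validate_archive_list(ARCHIVE_LIST):
        print(meta)
```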
MIDG-Emerging grid technologies for multi-site preclinical molecular imaging research communities.
Lee, Jasper; Documet, Jorge; Liu, Brent; Park, Ryan; Tank, Archana; Huang, H K
2011-03-01
Molecular imaging is the visualization and identification of specific molecules in anatomy for insight into metabolic pathways, tissue consistency, and tracing of solute transport mechanisms. This paper presents the Molecular Imaging Data Grid (MIDG), which utilizes emerging grid technologies in preclinical molecular imaging to facilitate data sharing and discovery between preclinical molecular imaging facilities and their collaborating investigator institutions to expedite translational sciences research. Grid-enabled archiving, management, and distribution of animal-model imaging datasets help preclinical investigators to monitor, access and share their imaging data remotely, and promote preclinical imaging facilities to share published imaging datasets as resources for new investigators. The system architecture of the Molecular Imaging Data Grid is described in a four-layer diagram. A data model for preclinical molecular imaging datasets is also presented based on imaging modalities currently used in a molecular imaging center. The MIDG system components and connectivity are presented. Finally, the workflow steps for grid-based archiving, management, and retrieval of preclinical molecular imaging data are described. Initial performance tests of the Molecular Imaging Data Grid system have been conducted at the USC IPILab using dedicated VMware servers. System connectivity, evaluated datasets, and preliminary results are presented. The results show the system's feasibility and limitations, and indicate directions for future research. Translational and interdisciplinary research in medicine is increasingly interested in cellular and molecular biology activity at the preclinical levels, utilizing molecular imaging methods on animal models. The task of integrated archiving, management, and distribution of these preclinical molecular imaging datasets at preclinical molecular imaging facilities is challenging due to disparate imaging systems and multiple off-site investigators. A Molecular Imaging Data Grid design, implementation, and initial evaluation are presented to demonstrate this secure and novel data grid solution for sharing preclinical molecular imaging data across the wide-area network (WAN).
Use of film digitizers to assist radiology image management
NASA Astrophysics Data System (ADS)
Honeyman-Buck, Janice C.; Frost, Meryll M.; Staab, Edward V.
1996-05-01
The purpose of this development effort was to evaluate the possibility of using digital technologies to solve image management problems in the Department of Radiology at the University of Florida. The three problem areas investigated were local interpretation of images produced in remote locations, distribution of images to areas outside of radiology, and film handling. In all cases the use of a laser film digitizer interfaced to an existing Picture Archiving and Communication System (PACS) was investigated as a solution to the problem. In each case the volume of studies involved was evaluated to estimate the impact of the solution on the network, archive, and workstations. Communications were stressed in the analysis of the needs for all image transmission. The operational aspects of the solution were examined to determine the needs for training, service, and maintenance. The remote sites requiring local interpretation were a rural hospital needing coverage for after-hours studies, the University of Florida student infirmary, and the emergency room. Distribution of images to the intensive care units was studied to improve image access and patient care. Handling of films originating from remote sites and those requiring urgent reporting was evaluated to improve management functions. The results of our analysis and the decisions that were made based on the analysis are described below. In the cases where systems were installed, a description of the system and its integration into the PACS system is included. For all three problem areas, although we could move images via a digitizer to the archive and a workstation, there was no way to inform the radiologist that a study needed attention. In the case of outside films, the patient did not always have a medical record number that matched one in our Radiology Information System (RIS). In order to incorporate all studies for a patient, we needed common locations for orders, reports, and images. RIS orders were generated for each outside study to be interpreted and a medical record number assigned if none existed. All digitized outside films were archived in the PACS archive for later review or comparison use. The request generated by the RIS requesting a diagnostic interpretation was placed at the PACS workstation to alert the radiologists that unread images had arrived, and a box was added to the workstation user interface that could be checked by the radiologist to indicate that a report had been dictated. The digitizer system solved several problems: unavailable films in the emergency room, teleradiology, and archiving of outside studies that had been read by University of Florida radiologists. In addition to saving time for outside film management, we now store the studies for comparison purposes, no longer lose emergency room films, generate diagnostic reports on emergency room films in a timely manner (important for billing and reimbursement), and can handle the distributed nature of our business. As changes in health care drive management changes, existing tools can be used in new ways to help make the transition easier. In this case, adding digitizers to an existing PACS network helped solve several image management problems.
A case for automated tape in clinical imaging.
Bookman, G; Baune, D
1998-08-01
Electronic archiving of radiology images over many years will require many terabytes of storage with a need for rapid retrieval of these images. As more large PACS installations are implemented, a data crisis occurs. The ability to store this large amount of data using the traditional method of optical jukeboxes or online disk alone becomes an unworkable solution. The amount of floor space, number of optical jukeboxes, and off-line shelf storage required to store the images becomes unmanageable. With the recent advances in tape and tape drives, the use of tape for long-term storage of PACS data has become the preferred alternative. A PACS consisting of a centrally managed system of RAID disk, software and, at the heart of the system, tape presents a solution that for the first time solves the problems of multi-modality high-end PACS, non-DICOM image, electronic medical record and ADT data storage. This paper will examine the installation of the University of Utah Department of Radiology PACS system and the integration of an automated tape archive. The tape archive is also capable of storing data other than traditional PACS data. The implementation of an automated data archive to serve the many other needs of a large hospital will also be discussed. This will include the integration of a filmless cardiology department and the backup/archival needs of a traditional MIS department. The need for high bandwidth to tape with a large RAID cache will be examined, along with how, with an interface to a RIS pre-fetch engine, tape can be a superior solution to optical platters or other archival solutions. The data management software will be discussed in detail. The performance and cost of RAID disk cache and automated tape, compared to a solution that includes optical storage, will also be examined.
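The case for tape made above relies on a RIS-driven pre-fetch engine: when the next day's schedule is known, prior studies can be staged from tape into the RAID cache before anyone asks for them, hiding tape latency. The sketch below illustrates that staging logic only; the schedule format, study keys and cache interface are assumptions.

```python
# Hypothetical RIS-driven pre-fetch: stage prior studies from tape into the RAID
# cache ahead of scheduled exams. Schedule format and interfaces are assumptions.

# Prior studies already known to be on fast storage.
raid_cache = {"PAT001:CT:2024-01-10"}

# Studies held only on the tape library (study key -> tape volume).
tape_index = {
    "PAT001:CT:2022-06-01": "TAPE-017",
    "PAT002:MR:2023-03-15": "TAPE-042",
}


def prefetch_for_schedule(schedule: list[dict]) -> list[tuple[str, str]]:
    """Return (study, tape_volume) pairs to stage for tomorrow's scheduled exams."""
    to_stage = []
    for exam in schedule:
        for study, volume in tape_index.items():
            patient, modality, _ = study.split(":")
            # Stage priors of the same patient and modality that are not cached yet.
            if (patient == exam["patient_id"] and modality == exam["modality"]
                    and study not in raid_cache):
                to_stage.append((study, volume))
    return to_stage


if __name__ == "__main__":
    tomorrow = [{"patient_id": "PAT001", "modality": "CT"},
                {"patient_id": "PAT002", "modality": "MR"}]
    for study, volume in prefetch_for_schedule(tomorrow):
        print(f"stage {study} from {volume} into the RAID cache overnight")
```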
Cardio-PACs: a new opportunity
NASA Astrophysics Data System (ADS)
Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary
2000-05-01
It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications include: acquire 30 frames/sec; replay 15 frames/sec; access to file server 5 seconds, and to archive 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for catheterization laboratory, file server, long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.
NASA Astrophysics Data System (ADS)
Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.
1999-10-01
Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited; e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.
Increasing Access and Usability of Remote Sensing Data: The NASA Protected Area Archive
NASA Technical Reports Server (NTRS)
Geller, Gary N.
2004-01-01
Although remote sensing data are now widely available, much of it at low or no cost, many managers of protected conservation areas do not have the expertise or tools to view or analyze it. Thus access to it by the protected area management community is effectively blocked. The Protected Area Archive will increase access to remote sensing data by creating collections of satellite images of protected areas and packaging them with simple-to-use visualization and analytical tools. The user can easily locate the area and image of interest on a map, then display, roam, and zoom the image. A set of simple tools will be provided so the user can explore the data and employ it to assist in management and monitoring of their area. The 'Phase 1' version requires only a Windows-based computer and basic computer skills, and may be of particular help to protected area managers in developing countries.
Digital information management: a progress report on the National Digital Mammography Archive
NASA Astrophysics Data System (ADS)
Beckerman, Barbara G.; Schnall, Mitchell D.
2002-05-01
Digital mammography creates very large images, which require new approaches to storage, retrieval, management, and security. The National Digital Mammography Archive (NDMA) project, funded by the National Library of Medicine (NLM), is developing a limited testbed that demonstrates the feasibility of a national breast imaging archive, with access to prior exams; patient information; computer aids for image processing, teaching, and testing tools; and security components to ensure confidentiality of patient information. There will be significant benefits to patients and clinicians in terms of accessible data with which to make a diagnosis, and to researchers performing studies on breast cancer. Mammography was chosen for the project because standards were already available for digital images, report formats, and structures. New standards have been created for communications protocols between devices, the front-end portal, and the archive. NDMA is a distributed computing concept that provides for sharing and access across corporate entities. Privacy, auditing, and patient consent are all integrated into the system. Five sites, the Universities of Pennsylvania, Chicago, North Carolina and Toronto, and BWXT Y12, are connected through high-speed networks to demonstrate functionality. We will review progress, including technical challenges, innovative research and development activities, standards and protocols being implemented, and potential benefits to healthcare systems.
Data grid: a distributed solution to PACS
NASA Astrophysics Data System (ADS)
Zhang, Xiaoyan; Zhang, Jianguo
2004-04-01
In a hospital, various kinds of medical images acquired from different modalities are generally used and stored in different departments, and each modality usually has several attached workstations to display or process images. To make better diagnoses, radiologists or physicians often need to retrieve other kinds of images for reference. The traditional image storage solution is to build up a large-scale PACS archive server. However, the disadvantages of purely centralized management of a PACS archive server are obvious. Besides high costs, any failure of the PACS archive server would cripple the entire PACS operation. Here we present a new approach to developing a storage grid in PACS, which can provide more reliable image storage and more efficient query/retrieval for hospital-wide applications. In this paper, we also give a performance evaluation comparing three popular technologies: mirroring, clustering and grid.
Clinical experience with a high-performance ATM-connected DICOM archive for cardiology
NASA Astrophysics Data System (ADS)
Solomon, Harry P.
1997-05-01
A system to archive large image sets, such as cardiac cine runs, with near-realtime response must address several functional and performance issues, including efficient use of a high-performance network connection with standard protocols, an architecture which effectively integrates both short- and long-term mass storage devices, and a flexible data management policy which allows optimization of image distribution and retrieval strategies based on modality and site-specific operational use. Clinical experience with such an archive has allowed evaluation of these systems issues and refinement of a traffic model for cardiac angiography.
Multi-provider architecture for cloud outsourcing of medical imaging repositories.
Godinho, Tiago Marques; Bastião Silva, Luís A; Costa, Carlos; Oliveira, José Luís
2014-01-01
Over the last few years, the extended usage of medical imaging procedures has drawn the medical community's attention towards the optimization of their workflows. More recently, the federation of multiple institutions into a seamless distribution network has brought hope of increased quality of healthcare services along with more efficient resource management. As a result, medical institutions are constantly looking for the best infrastructure to deploy their imaging archives. In this scenario, public cloud infrastructures arise as major candidates, as they offer elastic storage space and optimal data availability without large maintenance costs or IT personnel requirements, in a pay-as-you-go model. However, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. This document proposes a multi-provider architecture for integration of outsourced archives with in-house PACS resources, taking advantage of foreign providers to store medical imaging studies, without disregarding security. It enables the retrieval of images from multiple archives simultaneously, improving performance and data availability and avoiding the vendor lock-in problem. Moreover, it enables load balancing and cache techniques.
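Retrieving images from several outsourced archives at once is the central performance claim above. The sketch below shows a generic fan-out with Python's concurrent.futures that keeps whichever provider answers first for each object; the provider list and the fetch function are placeholders, not the architecture published in the paper.

```python
# Hypothetical fan-out retrieval across several archive providers, returning the
# first successful answer per object. Providers and the fetch function are
# placeholders; the real architecture in the paper is more elaborate.
from concurrent.futures import ThreadPoolExecutor, as_completed

PROVIDERS = ["cloud-a.example.org", "cloud-b.example.org", "inhouse-pacs.local"]


def fetch_from(provider: str, object_uid: str) -> bytes:
    """Placeholder: a real implementation would issue a DICOMweb/WADO request."""
    if provider == "inhouse-pacs.local":
        raise TimeoutError("archive unreachable")   # simulate a failing replica
    return f"{object_uid} from {provider}".encode()


def fetch_first(object_uid: str) -> bytes:
    """Ask every provider concurrently and keep the first successful response."""
    with ThreadPoolExecutor(max_workers=len(PROVIDERS)) as pool:
        futures = [pool.submit(fetch_from, p, object_uid) for p in PROVIDERS]
        for future in as_completed(futures):
            try:
                return future.result()
            except Exception:
                continue                             # try the next provider
    raise RuntimeError(f"no provider could supply {object_uid}")


if __name__ == "__main__":
    print(fetch_first("1.2.840.777.3").decode())
```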
36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...
36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...
36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?
Code of Federal Regulations, 2014 CFR
2014-07-01
... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...
Using and Distributing Spaceflight Data: The Johnson Space Center Life Sciences Data Archive
NASA Technical Reports Server (NTRS)
Cardenas, J. A.; Buckey, J. C.; Turner, J. N.; White, T. S.; Havelka,J. A.
1995-01-01
Life sciences data collected before, during and after spaceflight are valuable and often irreplaceable, yet such data can be hard to find. The Johnson Space Center Life Sciences Data Archive has been designed to provide researchers, engineers, managers and educators interactive access to information about and data from human spaceflight experiments. The archive system consists of a Data Acquisition System, Database Management System, CD-ROM Mastering System and Catalog Information System (CIS). The catalog information system is the heart of the archive. The CIS provides detailed experiment descriptions (both written and as QuickTime movies), hardware descriptions, hardware images, documents, and data. An initial evaluation of the archive at a scientific meeting showed that 88% of those who evaluated the catalog want to use the system when completed. The majority of the evaluators found the archive flexible, satisfying and easy to use. We conclude that the data archive effectively provides key life sciences data to interested users.
NASA Technical Reports Server (NTRS)
Graves, Sara J.
1994-01-01
Work on this project was focused on information management techniques for Marshall Space Flight Center's EOSDIS Version 0 Distributed Active Archive Center (DAAC). The centerpiece of this effort has been participation in EOSDIS catalog interoperability research, the result of which is a distributed Information Management System (IMS) allowing the user to query the inventories of all the DAACs from a single user interface. UAH has provided the MSFC DAAC database server for the distributed IMS, and has contributed to the definition and development of the browse image display capabilities in the system's user interface. Another important area of research has been the generation of value-based metadata through data mining. In addition, information management applications for local inventory and archive management, and for tracking data orders, were provided.
Cities at Night: Citizens science to rescue an archive for the science
NASA Astrophysics Data System (ADS)
Sánchez de Miguel, Alejandro; Gomez Castaño, José; Lombraña, Daniel; Zamorano, Jaime; Gallego, Jesús
2015-08-01
Since 2003, astronauts have been taking photos from the International Space Station. Many of these images have been published on the websites of participating agencies or the Twitter accounts of the astronauts. However, most of the images taken by astronauts have not been published, remaining in the archive without being shown to the world. This ISS archive of nighttime images has not been used for scientific projects because of the difficulty of cataloging it. The Cities at Night project has managed to prepare these images for science. The main goal of the project is to characterize light pollution in color, which is fundamental to tracking the impact of new LED lighting on light pollution. However, other science can benefit from the project as well, such as the study of meteors, studies of auroras, and general knowledge of these images. The current status of the project, its methodology, and ideas for exploiting the same platform for other projects are presented. The main result of the project so far is the complete documentation of the entire high-resolution image archive in just one month. Until now, more than 132,000 images have been catalogued (30,000 of those are cities), more than 2,800 images have been located, and 1,000 have been georeferenced. Several meteors have also been detected in non-dedicated pictures. More than 16,000 people have participated.
Imaged Document Optical Correlation and Conversion System (IDOCCS)
NASA Astrophysics Data System (ADS)
Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.
1999-03-01
Today, the paper document is fast becoming a thing of the past. With the rapid development of fast, inexpensive computing and storage devices, many government and private organizations are archiving their documents in electronic form (e.g., personnel records, medical records, patents, etc.). In addition, many organizations are converting their paper archives to electronic images, which are stored in a computer database. Because of this, there is a need to efficiently organize this data into comprehensive and accessible information resources. The Imaged Document Optical Correlation and Conversion System (IDOCCS) provides a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval capability of document images. The IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives and can even determine the types of languages contained within a document. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited, e.g., imaged documents containing an agency's seal or logo, or documents with a particular individual's signature block, can be singled out. With this dual capability, IDOCCS outperforms systems that rely on optical character recognition as a basis for indexing and storing only the textual content of documents for later retrieval.
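IDOCCS performs its matching with optical (hardware) correlation, but the underlying operation, locating a small template such as a logo or signature block within a page image, can be illustrated digitally with FFT-based cross-correlation. The NumPy sketch below is that digital analogue under stated simplifications, not the Litton implementation.

```python
# Digital analogue of the correlation step: locate a small template (e.g., a logo)
# in a page image by FFT-based cross-correlation. This illustrates the matching
# principle only, not the optical-correlator hardware used by IDOCCS.
import numpy as np


def correlate(page: np.ndarray, template: np.ndarray) -> tuple[int, int]:
    """Return the (row, col) where the zero-mean template matches best."""
    t = template - template.mean()                 # ignore uniform background
    kernel = np.zeros_like(page, dtype=float)
    kernel[:t.shape[0], :t.shape[1]] = t
    # Cross-correlation via the frequency domain: conj(FFT(kernel)) * FFT(page).
    score = np.fft.ifft2(np.conj(np.fft.fft2(kernel)) * np.fft.fft2(page)).real
    row, col = np.unravel_index(np.argmax(score), score.shape)
    return int(row), int(col)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    page = rng.random((256, 256))                  # stand-in for a scanned page
    logo = rng.random((16, 16))                    # stand-in for an agency seal
    page[100:116, 60:76] = logo                    # paste the 'logo' onto the page
    print(correlate(page, logo))                   # expected: (100, 60)
```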
Picture archiving and communication system--Part one: Filmless radiology and distance radiology.
De Backer, A I; Mortelé, K J; De Keulenaer, B L
2004-01-01
Picture archiving and communication system (PACS) is a collection of technologies used to carry out digital medical imaging. PACS is used to digitally acquire medical images from the various modalities, such as computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, and digital projection radiography. The image data and pertinent information are transmitted to other and possibly remote locations over networks, where they may be displayed on computer workstations for soft copy viewing in multiple locations, thus permitting simultaneous consultations and almost instant reporting from radiologists at a distance. Data are secured and archived on digital media such as optical disks or tape, and may be automatically retrieved as necessary. Close integration with the hospital information system (HIS)--radiology information system (RIS) is critical for system functionality. Medical image management systems are maturing, providing access outside of the radiology department to images throughout the hospital via the Ethernet, at different hospitals, or from a home workstation if teleradiology has been implemented.
A PDA study management tool (SMT) utilizing wireless broadband and full DICOM viewing capability
NASA Astrophysics Data System (ADS)
Documet, Jorge; Liu, Brent; Zhou, Zheng; Huang, H. K.; Documet, Luis
2007-03-01
During the last 4 years, the IPI (Image Processing and Informatics) Laboratory has been developing a web-based Study Management Tool (SMT) application that allows radiologists, film librarians and PACS-related (Picture Archiving and Communication System) users to dynamically and remotely perform Query/Retrieve operations in a PACS network. Using a regular PDA (Personal Digital Assistant), users can remotely query a PACS archive to distribute any study to an existing DICOM (Digital Imaging and Communications in Medicine) node. This application, which has proven convenient for managing the study workflow [1, 2], has been extended to include a DICOM viewing capability on the PDA. With this new feature, users can take a quick look at DICOM images, which provides mobility and convenience at the same time. In addition, we are extending this application to metropolitan-area wireless broadband networks. This feature requires smart phones that are capable of working as a PDA and have access to broadband wireless services. With the extended application to wireless broadband technology and the preview of DICOM images, the Study Management Tool becomes an even more powerful tool for clinical workflow management.
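The Query/Retrieve operation triggered from the PDA ultimately comes down to sending a DICOM C-FIND identifier to the archive and then requesting a C-MOVE to the chosen node. The sketch below only builds a study-level C-FIND identifier with pydicom as an assumed illustration; it does not reproduce the SMT itself, and the network step (association, C-FIND/C-MOVE) is deliberately left out.

```python
# Hypothetical study-level C-FIND identifier such as a query/retrieve tool might
# send to a PACS archive. Built with pydicom; the actual SMT implementation and
# the network layer (association, C-FIND/C-MOVE) are not shown.
from pydicom.dataset import Dataset


def build_study_query(patient_id: str, study_date: str) -> Dataset:
    ds = Dataset()
    ds.QueryRetrieveLevel = "STUDY"        # query at the study level
    ds.PatientID = patient_id              # matching key
    ds.StudyDate = study_date              # e.g. a range "20070101-20070131"
    # Empty attributes are return keys: the archive fills them in its responses.
    ds.StudyInstanceUID = ""
    ds.StudyDescription = ""
    ds.ModalitiesInStudy = ""
    return ds


if __name__ == "__main__":
    query = build_study_query("PAT0042", "20070101-20070131")
    print(query)
```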
ERIC Educational Resources Information Center
Haapaniemi, Peter
1990-01-01
Describes imaging technology, which allows huge numbers of words and illustrations to be reduced to a tiny fraction of the space required by the originals, and discusses current applications. Highlights include an image processing system at the National Archives; use by banks for high-speed check processing; engineering document management systems (EDMS); folder…
Data management and digital delivery of analog data
Miller, W.A.; Longhenry, Ryan; Smith, T.
2008-01-01
The U.S. Geological Survey's (USGS) data archive at the Earth Resources Observation and Science (EROS) Center is a comprehensive and impartial record of the Earth's changing land surface. USGS/EROS has been archiving and preserving land remote sensing data for over 35 years. This remote sensing archive continues to grow as aircraft and satellites acquire more imagery. As a world leader in preserving data, USGS/EROS has a reputation as a technological innovator in solving challenges and ensuring that access to these collections is available. Other agencies also call on the USGS to consider their collections for long-term archive support. To improve access to the USGS film archive, each frame on every roll of film is being digitized by automated high performance digital camera systems. The system robotically captures a digital image from each film frame for the creation of browse and medium resolution image files. Single frame metadata records are also created to improve access that otherwise involves interpreting flight indexes. USGS/EROS is responsible for over 8.6 million frames of aerial photographs and 27.7 million satellite images.
36 CFR § 1225.24 - When can an agency apply previously approved schedules to electronic records?
Code of Federal Regulations, 2013 CFR
2013-07-01
... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...
Satellite-Derived Management Zones
NASA Technical Reports Server (NTRS)
Lepoutre, Damien; Layrol, Laurent
2005-01-01
The term "satellite-derived management zones" (SAMZ) denotes agricultural management zones that are subdivisions of large fields and that are derived from images of the fields acquired by instruments aboard Earth-orbiting satellites during approximately the past 15 years. "SAMZ" also denotes the methodology and the software that implements the methodology for creating such zones. The SAMZ approach is one of several products of continuing efforts to realize a concept of precision agriculture, which involves optimal variations in seeding, in application of chemicals, and in irrigation, plus decisions to farm or not to farm certain portions of fields, all in an effort to maximize profitability in view of spatial and temporal variations in the growth and health of crops, and in the chemical and physical conditions of soils. As used here, "management zone" signifies, more precisely, a subdivision of a field within which the crop-production behavior is regarded as homogeneous. From the perspective of precision agriculture, management zones are the smallest subdivisions between which the seeding, application of chemicals, and other management parameters are to be varied. In the SAMZ approach, the main sources of data are the archives of satellite imagery that have been collected over the years for diverse purposes. One of the main advantages afforded by the SAMZ approach is that the data in these archives can be reused for purposes of precision agriculture at low cost. De facto, these archives contain information on all sources of variability within a field, including weather, crop types, crop management, soil types, and water drainage patterns. The SAMZ methodology involves the establishment of a Web-based interface based on an algorithm that generates management zones automatically and quickly from archival satellite image data in response to requests from farmers. A farmer can make a request by either uploading data describing a field boundary to the Web site or else drawing the boundary on a reference image. Hence, a farmer can start to engage in precision farming shortly after gaining access to the Web site, without the need for incurring the high costs of conventional precision-agriculture data-collection practices that include collecting soil samples, mapping electrical conductivity of soil, and compiling multiyear crop-yield data. Given the boundary of a field, a SAMZ server computes the zones within the field in a three-stage process. In the first stage, a vector-valued image of the field is constructed by assembling, from the archives, the equivalent of a stack of the available images of the field (see figure). In the second stage, the vector-valued image is analyzed by use of a wavelet transform that detects spatial variations considered significant for precision farming while suppressing small-scale heterogeneities that are regarded as insignificant. In the third stage, a segmentation algorithm assembles the zones from smaller regions that have been identified in the wavelet analysis.
SAMZ: Satellite-Derived Management Zones
NASA Technical Reports Server (NTRS)
2004-01-01
The term "satellite-derived management zones" (SAMZ) denotes agricultural management zones that are subdivisions of large fields and that are derived from images of the fields acquired by instruments aboard Earth orbiting satellites during approximately the past 15 years. "SAMZ" also denotes the methodology and the software that implements the methodology for creating such zones. The SAMZ approach is one of several products of continuing efforts to realize a concept of precision agriculture, which involves optimal variations in seeding, in application of chemicals, and in irrigation, plus decisions to farm or not to farm certain portions of fields, all in an effort to maximize profitability in view of spatial and temporal variations in the growth and health of crops and in the chemical and physical conditions of soils. As used here, "management zone" signifies, more precisely, a subdivision of a field within which the crop production behavior is regarded as homogeneous. From the perspective of precision agriculture, management zones are the smallest subdivisions between which the seeding, application of chemicals, and other management parameters are to be varied. In the SAMZ approach, the main sources of data are the archives of satellite imagery that have been collected over the years for diverse purposes. One of the main advantages afforded by the SAMZ approach is that the data in these archives can be reused for purposes of precision agriculture at low cost. De facto, these archives contain information on all sources of variability within a field, including weather, crop types, crop management, soil types, and water drainage patterns. The SAMZ methodology involves the establishment of a Web-based interface based on an algorithm that generates management zones automatically and quickly from archival satellite image data in response to requests from farmers. A farmer can make a request by either uploading data describing a field boundary to the Web site or else drawing the boundary on a reference image. Hence, a farmer can start to engage in precision farming shortly after gaining access to the Web site, without need for incurring the high costs of conventional precision-agriculture data-collection practices that include collecting soil samples, mapping electrical conductivity of soil, and compiling multi-year crop-yield data. Given the boundary of a field, a SAMZ server computes the zones within the field in a three-stage process. In the first stage, a vector-valued image of the field is constructed by assembling, from the archives, the equivalent of a stack of the available images of the field (see figure). In the second stage, the vector-valued image is analyzed by use of a wavelet transform that detects spatial variations considered significant for precision farming while suppressing small-scale heterogeneities that are regarded as insignificant. In the third stage, a segmentation algorithm assembles the zones from smaller regions that have been identified in the wavelet analysis.
Hierarchical storage of large volume of multidetector CT data using distributed servers
NASA Astrophysics Data System (ADS)
Ratib, Osman; Rosset, Antoine; Heuberger, Joris; Bandon, David
2006-03-01
Multidetector scanners and hybrid multimodality scanners can generate large numbers of high-resolution images, resulting in very large data sets. In most cases, these datasets are generated for the sole purpose of producing secondary processed images, 3D rendered images, and oblique and curved multiplanar reformatted images. It is therefore not essential to archive the original images after they have been processed. We have developed an architecture of distributed archive servers for temporary storage of large image datasets for 3D rendering and image processing without the need for long-term storage in the PACS archive. With the relatively low cost of storage devices, it is possible to configure these servers to hold several months or even years of data, long enough to allow subsequent re-processing if required by specific clinical situations. We tested the latest generation of RAID servers provided by Apple Computer with a capacity of 5 TBytes. We implemented peer-to-peer data access software based on our open-source image management software, OsiriX, allowing remote workstations to directly access DICOM image files located on the server through a new technology called "Bonjour". This architecture offers seamless integration of multiple servers and workstations without the need for a central database or complex workflow management tools. It allows efficient access to image data from multiple workstations for image analysis and visualization without the need for image data transfer. It provides a convenient alternative to a centralized PACS architecture while avoiding complex and time-consuming data transfer and storage.
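The "Bonjour"-based peer discovery mentioned above can be sketched with the third-party Python zeroconf package, which speaks the same mDNS/DNS-SD protocols. This is an illustration only: the "_imgshare._tcp." service type is a made-up placeholder, not the service name OsiriX actually advertises.

```python
# Sketch of Bonjour-style (mDNS/DNS-SD) discovery of image-sharing peers, in
# the spirit of the architecture above. Uses the third-party `zeroconf`
# package; the service type below is a hypothetical placeholder.
import time
from zeroconf import Zeroconf, ServiceBrowser

class PeerListener:
    def add_service(self, zc, service_type, name):
        info = zc.get_service_info(service_type, name)
        if info:
            print(f"found peer {name} at {info.parsed_addresses()}:{info.port}")

    def remove_service(self, zc, service_type, name):
        print(f"peer {name} went away")

    def update_service(self, zc, service_type, name):
        pass  # required by newer zeroconf versions

zc = Zeroconf()
browser = ServiceBrowser(zc, "_imgshare._tcp.local.", PeerListener())
try:
    time.sleep(10)      # browse for a few seconds, then shut down
finally:
    zc.close()
```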
Imaged document information location and extraction using an optical correlator
NASA Astrophysics Data System (ADS)
Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.
1999-12-01
Today, the paper document is fast becoming a thing of the past. With the rapid development of fast, inexpensive computing and storage devices, many government and private organizations are archiving their documents in electronic form (e.g., personnel records, medical records, patents, etc.). Many of these organizations are converting their paper archives to electronic images, which are then stored in a computer database. Because of this, there is a need to efficiently organize this data into comprehensive and accessible information resources and provide for rapid access to the information contained within these imaged documents. To meet this need, Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides a means for the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives and has the potential to determine the types of languages contained within a document. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited, e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.
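Optical correlators perform, in hardware, what amounts to a two-dimensional cross-correlation of a word or logo template against the page image. The sketch below shows the digital analogue of that matching step with NumPy and SciPy; it illustrates the principle only and makes no claim about the IDOCCS processor itself (the toy page, template, and threshold are arbitrary).

```python
# Digital analogue of correlation-based word/logo spotting: cross-correlate a
# small template with a page image and threshold the peaks. Toy data only.
import numpy as np
from scipy.signal import fftconvolve

def correlation_peaks(page, template, threshold=0.8):
    """Return (row, col) locations where the normalized correlation is high."""
    t = template - template.mean()
    # Correlation = convolution with the template flipped in both axes.
    corr = fftconvolve(page - page.mean(), t[::-1, ::-1], mode="same")
    corr /= np.abs(corr).max()           # crude normalization for the sketch
    return np.argwhere(corr >= threshold)

# Toy data: a random "page" with the "template" pasted in at a known spot.
rng = np.random.default_rng(0)
page = rng.random((200, 300))
template = rng.random((12, 40))
page[50:62, 100:140] = template
print(correlation_peaks(page, template)[:5])
```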
Rendering an archive in three dimensions
NASA Astrophysics Data System (ADS)
Leiman, David A.; Twose, Claire; Lee, Teresa Y. H.; Fletcher, Alex; Yoo, Terry S.
2003-05-01
We examine the requirements for a publicly accessible, online collection of three-dimensional biomedical image data, including data yielded by radiological processes such as MRI and ultrasound. We created the National Online Volumetric Archive (NOVA), intended as a repository and distribution mechanism for such medical data, as a case study aimed at identifying the multiple issues involved in realizing a large-scale digital archive. In the paper we discuss factors such as current legal and health-information privacy policies affecting the collection of human medical images, information retrieval and management, and technical implementation. The project culminated in the launch of a website that includes downloadable datasets and a prototype data submission system.
ERIC Educational Resources Information Center
Fantini, M.; And Others
1990-01-01
Describes the architecture of the prototype of an image management system that has been used to develop an application concerning images of frescoes in the Sistina Chapel in the Vatican. Hardware and software design are described, the use of local area networks (LANs) is discussed, and data organization is explained. (15 references) (LRW)
NASA Astrophysics Data System (ADS)
Smith, Edward M.; Wandtke, John; Robinson, Arvin E.
1999-07-01
The Medical Information, Communication and Archive System (MICAS) is a multi-modality integrated image management system that is seamlessly integrated with the Radiology Information System (RIS). The project was initiated in the summer of 1995, with the first phase installed during the first half of 1997 and the second phase installed during the summer of 1998. Phase II enhancements include a permanent archive, automated workflow including a modality worklist, study caches, and NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. This multi-vendor, phased approach to PACS implementation is designed as an enterprise-wide PACS to provide images and reports throughout our healthcare network. MICAS demonstrates that a multi-vendor, open-system, phased approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.
Migration of medical image data archived using mini-PACS to full-PACS.
Jung, Haijo; Kim, Hee-Joung; Kang, Won-Suk; Lee, Sang-Ho; Kim, Sae-Rome; Ji, Chang Lyong; Kim, Jung-Han; Yoo, Sun Kook; Kim, Ki-Hwang
2004-06-01
This study evaluated the migration to full-PACS of medical image data archived using mini-PACS at two hospitals of the Yonsei University Medical Center, Seoul, Korea. A major concern in the migration of medical data is to match the image data from the mini-PACS with the hospital OCS (Ordered Communication System). Prior to carrying out the actual migration process, the principles, methods, and anticipated results for the migration with respect to both cost and effectiveness were evaluated. Migration gateway workstations were established and a migration software tool was developed. The actual migration process was performed based on the results of several migration simulations. Our conclusions were that a migration plan should be carefully prepared and tailored to the individual hospital environment because the server system, archive media, network, OCS, and policy for data management may be unique.
Gutman, David A; Khalilia, Mohammed; Lee, Sanghoon; Nalisnik, Michael; Mullen, Zach; Beezley, Jonathan; Chittajallu, Deepak R; Manthey, David; Cooper, Lee A D
2017-11-01
Tissue-based cancer studies can generate large amounts of histology data in the form of glass slides. These slides contain important diagnostic, prognostic, and biological information and can be digitized into expansive and high-resolution whole-slide images using slide-scanning devices. Effectively utilizing digital pathology data in cancer research requires the ability to manage, visualize, share, and perform quantitative analysis on these large amounts of image data, tasks that are often complex and difficult for investigators with the current state of commercial digital pathology software. In this article, we describe the Digital Slide Archive (DSA), an open-source web-based platform for digital pathology. DSA allows investigators to manage large collections of histologic images and integrate them with clinical and genomic metadata. The open-source model enables DSA to be extended to provide additional capabilities. Cancer Res; 77(21); e75-78. ©2017 AACR.
Status of the Landsat thematic mapper and multispectral scanner archive conversion system
Werner, Darla J.
1993-01-01
The U.S. Geological Survey's EROS Data Center (EDC) manages the National Satellite Land Remote Sensing Data Archive. This archive includes Landsat thematic mapper (TM) and multispectral scanner (MSS) data acquired since 1972. The Landsat archive is an important resource for global change research. To ensure long-term availability of Landsat data from the archive, the EDC specified requirements for a Thematic Mapper and Multispectral Scanner Archive Conversion System (TMACS) that would preserve the data by transcribing it to a more durable medium. The media-conversion hardware and software were installed at EDC in July 1992. In December 1992, the EDC began converting Landsat MSS data from high-density, open-reel instrumentation tapes to digital cassette tapes. Almost 320,000 MSS images acquired since 1979 and more than 200,000 TM images acquired since 1982 will be converted to the new medium during the next 3 years. During the media conversion process, several high-density tapes have exhibited severe binder degradation. Even though these tapes have been stored in environmentally controlled conditions, hydrolysis has occurred, resulting in "sticky oxide shed". Using a thermostatically controlled oven built at EDC, tape "baking" has been 100 percent successful and actually improves the quality of some images.
Henri, C J; Cox, R D; Bret, P M
1997-08-01
This article details our experience in developing and operating an ultrasound mini-picture archiving and communication system (PACS). Using software developed in-house, low-end Macintosh computers (Apple Computer Co. Cupertino, CA) equipped with framegrabbers coordinate the entry of patient demographic information, image acquisition, and viewing on each ultrasound scanner. After each exam, the data are transmitted to a central archive server where they can be accessed from anywhere on the network. The archive server also provides web-based access to the data and manages pre-fetch and other requests for data that may no longer be on-line. Archival is fully automatic and is performed on recordable compact disk (CD) without compression. The system has been filmless now for over 18 months. In the meantime, one film processor has been eliminated and the position of one film clerk has been reallocated. Previously, nine ultrasound machines produced approximately 150 sheets of laser film per day (at 14 images per sheet). The same quantity of data are now archived without compression onto a single CD. Start-up costs were recovered within six months, and the project has been extended to include computed tomography (CT) and magnetic resonance imaging (MRI).
Schilling, R B
1993-05-01
Picture archiving and communication systems (PACS) provide image viewing at diagnostic, reporting, consultation, and remote workstations; archival on magnetic or optical media by means of short- or long-term storage devices; communications by means of local or wide area networks or public communication services; and integrated systems with modality interfaces and gateways to health care facilities and departmental information systems. Research indicates three basic needs for image and report management: (a) improved communication and turnaround time between radiologists and other imaging specialists and referring physicians, (b) fast reliable access to both current and previously obtained images and reports, and (c) space-efficient archival support. Although PACS considerations are much more complex than those associated with single modalities, the same basic purchase criteria apply. These criteria include technical leadership, image quality, throughput, life cost (eg, initial cost, maintenance, upgrades, and depreciation), and total service. Because a PACS takes much longer to implement than a single modality, the customer and manufacturer must develop a closer working relationship than has been necessary in the past.
Aircraft scanner data availability via the version 0 Information Management System
NASA Technical Reports Server (NTRS)
Mah, G. R.
1995-01-01
As part of the Earth Observing System Data and Information System (EOSDIS) development, NASA and other government agencies have developed an operational prototype of the Information Management System (IMS). The IMS provides access to the data archived at the Distributed Active Archive Centers (DAACs) by allowing users to search through metadata describing the (image) data. Criteria based on sensor name or type, date and time, and geographic location are used to search the archive. Graphical representations of coverage and browse images are available to further refine a user's selection. Previously, the EROS Data Center (EDC) DAAC had identified the Advanced Solid-state Array Spectrometer (ASAS), Airborne Visible and Infrared Imaging Spectrometer (AVIRIS), NS-001, and Thermal Infrared Multispectral Scanner (TIMS) as precursor data sets similar to those the DAAC will handle in the Earth Observing System era. Currently, the EDC DAAC staff, in cooperation with NASA, has transcribed TIMS, NS-001, and Thematic Mapper Simulator (TMS) data from Ames Research Center and also TIMS data from Stennis Space Center. During the transcription process, IMS metadata and browse images were created to populate the inventory at the EDC DAAC. These data sets are now available in the IMS and may be requested from any of the DAACs via the IMS.
Commercial applications for optical data storage
NASA Astrophysics Data System (ADS)
Tas, Jeroen
1991-03-01
Optical data storage has spurred the market for document imaging systems. These systems are increasingly being used to electronically manage the processing, storage and retrieval of documents. Applications range from straightforward archives to sophisticated workflow management systems. The technology is developing rapidly and within a few years optical imaging facilities will be incorporated in most of the office information systems. This paper gives an overview of the status of the market, the applications and the trends of optical imaging systems.
Visual information mining in remote sensing image archives
NASA Astrophysics Data System (ADS)
Pelizzari, Andrea; Descargues, Vincent; Datcu, Mihai P.
2002-01-01
The present article focuses on the development of interactive exploratory tools for visually mining the image content of large remote sensing archives. Two aspects are treated: the iconic visualization of the global information in the archive and the progressive visualization of the image details. The proposed methods are integrated in the Image Information Mining (I2M) system. The images and image structures in the I2M system are indexed based on a probabilistic approach. The resulting links are managed by a relational database. Both the intrinsic complexity of the observed images and the diversity of user requests result in a great number of associations in the database. Thus, new tools have been designed to visualize, in iconic representation, the relationships created during a query or information mining operation: visualization of the query results positioned on the geographical map, a quick-looks gallery, visualization of the measure of goodness of the query, and visualization of the image space for statistical evaluation purposes. Additionally, the I2M system is enhanced with progressive detail visualization in order to allow better access for operator inspection. I2M is a three-tier Java architecture and is optimized for the Internet.
Data Mining and Knowledge Discovery tools for exploiting big Earth-Observation data
NASA Astrophysics Data System (ADS)
Espinoza Molina, D.; Datcu, M.
2015-04-01
The continuous increase in the size of the archives and in the variety and complexity of Earth-Observation (EO) sensors requires new methodologies and tools that allow the end-user to access a large image repository, to extract and infer knowledge about the patterns hidden in the images, to dynamically retrieve a collection of relevant images, and to support the creation of emerging applications (e.g., change detection, global monitoring, disaster and risk management, image time series, etc.). In this context, we are concerned with providing a platform for data mining and knowledge discovery from EO archives. The platform's goal is to implement a communication channel between Payload Ground Segments and the end-user, who receives the content of the data coded in an understandable format associated with semantics that is ready for immediate exploitation. It will provide the user with automated tools to explore and understand the content of highly complex image archives. The challenge lies in the extraction of meaningful information and the understanding of observations of large extended areas, over long periods of time, with a broad variety of EO imaging sensors in synergy with other related measurements and data. The platform is composed of several components: 1) ingestion of EO images and related data, providing basic features for image analysis; 2) a query engine based on metadata, semantics, and image content; 3) data mining and knowledge discovery tools for supporting the interpretation and understanding of image content; and 4) semantic definition of the image content via machine learning methods. All these components are integrated and supported by a relational database management system, ensuring the integrity and consistency of terabytes of Earth Observation data.
Benefits of cloud computing for PACS and archiving.
Koch, Patrick
2012-01-01
The goal of cloud-based services is to provide easy, scalable access to computing resources and IT services. The healthcare industry requires a private cloud that adheres to government mandates designed to ensure privacy and security of patient data while enabling access by authorized users. Cloud-based computing in the imaging market has evolved from a service that provided cost effective disaster recovery for archived data to fully featured PACS and vendor neutral archiving services that can address the needs of healthcare providers of all sizes. Healthcare providers worldwide are now using the cloud to distribute images to remote radiologists while supporting advanced reading tools, deliver radiology reports and imaging studies to referring physicians, and provide redundant data storage. Vendor managed cloud services eliminate large capital investments in equipment and maintenance, as well as staffing for the data center--creating a reduction in total cost of ownership for the healthcare provider.
NASA Astrophysics Data System (ADS)
Osada, Masakazu; Tsukui, Hideki
2002-09-01
Picture Archiving and Communication System (PACS) is a system that connects imaging modalities, image archives, and image workstations to reduce film-handling costs and improve hospital workflow. Handling diagnostic ultrasound and endoscopy images is challenging because they produce large amounts of data, such as motion (cine) images at 30 frames per second, 640 x 480 in resolution, with 24-bit color, while still requiring sufficient image quality for clinical review. We have developed a PACS able to manage ultrasound and endoscopy cine images at the above resolution and frame rate, and have investigated suitable compression methods and compression ratios for clinical image review. Results show that clinicians require frame-by-frame forward and backward review of cine images, because they carefully look through motion images to find certain color patterns that may appear in a single frame. To satisfy this requirement, we chose Motion JPEG, installed it, and confirmed that this specific pattern could be captured. As for acceptable image compression ratios, we performed a subjective evaluation. No subjects could tell the difference between original non-compressed images and 1:10 lossy compressed JPEG images. One subject could tell the difference between original and 1:20 lossy compressed JPEG images, although the latter were still judged acceptable. Thus, ratios of 1:10 to 1:20 are acceptable to reduce data volume and cost while maintaining quality for clinical review.
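For readers who want to reproduce the flavor of the compression-ratio measurements described above, the short sketch below saves a synthetic 640 x 480, 24-bit frame as JPEG at a few quality settings and reports the achieved ratio against the uncompressed size. The frame content and quality values are illustrative assumptions, not the study's actual test material.

```python
# Sketch of measuring JPEG compression ratios on a cine-sized frame. The
# synthetic gradient frame and quality settings are illustrative only.
import io
import numpy as np
from PIL import Image

grad = np.tile(np.linspace(0, 255, 640, dtype=np.uint8), (480, 1))
frame = Image.fromarray(np.dstack([grad, grad, grad]), mode="RGB")
raw_bytes = 480 * 640 * 3                      # uncompressed 24-bit size

for quality in (90, 75, 50):
    buf = io.BytesIO()
    frame.save(buf, format="JPEG", quality=quality)
    ratio = raw_bytes / buf.tell()
    print(f"quality={quality:3d}  compressed={buf.tell():7d} B  ratio 1:{ratio:.0f}")
```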
Enterprise-class Digital Imaging and Communications in Medicine (DICOM) image infrastructure.
York, G; Wortmann, J; Atanasiu, R
2001-06-01
Most current picture archiving and communication systems (PACS) are designed for a single department or a single modality. Few PACS installations have been deployed that support the needs of the hospital or the entire Integrated Delivery Network (IDN). The authors propose a new image management architecture that can support a large, distributed enterprise.
Wireless remote control clinical image workflow: utilizing a PDA for offsite distribution
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean
2004-04-01
Last year we presented at RSNA an application to perform wireless remote control of PACS image distribution utilizing a handheld device such as a Personal Digital Assistant (PDA). This paper describes the clinical experiences, including workflow scenarios, of implementing the PDA application to route exams from the clinical PACS archive server to various locations for offsite distribution of clinical PACS exams. By utilizing this remote control application, radiologists can manage image workflow distribution with a single wireless handheld device without impacting their clinical workflow on diagnostic PACS workstations. A PDA application was designed and developed to perform DICOM Query and C-Move requests by a physician from a clinical PACS archive to a CD-burning device for automatic burning of PACS data for offsite distribution. In addition, it was also used for convenient routing of historical PACS exams to the local web server, local workstations, and teleradiology systems. The application was evaluated by radiologists as well as other clinical staff who need to distribute PACS exams to offsite referring physicians' offices and offsite radiologists. An application for image workflow management utilizing wireless technology was implemented in a clinical environment and evaluated. A PDA application was successfully utilized to perform DICOM Query and C-Move requests from the clinical PACS archive to various offsite exam distribution devices. Clinical staff can utilize the PDA to manage image workflow and PACS exam distribution conveniently for offsite consultations by referring physicians and radiologists. This solution allows radiologists to expand their effectiveness in health care delivery both within the radiology department and offsite by improving their clinical workflow.
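The DICOM Query (C-FIND) and C-Move operations at the core of this application can be sketched with the pynetdicom library, as below. The archive host name, port, AE titles, and patient identifier are placeholders rather than the clinical site's actual configuration.

```python
# Minimal sketch of the DICOM Query (C-FIND) and C-MOVE operations described
# above, using pynetdicom. Host, port, AE titles, and IDs are placeholders.
from pydicom.dataset import Dataset
from pynetdicom import AE
from pynetdicom.sop_class import (
    StudyRootQueryRetrieveInformationModelFind,
    StudyRootQueryRetrieveInformationModelMove,
)

ae = AE(ae_title="PDA_SCU")
ae.add_requested_context(StudyRootQueryRetrieveInformationModelFind)
ae.add_requested_context(StudyRootQueryRetrieveInformationModelMove)

query = Dataset()
query.QueryRetrieveLevel = "STUDY"
query.PatientID = "12345"            # placeholder patient
query.StudyInstanceUID = ""          # ask the archive to return the UIDs

assoc = ae.associate("pacs-archive.example.org", 104, ae_title="ARCHIVE")
if assoc.is_established:
    # C-FIND: list matching studies.
    for status, identifier in assoc.send_c_find(
            query, StudyRootQueryRetrieveInformationModelFind):
        if status and status.Status in (0xFF00, 0xFF01) and identifier:
            print("match:", identifier.StudyInstanceUID)
    # C-MOVE: push the matching study to another DICOM node,
    # e.g. a CD-burning station registered under a placeholder AE title.
    for status, _ in assoc.send_c_move(
            query, "CDBURNER", StudyRootQueryRetrieveInformationModelMove):
        pass
    assoc.release()
```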
Commercial imagery archive product development
NASA Astrophysics Data System (ADS)
Sakkas, Alysa
1999-12-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience in digital imagery management and analysis for the US Government at numerous worldwide sites. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate, in 'push' or 'pull' modes, imagery, data and data products, and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, Intelligent Library System, satisfies requirements for (a) a potentially unbounded data archive; (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) ability to ingest, process and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds from archived images or in real time; and (g) scalability that maintains information throughput performance as the size of the digital library grows.
Managing an archive of weather satellite images
NASA Technical Reports Server (NTRS)
Seaman, R. L.
1992-01-01
The author's experiences in building and maintaining an archive of hourly weather satellite pictures at NOAO are described. This archive has proven very popular with visiting and staff astronomers - especially on windy days and cloudy nights. Given access to a source of such pictures, a suite of simple shell and IRAF CL scripts can provide a great deal of robust functionality with little effort. These pictures and associated data products, such as surface analysis (radar) maps and National Weather Service forecasts, are updated hourly at anonymous ftp sites on the Internet, although your local Atmospheric Sciences Department may prove to be a more reliable source. The raw image formats are unfamiliar to most astronomers, but reading them into IRAF is straightforward. Techniques for performing this format conversion at the host computer level are described, which may prove useful for other chores. Pointers are given to sources of data and of software, including a package of example tools. These tools include shell and Perl scripts for downloading pictures, maps, and forecasts, as well as IRAF scripts and host-level programs for translating the images into IRAF and GIF formats and for slicing & dicing the resulting images. Hints for displaying the images and for making hardcopies are given.
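A modern Python counterpart to the download scripts described above might look like the following sketch, which fetches the latest picture once an hour and keeps timestamped copies. The source URL is a hypothetical placeholder, not an actual NOAO or National Weather Service address.

```python
# Tiny Python counterpart to the hourly download scripts mentioned above.
# The source URL is a hypothetical placeholder.
import time
import urllib.request
from datetime import datetime, timezone

SOURCE = "https://weather.example.edu/latest_ir.gif"   # placeholder

def fetch_once(dest_dir="."):
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M")
    dest = f"{dest_dir}/ir_{stamp}.gif"
    urllib.request.urlretrieve(SOURCE, dest)
    return dest

if __name__ == "__main__":
    while True:
        try:
            print("saved", fetch_once())
        except OSError as err:
            print("fetch failed:", err)
        time.sleep(3600)                                # once an hour
```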
Informatics in radiology: use of CouchDB for document-based storage of DICOM objects.
Rascovsky, Simón J; Delgado, Jorge A; Sanz, Alexander; Calvo, Víctor D; Castrillón, Gabriel
2012-01-01
Picture archiving and communication systems traditionally have depended on schema-based Structured Query Language (SQL) databases for imaging data management. To optimize database size and performance, many such systems store a reduced set of Digital Imaging and Communications in Medicine (DICOM) metadata, discarding informational content that might be needed in the future. As an alternative to traditional database systems, document-based key-value stores recently have gained popularity. These systems store documents containing key-value pairs that facilitate data searches without predefined schemas. Document-based key-value stores are especially suited to archive DICOM objects because DICOM metadata are highly heterogeneous collections of tag-value pairs conveying specific information about imaging modalities, acquisition protocols, and vendor-supported postprocessing options. The authors used an open-source document-based database management system (Apache CouchDB) to create and test two such databases; CouchDB was selected for its overall ease of use, capability for managing attachments, and reliance on HTTP and Representational State Transfer standards for accessing and retrieving data. A large database was created first in which the DICOM metadata from 5880 anonymized magnetic resonance imaging studies (1,949,753 images) were loaded by using a Ruby script. To provide the usual DICOM query functionality, several predefined "views" (standard queries) were created by using JavaScript. For performance comparison, the same queries were executed in both the CouchDB database and a SQL-based DICOM archive. The capabilities of CouchDB for attachment management and database replication were separately assessed in tests of a similar, smaller database. Results showed that CouchDB allowed efficient storage and interrogation of all DICOM objects; with the use of information retrieval algorithms such as map-reduce, all the DICOM metadata stored in the large database were searchable with only a minimal increase in retrieval time over that with the traditional database management system. Results also indicated possible uses for document-based databases in data mining applications such as dose monitoring, quality assurance, and protocol optimization. RSNA, 2012
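The document-based approach described above is easy to sketch against CouchDB's standard HTTP API: each DICOM object becomes a JSON document of tag/value pairs, and a JavaScript map function in a design document provides the query "view". In the sketch below, the server URL, credentials, and choice of tags are illustrative assumptions, not the authors' actual setup.

```python
# Sketch of the CouchDB approach described above: DICOM metadata stored as
# JSON documents plus a JavaScript "view" for querying by modality.
# Server URL, credentials, and tag values are illustrative placeholders.
import requests

COUCH = "http://admin:secret@localhost:5984"
DB = f"{COUCH}/dicom_metadata"

requests.put(DB)                                     # create the database

# One document per DICOM object, holding whatever tag/value pairs it carries.
doc = {
    "SOPInstanceUID": "1.2.840.113619.2.1.1",        # placeholder UIDs/values
    "StudyInstanceUID": "1.2.840.113619.2.1",
    "Modality": "MR",
    "SeriesDescription": "T1 AXIAL",
}
requests.post(DB, json=doc)

# A design document with a map function, the CouchDB equivalent of an index.
design = {
    "views": {
        "by_modality": {
            "map": "function(doc){ if (doc.Modality) "
                   "emit(doc.Modality, doc.StudyInstanceUID); }"
        }
    }
}
requests.put(f"{DB}/_design/queries", json=design)

# Query the view for all MR studies (view keys are JSON-encoded).
hits = requests.get(f"{DB}/_design/queries/_view/by_modality",
                    params={"key": '"MR"'}).json()
print(hits.get("rows", []))
```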
Khushi, Matloob; Edwards, Georgina; de Marcos, Diego Alonso; Carpenter, Jane E; Graham, J Dinny; Clarke, Christine L
2013-02-12
Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. We have developed two Java based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient's clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934.
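The tile-splitting idea behind NDPI-Splitter can be illustrated with a few lines of Python, shown below. This is not the Java tool itself (and it reads an ordinary large image rather than an NDPI file); the file name, tile size, and background threshold for skipping empty tiles are assumptions.

```python
# Illustration of the tile-splitting idea behind NDPI-Splitter using Pillow.
# File name, tile size, and the "empty tile" threshold are assumptions.
import numpy as np
from PIL import Image

Image.MAX_IMAGE_PIXELS = None          # allow very large slide scans

def split_into_tiles(path, tile=2048, min_tissue_fraction=0.05):
    slide = Image.open(path).convert("RGB")
    width, height = slide.size
    kept = 0
    for top in range(0, height, tile):
        for left in range(0, width, tile):
            box = (left, top, min(left + tile, width), min(top + tile, height))
            region = slide.crop(box)
            pixels = np.asarray(region)
            # Treat near-white pixels as background; skip mostly empty tiles.
            tissue = (pixels.mean(axis=2) < 230).mean()
            if tissue >= min_tissue_fraction:
                region.save(f"tile_{top}_{left}.tiff", format="TIFF")
                kept += 1
    return kept

# Example call (the path is hypothetical):
# print(split_into_tiles("slide_section.tiff"), "tiles kept")
```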
Medical image informatics infrastructure design and applications.
Huang, H K; Wong, S T; Pietka, E
1997-01-01
Picture archiving and communication systems (PACS) is a system integration of multimodality images and health information systems designed for improving the operation of a radiology department. As it evolves, PACS becomes a hospital image document management system with a voluminous image and related data file repository. A medical image informatics infrastructure can be designed to take advantage of existing data, providing PACS with add-on value for health care service, research, and education. A medical image informatics infrastructure (MIII) consists of the following components: medical images and associated data (including PACS database), image processing, data/knowledge base management, visualization, graphic user interface, communication networking, and application oriented software. This paper describes these components and their logical connection, and illustrates some applications based on the concept of the MIII.
Documet, Jorge; Liu, Brent J; Documet, Luis; Huang, H K
2006-07-01
This paper describes a picture archiving and communication system (PACS) tool based on Web technology that remotely manages medical images between a PACS archive and remote destinations. Successfully implemented in a clinical environment and also demonstrated for the past 3 years at the conferences of various organizations, including the Radiological Society of North America, this tool provides a very practical and simple way to manage a PACS, including off-site image distribution and disaster recovery. The application is robust and flexible and can be used on a standard PC workstation or a Tablet PC, but more important, it can be used with a personal digital assistant (PDA). With a PDA, the Web application becomes a powerful wireless and mobile image management tool. The application's quick and easy-to-use features allow users to perform Digital Imaging and Communications in Medicine (DICOM) queries and retrievals with a single interface, without having to worry about the underlying configuration of DICOM nodes. In addition, this frees up dedicated PACS workstations to perform their specialized roles within the PACS workflow. This tool has been used at Saint John's Health Center in Santa Monica, California, for 2 years. The average number of queries per month is 2,021, with 816 C-MOVE retrieve requests. Clinical staff members can use PDAs to manage image workflow and PACS examination distribution conveniently for off-site consultations by referring physicians and radiologists and for disaster recovery. This solution also improves radiologists' effectiveness and efficiency in health care delivery both within radiology departments and for off-site clinical coverage.
Towards building high performance medical image management system for clinical trials
NASA Astrophysics Data System (ADS)
Wang, Fusheng; Lee, Rubao; Zhang, Xiaodong; Saltz, Joel
2011-03-01
Medical image based biomarkers are being established for therapeutic cancer clinical trials, where image assessment is among the essential tasks. Large-scale image assessment is often performed by a large group of experts who retrieve images from a centralized image repository to workstations in order to mark up and annotate the images. In such an environment, it is critical to provide a high performance image management system that supports efficient concurrent image retrievals in a distributed environment. There are several major challenges: high throughput of large-scale image data over the Internet from the server for multiple concurrent client users, efficient communication protocols for transporting data, and effective management of data versioning for audit trails. We study the major bottlenecks for such a system, and propose and evaluate a solution that uses a hybrid image storage with solid state drives and hard disk drives, RESTful Web Services based protocols for exchanging image data, and a database-based versioning scheme for efficient archiving of image revision history. Our experiments show promising results for our methods, and our work provides a guideline for building enterprise-level high performance medical image management systems.
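Of the three ingredients listed above, the database-based versioning scheme is the simplest to sketch: each markup or annotation update is stored as a new revision row rather than overwriting the previous one, which preserves the audit trail. The sketch below uses SQLite and an assumed table layout; it is an illustration of the idea, not the authors' schema.

```python
# Sketch of a database-backed image versioning scheme for audit trails.
# Table layout and column names are assumptions, not the authors' schema.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE image_revisions (
        image_id   TEXT NOT NULL,
        revision   INTEGER NOT NULL,
        created_at REAL NOT NULL,
        payload    BLOB NOT NULL,
        PRIMARY KEY (image_id, revision)
    )""")

def save_revision(image_id, payload):
    # Every update becomes a new revision; nothing is overwritten.
    cur = conn.execute(
        "SELECT COALESCE(MAX(revision), 0) + 1 FROM image_revisions WHERE image_id = ?",
        (image_id,))
    next_rev = cur.fetchone()[0]
    conn.execute("INSERT INTO image_revisions VALUES (?, ?, ?, ?)",
                 (image_id, next_rev, time.time(), payload))
    return next_rev

def latest(image_id):
    return conn.execute(
        "SELECT revision, payload FROM image_revisions "
        "WHERE image_id = ? ORDER BY revision DESC LIMIT 1", (image_id,)).fetchone()

save_revision("1.2.840.1.1", b"original markup")
save_revision("1.2.840.1.1", b"corrected markup")
print(latest("1.2.840.1.1"))   # -> (2, b'corrected markup')
```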
Commercial imagery archive, management, exploitation, and distribution project development
NASA Astrophysics Data System (ADS)
Hollinger, Bruce; Sakkas, Alysa
1999-10-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate, in 'push' or 'pull' modes, imagery, data and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, Intelligent Library System (ILS)™, satisfies requirements for (1) a potentially unbounded data archive (5000 TB range); (2) automated workflow management for increased user productivity; (3) automatic tracking and management of files stored on shelves; (4) ability to ingest, process and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (5) access through a thin client-to-server network environment; (6) multiple interactive users needing retrieval of files in seconds from archived images or in real time; and (7) scalability that maintains information throughput performance as the size of the digital library grows.
Commercial imagery archive, management, exploitation, and distribution product development
NASA Astrophysics Data System (ADS)
Hollinger, Bruce; Sakkas, Alysa
1999-12-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate, in 'push' or 'pull' modes, imagery, data and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, Intelligent Library System (ILS)™, satisfies requirements for (a) a potentially unbounded data archive (5000 TB range); (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) ability to ingest, process and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds from archived images or in real time; and (g) scalability that maintains information throughput performance as the size of the digital library grows.
Outsourced central archiving: an information bridge in a multi-IMAC environment
NASA Astrophysics Data System (ADS)
Gustavsson, Staffan; Tylen, Ulf; Carlsson, Goeran; Angelhed, Jan-Erik; Wintell, Mikael; Helmersson, Roger; Norrby, Clas
2001-08-01
In 1998, three hospitals merged to form the Sahlgrenska University Hospital. The total radiology production became 325,000 examinations per year. Two different PACS and RIS, with different and incompatible archiving solutions, had been in use since 1996. One PACS was of commercial origin and the other was developed in-house. Together they managed one third of the total production. Due to differences in standards compliance and system architecture, the communication was unsatisfactory. In order to improve efficiency, communication, and the service level to our customers, the situation was evaluated. It was decided to build a transparent virtual radiology department based on a modular approach. A common RIS and a central DICOM image archive were chosen as the central nodes in a star-configured system. Web technology was chosen as the solution for distribution of images and reports. The reasons for these decisions, as well as the present status of the installation, are described and discussed in this paper.
Huang, H K
2011-07-01
The concept of PACS (picture archiving and communication system) was initiated in 1982 during the SPIE medical imaging conference in Newport Beach, CA. Since then, PACS has matured to become an everyday clinical tool for image archiving, communication, display, and review. This paper follows the continuous development of PACS technology, including Web-based PACS, PACS and the ePR (electronic patient record), and enterprise PACS to ePR with image distribution (ID). The concept of large-scale Web-based enterprise PACS and ePR with image distribution is presented along with its implementation, clinical deployment, and operation. The Hong Kong Hospital Authority's (HKHA) integration of its home-grown clinical management system (CMS) with PACS and ePR with image distribution is used as a case study. The current concept and design criteria of the HKHA enterprise integration of the CMS, PACS, and ePR-ID for filmless healthcare delivery are discussed, followed by its work-in-progress and current status.
WFIRST Science Operations at STScI
NASA Astrophysics Data System (ADS)
Gilbert, Karoline; STScI WFIRST Team
2018-06-01
With sensitivity and resolution comparable to those of the Hubble Space Telescope, and a field of view 100 times larger, the Wide Field Instrument (WFI) on WFIRST will be a powerful survey instrument. STScI will be the Science Operations Center (SOC) for the WFIRST Mission, with additional science support provided by the Infrared Processing and Analysis Center (IPAC) and foreign partners. STScI will schedule and archive all WFIRST observations, calibrate and produce pipeline-reduced data products for imaging with the Wide Field Instrument, support the High Latitude Imaging and Supernova Survey Teams, and support the astronomical community in planning WFI imaging observations and analyzing the data. STScI has developed detailed concepts for WFIRST operations, including a data management system integrating data processing and the archive, which will include a novel, cloud-based framework for high-level data processing, providing a common environment accessible to all users (STScI operations, Survey Teams, General Observers, and archival investigators). To aid the astronomical community in examining the capabilities of WFIRST, STScI has built several simulation tools. We describe the functionality of each tool and give examples of its use.
Knowledge-based image data management - An expert front-end for the BROWSE facility
NASA Technical Reports Server (NTRS)
Stoms, David M.; Star, Jeffrey L.; Estes, John E.
1988-01-01
An intelligent user interface being added to the NASA-sponsored BROWSE testbed facility is described. BROWSE is a prototype system designed to explore issues involved in locating image data in distributed archives and displaying low-resolution versions of that imagery at a local terminal. For prototyping, the initial application is the remote sensing of forest and range land.
Browsing the PDS Image Archive with the Imaging Atlas and Apache Solr
NASA Astrophysics Data System (ADS)
Grimes, K. M.; Padams, J. H.; Stanboli, A.; Wagstaff, K. L.
2018-04-01
The PDS Image Archive is home to tens of millions of images, nearly 30 million of which are associated with rich metadata. By leveraging the Solr indexing technology and the Imaging Atlas interactive frontend, we enable intuitive archive browsing.
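A metadata search against such a Solr index is a plain HTTP request to the select handler, as in the sketch below. The endpoint URL and field names are hypothetical placeholders rather than the actual PDS Imaging Atlas schema.

```python
# Sketch of querying a Solr index of image metadata, as the Imaging Atlas
# frontend does. Endpoint URL and field names are hypothetical placeholders.
import requests

SOLR_SELECT = "http://localhost:8983/solr/pds_images/select"   # placeholder

params = {
    "q": "target:MARS AND instrument:MASTCAM",   # hypothetical field names
    "fl": "product_id,start_time",
    "facet": "true",
    "facet.field": "mission",
    "rows": 10,
    "wt": "json",
}
response = requests.get(SOLR_SELECT, params=params).json()
for doc in response.get("response", {}).get("docs", []):
    print(doc)
print(response.get("facet_counts", {}).get("facet_fields", {}))
```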
[A computer-aided image diagnosis and study system].
Li, Zhangyong; Xie, Zhengxiang
2004-08-01
The revolution in information processing, particularly the digitizing of medicine, has changed medical study, work, and management. This paper reports a method for designing a system for computer-aided image diagnosis and study. Combining ideas from graph-text systems and picture archiving and communication systems (PACS), the system was implemented and used for "prescription through computer", "managing images", and "reading images under computer and helping the diagnosis". Typical examples were also stored in a database and used to teach beginners. The system was developed with visual development tools based on object-oriented programming (OOP) and was put into operation on the Windows 9X platform. The system has a friendly man-machine interface.
Hardwicke, Joseph; Kohlhardt, Angus; Moiemen, Naiem
2015-06-01
The Medical Research Council Burns and Industrial Injuries Unit at the Birmingham Accident Hospital pioneered civilian burn care and research in the United Kingdom during the post-war years. A photographic archive has been discovered that documents this period from 1945 to 1975. The aim of this project was to sort, digitize and archive the images in a secure format for future reference. The photographs detail the management of burns patients, from injury causation and surgical intervention, to nursing care, rehabilitation and long-term follow-up. A total of 2650 image files were collected from over 600 patients. Many novel surgical, nursing, dressing and rehabilitation strategies are documented and discussed. We have chosen to report part of the archive under the sections of (1) aseptic and antimicrobial burn care; (2) burn excision and wound closure; (3) rehabilitation, reconstruction and long-term outcomes; (4) accident prevention; and (5) response to a major burns incident. The Birmingham collection gives us a valuable insight into the approach to civilian burn care in the post-war years, and we present a case from the archive followed to the modern day, the longest clinical photographic follow-up to date. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
National Satellite Land Remote Sensing Data Archive
Faundeen, John L.; Longhenry, Ryan
2018-06-13
The National Satellite Land Remote Sensing Data Archive is managed on behalf of the Secretary of the Interior by the U.S. Geological Survey’s Earth Resources Observation and Science Center. The Land Remote Sensing Policy Act of 1992 (51 U.S.C. §601) directed the U.S. Department of the Interior to establish a permanent global archive consisting of imagery over land areas obtained from satellites orbiting the Earth. The law also directed the U.S. Department of the Interior, delegated to the U.S. Geological Survey, to ensure proper storage and preservation of imagery, and timely access for all parties. Since 2008, these images have been available at no cost to the user.
Goldszal, A F; Brown, G K; McDonald, H J; Vucich, J J; Staab, E V
2001-06-01
In this work, we describe the digital imaging network (DIN), picture archival and communication system (PACS), and radiology information system (RIS) currently being implemented at the Clinical Center, National Institutes of Health (NIH). These systems are presently in clinical operation. The DIN is a redundant meshed network designed to address gigabit density and expected high bandwidth requirements for image transfer and server aggregation. The PACS projected workload is 5.0 TB of new imaging data per year. Its architecture consists of a central, high-throughput Digital Imaging and Communications in Medicine (DICOM) data repository and distributed redundant array of inexpensive disks (RAID) servers employing fiber-channel technology for immediate delivery of imaging data. On demand distribution of images and reports to clinicians and researchers is accomplished via a clustered web server. The RIS follows a client-server model and provides tools to order exams, schedule resources, retrieve and review results, and generate management reports. The RIS-hospital information system (HIS) interfaces include admissions, discharges, and transfers (ATDs)/demographics, orders, appointment notifications, doctors update, and results.
2013-01-01
Background Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. Results We have developed two Java based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient’s clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Conclusions Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. Virtual Slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934 PMID:23402499
Digital Archival Image Collections: Who Are the Users?
ERIC Educational Resources Information Center
Herold, Irene M. H.
2010-01-01
Archival digital image collections are a relatively new phenomenon in college library archives. Digitizing archival image collections may make them accessible to users worldwide. There has been no study to explore whether collections on the Internet lead to users who are beyond the institution or a comparison of users to a national or…
Production of Previews and Advanced Data Products for the ESO Science Archive
NASA Astrophysics Data System (ADS)
Rité, C.; Slijkhuis, R.; Rosati, P.; Delmotte, N.; Rino, B.; Chéreau, F.; Malapert, J.-C.
2008-08-01
We present a project being carried out by the Virtual Observatory Systems Department/Advanced Data Products group in order to populate the ESO Science Archive Facility with image previews and advanced data products. The main goal is to provide users of the ESO Science Archive Facility with the possibility of viewing pre-processed images associated with instruments like WFI, ISAAC and SOFI before actually retrieving the data for full processing. The image processing is done by using the ESO/MVM image reduction software developed at ESO, to produce astrometrically calibrated FITS images, ranging from simple previews of single archive images, to fully stacked mosaics. These data products can be accessed via the ESO Science Archive Query Form and also be viewed with the browser VirGO {http://archive.eso.org/cms/virgo}.
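For illustration only, the sketch below shows how a quick-look preview might be rendered from a FITS image with a simple percentile stretch using astropy and matplotlib. It is not the ESO/MVM pipeline, and the file names and clipping percentiles are assumptions.

```python
# Illustrative sketch (not ESO/MVM): render a quick-look preview image
# from a FITS file using a simple percentile stretch.
import numpy as np
from astropy.io import fits
import matplotlib.pyplot as plt

def make_preview(fits_path, preview_path, lo=1.0, hi=99.0):
    data = fits.getdata(fits_path).astype(float)
    vmin, vmax = np.nanpercentile(data, [lo, hi])   # clip extreme pixels
    scaled = np.clip((data - vmin) / (vmax - vmin + 1e-12), 0.0, 1.0)
    plt.imsave(preview_path, scaled, cmap="gray", origin="lower")

# Example: make_preview("wfi_exposure.fits", "wfi_exposure_preview.jpg")
```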
Peer-to-peer architecture for multi-departmental distributed PACS
NASA Astrophysics Data System (ADS)
Rosset, Antoine; Heuberger, Joris; Pysher, Lance; Ratib, Osman
2006-03-01
We have elected to explore peer-to-peer technology as an alternative to centralized PACS architecture to meet the increasing requirements for wide access to images inside and outside a radiology department. The goal is to allow users across the enterprise to access any study at any time without the need for prefetching or routing of images from a central archive. Images can be accessed between different workstations and local storage nodes. We implemented "Bonjour," a remote file access technology developed by Apple that allows applications to share data and files remotely with optimized data access and transfer. Our open-source image display platform, OsiriX, was adapted to allow sharing of local DICOM images by making each local SQL database directly accessible from any other OsiriX workstation over the network. A server version of the OsiriX Core Data database also allows distributed archive servers to be accessed in the same way. The implemented infrastructure allows fast and efficient access to any image, anywhere, anytime, independently of the actual physical location of the data. It also benefits from the performance of distributed low-cost, high-capacity storage servers that can provide efficient caching of PACS data, which was found to be 10 to 20 times faster than accessing the same data from the central PACS archive. It is particularly suitable for large hospitals and academic environments where clinical conferences, interdisciplinary discussions, and successive sessions of image processing are often part of complex workflows for patient management and decision making.
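The sketch below illustrates the zero-configuration (Bonjour/mDNS) advertisement idea in Python with the python-zeroconf package rather than Apple's framework; the service type, host address, port, and TXT properties are invented for the example and are not OsiriX's actual settings.

```python
# Illustrative sketch: advertise a local image-sharing node with Bonjour-style
# (mDNS/DNS-SD) discovery using the python-zeroconf package. The service type
# and TXT properties are made up for illustration.
import socket
from zeroconf import Zeroconf, ServiceInfo

zc = Zeroconf()
info = ServiceInfo(
    type_="_imgshare._tcp.local.",
    name="Workstation-42._imgshare._tcp.local.",
    addresses=[socket.inet_aton("192.168.1.42")],
    port=11112,
    properties={"db": "dicom-sqlite"},   # hint about the shared local database
)
zc.register_service(info)        # peers browsing _imgshare._tcp can now find us
# ... serve image requests here ...
zc.unregister_service(info)
zc.close()
```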
Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.
2009-01-01
In June of 1990 and July of 1991, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Mississippi-Alabama-Florida shelf in the northern Gulf of Mexico, from Mississippi Sound to the Florida Panhandle. Work was done onboard the Mississippi Mineral Resources Institute R/V Kit Jones as part of a project to study coastal erosion and offshore sand resources. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.
ERIC Educational Resources Information Center
Pettersson, Rune
Different kinds of pictorial databases are described with respect to aims, user groups, search possibilities, storage, and distribution. Some specific examples are given for databases used for the following purposes: (1) labor markets for artists; (2) document management; (3) telling a story; (4) preservation (archives and museums); (5) research;…
Image dissemination and archiving.
Robertson, Ian
2007-08-01
Images generated as part of the sonographic examination are an integral part of the medical record and must be retained according to local regulations. The standard medical image format, known as DICOM (Digital Imaging and COmmunications in Medicine), makes it possible for images from many different imaging modalities, including ultrasound, to be distributed via a standard internet network to distant viewing workstations and a central archive in an almost seamless fashion. The DICOM standard is a truly universal standard for the dissemination of medical images. When purchasing an ultrasound unit, the consumer should research the unit's capacity to generate images in a DICOM format, especially if one wishes interconnectivity with viewing workstations and an image archive that stores other medical images. PACS, an acronym for Picture Archiving and Communication System, refers to the infrastructure that links modalities, workstations, the image archive, and the medical record information system into an integrated system, allowing for efficient electronic distribution and storage of medical images and access to medical record data.
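As a minimal illustration of DICOM-based dissemination, the following sketch sends a single ultrasound image to an archive node with a C-STORE request using pydicom and pynetdicom; the host name, port, AE titles, and file name are placeholders for whatever a local archive actually uses.

```python
# Illustrative sketch: push one ultrasound image to a PACS archive node with a
# DICOM C-STORE request using pydicom/pynetdicom. Host, port and AE titles are
# placeholders, not a real deployment's values.
from pydicom import dcmread
from pynetdicom import AE
from pynetdicom.sop_class import UltrasoundImageStorage

ds = dcmread("exam_0001.dcm")                 # image exported by the scanner
ae = AE(ae_title="US_SCANNER")
ae.add_requested_context(UltrasoundImageStorage)

assoc = ae.associate("pacs.hospital.example", 11112, ae_title="ARCHIVE")
if assoc.is_established:
    status = assoc.send_c_store(ds)           # status 0x0000 means accepted
    print(f"C-STORE status: 0x{status.Status:04X}")
    assoc.release()
else:
    print("Could not associate with the archive")
```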
Medical workstation design: enhancing graphical interface with 3D anatomical atlas
NASA Astrophysics Data System (ADS)
Hoo, Kent S., Jr.; Wong, Stephen T. C.; Grant, Ellen
1997-05-01
The huge data archive of the UCSF Hospital Integrated Picture Archiving and Communication System gives healthcare providers access to diverse kinds of images and text for diagnosis and patient management. Given the mass of accessible information, however, the conventional graphical user interface (GUI) approach overwhelms the user with forms, menus, fields, lists, and other widgets, causing 'information overload.' This article describes a new approach that complements the conventional GUI with 3D anatomical atlases and demonstrates its usefulness with a clinical neuroimaging application.
Programmed database system at the Chang Gung Craniofacial Center: part II--digitizing photographs.
Chuang, Shiow-Shuh; Hung, Kai-Fong; de Villa, Glenda H; Chen, Philip K T; Lo, Lun-Jou; Chang, Sophia C N; Yu, Chung-Chih; Chen, Yu-Ray
2003-07-01
The archival tools used for digital images in advertising do not fulfill clinical requirements and are just beginning to be developed. Storing a large number of conventional photographic slides requires a great deal of space and special conditions. In spite of special precautions, degradation of the slides still occurs; the most common form is the appearance of fungus flecks. With recent advances in digital technology, it is now possible to store voluminous numbers of photographs on a computer hard drive and keep them for a long time. A self-programmed interface has been developed to integrate a database and an image browser system that can build and locate needed archive files in a matter of seconds with the click of a button. The hardware and software required by this system are commercially available. There are 25,200 patients recorded in the database, involving 24,331 procedures. The image files cover 6,384 patients with 88,366 digital picture files. From 1999 through 2002, NT$400,000 was saved using the new system. Photographs can be managed with the integrated database and browser software for archiving, which allows labeling of individual photographs with demographic information and browsing. Digitized images are not only more efficient and economical than conventional slide images, but they also facilitate clinical studies.
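A minimal sketch of the database-plus-browser linkage described above is given below; it is not the Chang Gung system, and the SQLite tables, columns, and example patient number are invented for illustration.

```python
# Illustrative sketch (not the Chang Gung system): a minimal SQLite schema that
# links patient demographics, procedures and digital photograph files so an
# image browser can locate a patient's pictures in one query.
import sqlite3

conn = sqlite3.connect("photo_archive.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS patient (
    patient_id INTEGER PRIMARY KEY,
    name       TEXT,
    birth_date TEXT
);
CREATE TABLE IF NOT EXISTS procedure_record (
    procedure_id INTEGER PRIMARY KEY,
    patient_id   INTEGER REFERENCES patient(patient_id),
    description  TEXT,
    performed_on TEXT
);
CREATE TABLE IF NOT EXISTS photo (
    photo_id     INTEGER PRIMARY KEY,
    procedure_id INTEGER REFERENCES procedure_record(procedure_id),
    file_path    TEXT,                -- location of the digitized image file
    view         TEXT                 -- e.g. frontal, lateral
);
""")

# One click in the browser becomes one indexed query:
rows = conn.execute("""
    SELECT ph.file_path
    FROM photo ph
    JOIN procedure_record pr ON pr.procedure_id = ph.procedure_id
    WHERE pr.patient_id = ?
    ORDER BY pr.performed_on
""", (12345,)).fetchall()
```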
First results of MAO NASU SS bodies photographic archive digitizing
NASA Astrophysics Data System (ADS)
Pakuliak, L.; Andruk, V.; Shatokhina, S.; Golovnya, V.; Yizhakevych, O.; Kulyk, I.
2013-05-01
The MAO NASU glass archive comprises about 1800 photographic plates with planets and their satellites (including nearly 80 images of Uranus, Pluto and Neptune), about 1700 plates with minor planets and about 900 plates with comets. Plates were made during 1949-1999 using 11 telescopes of different focus, mostly the Double Wide-angle Astrograph (F/D=2000/400) and the Double Long-focus Astrograph (F/D=5500/400) of MAO NASU. Observational sites are Kyiv, Lviv (Ukraine), Biurakan (Armenia), Abastumani (Georgia), Mt. Maidanak (Uzbekistan), and Quito (Ecuador). Tables contain data about the most significant numbers of plates sub-divided by years and objects. The database with metadata of plates (DBGPA) is available on the computer cluster of MAO (http://gua.db.ukr-vo.org) via open access. The database accumulates archives of four Ukrainian observatories involved in the UkrVO national project. Together with the archive managing system, the database serves as a test area for the JDA (Joint Digital Archive), the core of the UkrVO.
Community archiving of imaging studies
NASA Astrophysics Data System (ADS)
Fritz, Steven L.; Roys, Steven R.; Munjal, Sunita
1996-05-01
The quantity of image data created in a large radiology practice has long been a challenge for available archiving technology. Traditional methods of archiving the large quantity of films generated in radiology have relied on warehousing in remote sites, with courier delivery of film files for historical comparisons. A digital community archive, accessible via a wide area network, represents a feasible solution to the problem of archiving digital images from a busy practice. In addition, it affords a physician caring for a patient access to imaging studies performed at a variety of healthcare institutions without the need to repeat studies. Security problems include both network security issues in the WAN environment and access control for patient, physician and imaging center. The key obstacle to developing a community archive is currently political. Reluctance to participate in a community archive can be reduced by appropriate design of the access mechanisms.
Medical image archive node simulation and architecture
NASA Astrophysics Data System (ADS)
Chiang, Ted T.; Tang, Yau-Kuo
1996-05-01
It is a well known fact that managed care and new treatment technologies are revolutionizing the health care provider world. Community Health Information Network and Computer-based Patient Record projects are underway throughout the United States. More and more hospitals are installing digital, `filmless' radiology (and other imagery) systems. They generate a staggering amount of information around the clock. For example, a typical 500-bed hospital might accumulate more than 5 terabytes of image data in a period of 30 years for conventional x-ray images and digital images such as Magnetic Resonance Imaging and Computer Tomography images. With several hospitals contributing to the archive, the storage required will be in the hundreds of terabytes. Systems for reliable, secure, and inexpensive storage and retrieval of digital medical information do not exist today. In this paper, we present a Medical Image Archive and Distribution Service (MIADS) concept. MIADS is a system shared by individual and community hospitals, laboratories, and doctors' offices that need to store and retrieve medical images. Due to the large volume and complexity of the data, as well as the diversified user access requirement, implementation of the MIADS will be a complex procedure. One of the key challenges to implementing a MIADS is to select a cost-effective, scalable system architecture to meet the ingest/retrieval performance requirements. We have performed an in-depth system engineering study, and developed a sophisticated simulation model to address this key challenge. This paper describes the overall system architecture based on our system engineering study and simulation results. In particular, we will emphasize system scalability and upgradability issues. Furthermore, we will discuss our simulation results in detail. The simulations study the ingest/retrieval performance requirements based on different system configurations and architectures for variables such as workload, tape access time, number of drives, number of exams per patient, number of Central Processing Units, patient grouping, and priority impacts. The MIADS, which could be a key component of a broader data repository system, will be able to communicate with and obtain data from existing hospital information systems. We will discuss the external interfaces enabling MIADS to communicate with and obtain data from existing Radiology Information Systems such as the Picture Archiving and Communication System (PACS). Our system design encompasses the broader aspects of the archive node, which could include multimedia data such as image, audio, video, and free text data. This system is designed to be integrated with current hospital PACS through a Digital Imaging and Communications in Medicine interface. However, the system can also be accessed through the Internet using Hypertext Transport Protocol or Simple File Transport Protocol. Our design and simulation work will be key to implementing a successful, scalable medical image archive and distribution system.
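The back-of-the-envelope calculation below follows the abstract's example figures (roughly 5 TB per hospital over 30 years, several hospitals sharing one archive) to show how capacity and daily-ingest estimates of this kind are derived; all numbers are illustrative assumptions.

```python
# Back-of-the-envelope sizing in the spirit of the abstract's example numbers
# (a 500-bed hospital accumulating ~5 TB over 30 years, several hospitals
# sharing one archive). All figures are illustrative assumptions.
def shared_archive_estimate(hospitals, tb_per_hospital_30yr=5.0, years=30):
    per_year = tb_per_hospital_30yr / 30.0          # TB per hospital per year
    total_tb = hospitals * per_year * years         # archive capacity needed
    daily_gb = hospitals * per_year * 1000 / 365    # average daily ingest, GB
    return total_tb, daily_gb

capacity, ingest = shared_archive_estimate(hospitals=50)
print(f"capacity after 30 years: ~{capacity:.0f} TB, average ingest: ~{ingest:.1f} GB/day")
```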
2008-01-01
The USGS Landsat archive holds an unequaled 36-year record of the Earth's surface that is invaluable to climate change studies, forest and resource management activities, and emergency response operations. An aggressive effort is taking place to provide all Landsat imagery [scenes currently held in the USGS Earth Resources Observation and Science (EROS) Center archive, as well as newly acquired scenes daily] free of charge to users with electronic access via the Web by the end of December 2008. The entire Landsat 7 Enhanced Thematic Mapper Plus (ETM+) archive acquired since 1999 and any newly acquired Landsat 7 ETM+ images that have less than 40 percent cloud cover are currently available for download. When this endeavor is complete all Landsat 1-5 data will also be available for download. This includes Landsat 1-5 Multispectral Scanner (MSS) scenes, as well as Landsat 4 and 5 Thematic Mapper (TM) scenes.
Kalvelage, Thomas A.; Willems, Jennifer
2005-01-01
The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), which is a series of polar-orbiting and low inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information System (EOSDIS) was designed to acquire, archive, manage, and distribute Earth observation data to the broadest possible user community. The LP DAAC is one of four DAACs that utilize the EOSDIS Core System (ECS) to manage and archive their data. Since the ECS was originally designed, significant changes have taken place in technology, user expectations, and user requirements. Therefore the LP DAAC has implemented additional systems to meet the evolving needs of scientific users, tailored to an integrated working environment. These systems provide a wide variety of services to improve data access and to enhance data usability through subsampling, reformatting, and reprojection. These systems also support the wide breadth of products that are handled by the LP DAAC. The LP DAAC is the primary archive for the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data; it is the only facility in the United States that archives, processes, and distributes data from the Advanced Spaceborne Thermal Emission/Reflection Radiometer (ASTER) on NASA's Terra spacecraft; and it is responsible for the archive and distribution of “land products” generated from data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra and Aqua satellites.
Present Practice And Perceived Needs-Managing Diagnostic Images
NASA Astrophysics Data System (ADS)
Vanden Brink, John A.
1982-01-01
With the advent of digital radiography, an installed base of CT, nuclear medicine, and ultrasound scanners numbering in the thousands, and the potential of NMR, the market for the electronic management of digital images is perhaps one of the most exciting, fastest growing (and most ill-defined) fields in medicine today. New technologies in optical data storage, electronic transmission, image reproduction, microprocessing, automation, and software development promise a whole new generation of products that will simplify and enhance the diagnostic process (thereby, it is hoped, improving diagnostic accuracy), enable practical implementation of archival review, expand the availability of diagnostic data, and lower the cost per case by at least an order of magnitude.
Cost-effectiveness prospects of picture archiving and communication systems.
Hindel, R; Preger, W
1988-01-01
PAC (picture archiving and communication) systems are widely discussed and promoted as the organizational solution to digital image management in a radiology department. For approximately two decades digital imaging has increasingly been used for such diagnostic modalities as CT, DSA, MRI, DR (Digital Radiography) and others. PACS are seen as a step toward high technology integration and more efficient management. Although the acquisition of such technology is investment intensive, there are well-founded projections that prolonged operation will prove cost justified. Such justification can only partly be derived from cost reduction through PAC with respect to present department management--the major justification is preparation for future economic pressures which could make survival of a department without modern technology difficult. Especially in the United States the political climate favors 'competitive medicine' and reduced government support. Seen in this context PACS promises to speed the transition of Health Care Services into a business with tight resource management, cost accounting and marketing. The following paper analyzes cost and revenue in a typical larger Radiology Department, projects various scenarios of cost reduction by means of digital technology and concludes with cautious optimism that the investment expenses for a PACS will be justified in the near future by prudent utilization of high technology.
Investment alternative: the status quo or PACS?
NASA Astrophysics Data System (ADS)
Vanden Brink, John A.; Cywinski, Jozef K.
1990-08-01
While the cost of Picture Archiving and Communication Systems (PACS) can be substantial, the cost of continuing with present manual methods may become prohibitive in growing departments as the need for additional space and personnel (both technical and professional) to meet the increasing requirements for all image management activities continues to grow. This will occur simultaneously with increasing pressures from problems of the present system, i.e., lost films, lost revenues, delayed reporting, and longer diagnostic cycle times. Present methods of image archiving, communication, and management, i.e., the relationship of procedure volume to FTE requirements for professional and technical personnel, costs of film, film storage space, and other performance factors, are analyzed based on the database created by the Technology Marketing Group (TMG) computerized cost analysis model applied to over 50 US hospitals. Also, the model is used to project the cost of present methods of film management for an average US 400+ bed hospital based on ten-year growth rate assumptions. TMG PACS tracking data confirm the correlation of staffing patterns to procedure volume. The data presented in the paper provide a basis for comparing the investment in maintaining the status quo to an investment in PACS.
Maintaining a legal status for filmless archived digital medical images
NASA Astrophysics Data System (ADS)
Shani, Uri
2001-08-01
Most medical images today are generated digitally before exposure on film. In hospitals that employ Picture Archiving and Communication Systems (PACS), the images are also stored and managed digitally. Film copies of images are still widely used, but the new generation of filmless hospitals tends to minimize the production of films unless deemed necessary or required by patients or third parties. There are basically two main reasons for working with films in 'filmless' hospitals. One is that these are, in fact, 'less film' hospitals because of the film-oriented environment in which they operate, an environment which has not yet entered the PACS and DICOM era, neither in operation nor in intercommunication. The other reason is that films are needed for legal purposes as the sole record of the medical image evidence used during diagnosis. PACS offer numerous advantages but have a high entry cost, which can be balanced by savings in film production and handling. However, as long as films are mandatory, they do not help to lower the inhibiting cost of PACS, and the use of films prevails.
Mega-precovery and data mining of near-Earth asteroids and other Solar System objects
NASA Astrophysics Data System (ADS)
Popescu, M.; Vaduvescu, O.; Char, F.; Curelaru, L.; Euronear Team
2014-07-01
The vast collection of CCD images and photographic plate archives available from the world-wide archives and telescopes is still insufficiently exploited. Within the EURONEAR project we designed two data mining tools with the purpose of searching very large collections of archives for images which serendipitously include known asteroids or comets in their field, with the main aims of extending the arc and improving the orbits. In this sense, "Precovery" (published in 2008, aiming to search all known NEAs in a few archives via IMCCE's SkyBoT server) and "Mega-Precovery" (published in 2010, querying the IMCCE's Miriade server) were made available to the community via the EURONEAR website (euronear.imcce.fr). Briefly, Mega-Precovery aims to search one or a few known asteroids or comets in a mega-collection including millions of images from some of the largest observatory archives: ESO (15 instruments served by the ESO Archive, including VLT), NVO (8 instruments served by the U.S. NVO Archive), CADC (11 instruments, including HST and Gemini), plus other important instrument archives: SDSS, CFHTLS, INT-WFC, Subaru-SuprimeCam and AAT-WFI, adding up to 39 instruments and 4.3 million images (Mar 2014), and our Mega-Archive is growing. Here we present some of the most important results obtained with our data mining tools and some new planned search options of Mega-Precovery. In particular, the following capabilities will be added soon: the ING archive (all imaging cameras) will be included and new search options will be made available (such as query by orbital elements and by observations) to be able to target new Solar System objects such as Virtual Impactors, bolides, planetary satellites, and TNOs (besides the comets added recently). In order to better characterize the archives, we introduce the "AOmegaA" factor (archival etendue), proportional to the AOmega (etendue) and the number of images in an archive. With the aim of enlarging the Mega-Archive database, we invite observatories (particularly those storing their images online and also those that own plate archives which could be scanned on request) to contact us in order to add their instrument archives (consisting of an ASCII file with telescope pointings in a simple format) to our Mega-Precovery open project. In the future we intend to synchronise our service with the Virtual Observatory.
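A minimal sketch of the "AOmegaA" figure of merit defined above follows; because the abstract only states proportionality, the constant is taken as 1 and the sample instrument values are invented.

```python
# Minimal sketch of the "AOmegaA" (archival etendue) figure of merit:
# proportional to an instrument's etendue (A * Omega) and the number of images
# in its archive. The constant k and the example values are illustrative.
def archival_etendue(aperture_area_m2, field_of_view_deg2, n_images, k=1.0):
    etendue = aperture_area_m2 * field_of_view_deg2      # AOmega, m^2 deg^2
    return k * etendue * n_images

# e.g. a 4 m^2 collecting area, 1 deg^2 camera, 300,000 archived exposures:
print(archival_etendue(4.0, 1.0, 300_000))
```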
NASA Astrophysics Data System (ADS)
Wilson, Dennis L.; Glicksman, Robert A.
1994-05-01
A Picture Archiving and Communications System (PACS) must be able to support the image rate of the medical treatment facility. In addition, the PACS must have adequate working storage and archive storage capacity. The calculation of the number of images per minute and of the capacity of working storage and archive storage is discussed. The calculation takes into account the distribution of images over the different sizes of radiological images, the distribution between inpatients and outpatients, and the distribution over plain-film CR images and other modality images. The indirect clinical image load is difficult to estimate and is considered in some detail. The result of the exercise for a particular hospital is an estimate of the average size of the images and exams on the system, of the number of gigabytes of working storage, of the number of images moved per minute, of the size of the archive in gigabytes, and of the number of images that are to be moved by the archive per minute. The types of storage required to support the image rates and the capacity required are discussed.
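The sketch below shows the shape of the sizing calculation described in the abstract: an assumed modality mix is rolled up into an average image size, a peak image rate, and working and archive storage estimates. Every number in the table is an illustrative assumption, not data from the paper.

```python
# Sketch of the sizing arithmetic described above. The modality mix, exam
# volumes and image sizes are invented for illustration.
modalities = {
    #               exams/day, images/exam, MB/image
    "CR plain film": (300, 3, 10.0),
    "CT":            (60, 300, 0.5),
    "MR":            (40, 200, 0.13),
    "US":            (80, 40, 0.3),
}

def pacs_sizing(mix, working_days=30, peak_hours=8):
    images_per_day = sum(e * i for e, i, _ in mix.values())
    gb_per_day = sum(e * i * mb for e, i, mb in mix.values()) / 1024
    return {
        "avg image size (MB)": round(gb_per_day * 1024 / images_per_day, 2),
        "images/minute (peak hours)": round(images_per_day / (peak_hours * 60), 1),
        "working storage (GB)": round(gb_per_day * working_days),
        "archive growth (GB/year)": round(gb_per_day * 365),
    }

print(pacs_sizing(modalities))
```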
TheHiveDB image data management and analysis framework.
Muehlboeck, J-Sebastian; Westman, Eric; Simmons, Andrew
2014-01-06
The hive database system (theHiveDB) is a web-based brain imaging database, collaboration, and activity system which has been designed as an imaging workflow management system capable of handling cross-sectional and longitudinal multi-center studies. It can be used to organize and integrate existing data from heterogeneous projects as well as data from ongoing studies. It has been conceived to guide and assist the researcher throughout the entire research process, integrating all relevant types of data across modalities (e.g., brain imaging, clinical, and genetic data). TheHiveDB is a modern activity and resource management system capable of scheduling image processing on both private compute resources and the cloud. The activity component supports common image archival and management tasks as well as established pipeline processing (e.g., Freesurfer for extraction of scalar measures from magnetic resonance images). Furthermore, via the theHiveDB activity system, algorithm developers may grant access to virtual machines hosting versioned releases of their tools to collaborators and the imaging community. The application of theHiveDB is illustrated with a brief use case based on organizing, processing, and analyzing data from the publicly available Alzheimer Disease Neuroimaging Initiative.
Kingfisher: a system for remote sensing image database management
NASA Astrophysics Data System (ADS)
Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.
2003-04-01
At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing number of images collected by the ground stations of Earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases with an innovative search tool based on image similarity. This methodology is quite innovative for this application; many systems exist for photographic images, for example QBIC and IKONA, but they are not able to properly extract and describe remote sensing image content. The target database is an archive of images originating from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without loss, independently of image resolution. The latter property allows the DBMS (Database Management System) to process a small amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of image dimensions (width and height). Local and global content descriptors are compared with the query image during the retrieval phase, and the results seem to be very encouraging.
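As a rough illustration of content-based retrieval (not the Kingfisher descriptors), the sketch below summarizes each archived image with a few global texture statistics and ranks archive entries by distance to a query image.

```python
# Illustrative sketch of the retrieval idea (not the Kingfisher descriptors):
# each archived image is summarized by a few global texture statistics, and a
# query image is matched to its nearest neighbours in that feature space.
import numpy as np

def texture_features(img):
    """img: 2-D grayscale array. Returns a small global texture descriptor."""
    gy, gx = np.gradient(img.astype(float))
    return np.array([
        img.mean(),                      # overall brightness
        img.std(),                       # contrast
        np.mean(np.hypot(gx, gy)),       # edge/texture energy
    ])

def retrieve(query_img, archive):
    """archive: list of (image_id, feature_vector). Returns ids by similarity."""
    q = texture_features(query_img)
    dists = [(np.linalg.norm(q - f), image_id) for image_id, f in archive]
    return [image_id for _, image_id in sorted(dists)]

# Descriptors are computed once at ingest (e.g. on quick-look images) and only
# the small feature vectors are compared at query time.
```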
Sensor Management for Tactical Surveillance Operations
2007-11-01
Excerpt from a sensor table: active and passive sonar for submarine and torpedo detection and mine avoidance [range, bearing], range 1.8 km to 55 km, active or passive; AN/SLQ-501 ... direction-finding (DF) unit [bearing, classification], maximum range 1100 km, passive; cameras (daylight/night-vision, video and still), which record optical and ... infrared still images or motion video of events for near-real-time assessment or long-term analysis and archiving, with range limited by the image resolution.
Distributed file management for remote clinical image-viewing stations
NASA Astrophysics Data System (ADS)
Ligier, Yves; Ratib, Osman M.; Girard, Christian; Logean, Marianne; Trayser, Gerhard
1996-05-01
The Geneva PACS is based on a distributed architecture, with different archive servers used to store all the image files produced by digital imaging modalities. Images can then be visualized on different display stations with the Osiris software. Image visualization requires the image file to be physically present on the local station. Thus, images must be transferred from archive servers to local display stations in an acceptable way, meaning fast and user friendly, with the notion of a file hidden from users. The transfer of image files is done according to different schemes, including prefetching and direct image selection. Prefetching allows the retrieval of previous studies of a patient in advance. Direct image selection is also provided in order to retrieve images on request. When images are transferred locally to the display station, they are stored in Papyrus files, each containing a set of images. File names are used by the Osiris viewing software to open image sequences, but file names alone are not explicit enough to properly describe the content of a file. A specific utility has been developed to present a list of patients and, for each patient, a list of exams that can be selected and automatically displayed. The system has been successfully tested in different clinical environments. It will soon be extended hospital-wide.
A client/server system for Internet access to biomedical text/image databanks.
Thoma, G R; Long, L R; Berman, L E
1996-01-01
Internet access to mixed text/image databanks is finding application in the medical world. An example is a database of medical X-rays and associated data consisting of demographic, socioeconomic, physician's exam, medical laboratory and other information collected as part of a nationwide health survey conducted by the government. Another example is a collection of digitized cryosection images, CT and MR taken of cadavers as part of the National Library of Medicine's Visible Human Project. In both cases, the challenge is to provide access to both the image and the associated text for a wide end user community to create atlases, conduct epidemiological studies, to develop image-specific algorithms for compression, enhancement and other types of image processing, among many other applications. The databanks mentioned above are being created in prototype form. This paper describes the prototype system developed for the archiving of the data and the client software to enable a broad range of end users to access the archive, retrieve text and image data, display the data and manipulate the images. System design considerations include; data organization in a relational database management system with object-oriented extensions; a hierarchical organization of the image data by different resolution levels for different user classes; client design based on common hardware and software platforms incorporating SQL search capability, X Window, Motif and TAE (a development environment supporting rapid prototyping and management of graphic-oriented user interfaces); potential to include ultra high resolution display monitors as a user option; intuitive user interface paradigm for building complex queries; and contrast enhancement, magnification and mensuration tools for better viewing by the user.
Automated Content Detection for Cassini Images
NASA Astrophysics Data System (ADS)
Stanboli, A.; Bue, B.; Wagstaff, K.; Altinok, A.
2017-06-01
NASA missions generate numerous images that are organized in increasingly large archives. These archives are currently not searchable by image content. We present an automated content detection prototype that can enable content-based search.
Tools to manage the enterprise-wide picture archiving and communications system environment.
Lannum, L M; Gumpf, S; Piraino, D
2001-06-01
The presentation will focus on the implementation and utilization of a central picture archiving and communications system (PACS) network-monitoring tool that allows for enterprise-wide operations management and support of the image distribution network. The MagicWatch (Siemens, Iselin, NJ) PACS/radiology information system (RIS) monitoring station from Siemens has allowed our organization to create a service support structure that has given us proactive control of our environment and has allowed us to meet the service level performance expectations of the users. The Radiology Help Desk has used the MagicWatch PACS monitoring station as an applications support tool that has allowed the group to monitor network activity and individual systems performance at each node. Fast and timely recognition of the effects of single events within the PACS/RIS environment has allowed the group to proactively recognize possible performance issues and resolve problems. The PACS/operations group performs network management control, image storage management, and software distribution management from a single, central point in the enterprise. The MagicWatch station allows for the complete automation of software distribution, installation, and configuration process across all the nodes in the system. The tool has allowed for the standardization of the workstations and provides a central configuration control for the establishment and maintenance of the system standards. This report will describe the PACS management and operation prior to the implementation of the MagicWatch PACS monitoring station and will highlight the operational benefits of a centralized network and system-monitoring tool.
A Dependable Massive Storage Service for Medical Imaging.
Núñez-Gaona, Marco Antonio; Marcelín-Jiménez, Ricardo; Gutiérrez-Martínez, Josefina; Aguirre-Meneses, Heriberto; Gonzalez-Compean, José Luis
2018-05-18
We present the construction of Babel, a distributed storage system that meets stringent requirements on dependability, availability, and scalability. Together with Babel, we developed an application that uses our system to store medical images. Accordingly, we show the feasibility of our proposal as an alternative solution for massive scientific storage and describe the software architecture style that manages the DICOM image life cycle, utilizing Babel as a virtual local storage component for a picture archiving and communication system (the PACS-Babel interface). Furthermore, we describe the communication interface in the Unified Modeling Language (UML) and show how it can be extended to manage the considerable work associated with data migration processes on a PACS in the case of updates or disaster recovery.
Clinical experiences with an ASP model backup archive for PACS images
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Cao, Fei; Documet, Luis; Huang, H. K.; Muldoon, Jean
2003-05-01
Last year we presented a Fault-Tolerant Backup Archive using an Application Service Provider (ASP) model for disaster recovery. The purpose of this paper is to update and provide clinical experiences related to implementing the ASP model archive solution for short-term backup of clinical PACS image data as well as possible applications other than disaster recovery. The ASP backup archive provides instantaneous, automatic backup of acquired PACS image data and instantaneous recovery of stored PACS image data, all at a low operational cost and with little human intervention. This solution can be used for a variety of scheduled and unscheduled downtimes that occur on the main PACS archive. A backup archive server with hierarchical storage was implemented offsite from the main PACS archive location. Clinical data from a hospital PACS is sent to this ASP storage server in parallel to the exams being archived in the main server. Initially, connectivity between the main archive and the ASP storage server is established via a T-1 connection. In the future, other more cost-effective means of connectivity will be researched, such as Internet2. We have integrated the ASP model backup archive with a clinical PACS at Saint John's Health Center, and it has been operational for over 6 months. Pitfalls encountered during integration with a live clinical PACS and the impact on clinical workflow will be discussed. In addition, estimations of the cost of establishing such a solution as well as the cost charged to the users will be included. Clinical downtime scenarios, such as a scheduled mandatory downtime and an unscheduled downtime due to a disaster event affecting the main archive, were simulated and the PACS exams were sent successfully from the offsite ASP storage server back to the hospital PACS in less than 1 day. The ASP backup archive was able to recover PACS image data for comparison studies with no complex operational procedures. Furthermore, no image data loss was encountered during the recovery. During any clinical downtime scenario, the ASP backup archive server can repopulate a clinical PACS quickly, with the majority of studies available for comparison during the interim until the main PACS archive is fully recovered.
Bridging the gap between data, publications, and images
NASA Astrophysics Data System (ADS)
Ritchey, N. A.; Collins, D.; Sprain, M.
2017-12-01
NOAA's National Centers for Environmental Information (NCEI) manages the most comprehensive, accessible, and trusted source of environmental data and information in the US. It archives data from the depths of the ocean to the surface of the sun and from million-year-old sediment records to near real-time satellite observations. NCEI has a wealth of knowledge and experience in long-term data preservation with the goal of supporting today's scientists as well as future generations. In order to reduce fragmentation of data, publications, images, and documentation, and to improve preservation, curation, and stewardship of data, NCEI continues to partner with the NOAA Central Library (NCL). NCEI and NCL have long-established linkages between data metadata, published reports, and data or archival information packages (AIP). We also have analog AIPs that are stored and maintained in the NCL collection and discoverable in both NCEI and NCL collections via the AIP identifier. We are currently working with NCL to establish a workflow for submitting reports to their Institutional Repository and linking the data and report via digital object identifiers. We hope to establish linkages between images of physical samples and the NCL Photo Collection management infrastructure in the future. This presentation will detail how NCEI engages with the NCL in order to fully integrate documentation, images, publications, and data in preservation practices and improve the discovery and usability of NOAA's billion dollar investment in environmental data and information.
Image acquisition unit for the Mayo/IBM PACS project
NASA Astrophysics Data System (ADS)
Reardon, Frank J.; Salutz, James R.
1991-07-01
The Mayo Clinic and IBM Rochester, Minnesota, have jointly developed a picture archiving, distribution and viewing system for use with Mayo's CT and MRI imaging modalities. Images are retrieved from the modalities and sent over the Mayo city-wide token ring network to optical storage subsystems for archiving, and to server subsystems for viewing on image review stations. Images may also be retrieved from the archive and transmitted back to the modalities. The subsystems that interface with the modalities and communicate with the other components of the system are termed Image Acquisition Units (IAUs). The IAUs are IBM Personal System/2 (PS/2) computers with specially developed software. They operate independently in a network of cooperative subsystems and communicate with the modalities, archive subsystems, image review server subsystems, and a central subsystem that maintains information about the content and location of images. This paper provides a detailed description of the function and design of the Image Acquisition Units.
NASA Astrophysics Data System (ADS)
Linneman, S. R.
2017-12-01
Community-scientist partnerships take many forms. In the northwest corner of Washington state, a large, active, serpentinitic earthflow has, for decades, shed >25,000 m^3/yr of asbestos-rich sediment into a small agricultural stream system. While the landslide, which moves 3 m/yr, and its unusual sediment have attracted much scientific interest, the situation also presents a great opportunity for community-scientist partnerships. The Swift Creek Landslide Observatory (SCLO) (http://landslide.geol.wwu.edu) is a partnership between scientists and technical staff at Western Washington University + local landowners + the state Department of Ecology + Whatcom County Public Works + a local video security firm. SCLO maintains two remote webcams from which current images are posted to the SCLO website hourly. Users can also view archived images from the cameras, create image-compare visualizations, and create time-lapse movies from the eight-year image archive. SCLO is used by local emergency managers and residents to evaluate the threat of debris flows and floods. It is also used by educators to dramatically illustrate hillslope evolution at a variety of time scales.
A novel method for efficient archiving and retrieval of biomedical images using MPEG-7
NASA Astrophysics Data System (ADS)
Meyer, Joerg; Pahwa, Ash
2004-10-01
Digital archiving and efficient retrieval of radiological scans have become critical steps in contemporary medical diagnostics. Since more and more images and image sequences (single scans or video) from various modalities (CT/MRI/PET/digital X-ray) are now available in digital formats (e.g., DICOM-3), hospitals and radiology clinics need to implement efficient protocols capable of managing the enormous amounts of data generated daily in a typical clinical routine. We present a method that appears to be a viable way to eliminate the tedious step of manually annotating image and video material for database indexing. MPEG-7 is a new framework that standardizes the way images are characterized in terms of color, shape, and other abstract, content-related criteria. A set of standardized descriptors that are automatically generated from an image is used to compare an image to other images in a database, and to compute the distance between two images for a given application domain. Text-based database queries can be replaced with image-based queries using MPEG-7. Consequently, image queries can be conducted without any prior knowledge of the keys that were used as indices in the database. Since the decoding and matching steps are not part of the MPEG-7 standard, this method also enables searches that were not planned by the time the keys were generated.
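For illustration, the sketch below uses a normalized intensity histogram as a stand-in for an MPEG-7 content descriptor and an L1 distance as the application-defined matching function; it is not an MPEG-7 implementation.

```python
# Illustrative sketch of descriptor-based matching: a normalized gray-level
# histogram stands in for an MPEG-7 content descriptor, and an L1 distance
# stands in for the (application-defined) matching function the standard
# leaves open. Not an MPEG-7 implementation.
import numpy as np

def descriptor(img, bins=64):
    """img: 2-D array of pixel values. Returns a normalized intensity histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)

def distance(d1, d2):
    return float(np.abs(d1 - d2).sum())      # small distance = similar content

def query(query_img, indexed):               # indexed: {image_id: descriptor}
    q = descriptor(query_img)
    return sorted(indexed, key=lambda image_id: distance(q, indexed[image_id]))
```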
Serendipia: Castilla-La Mancha telepathology network
Peces, Carlos; García-Rojo, Marcial; Sacristán, José; Gallardo, Antonio José; Rodríguez, Ambrosio
2008-01-01
Nowadays, there is no standard solution for the acquisition, archiving, and communication of digital pathology images. In addition, no commercial pathology information system (LIS) can manage the relationship between the reports generated by the pathologist and their corresponding images. Due to this situation, the Healthcare Service of Castilla-La Mancha decided to create a completely digital pathology department in a project called SERENDIPIA. The SERENDIPIA project provides all the image acquisition devices needed to cover every kind of image that can be generated in a pathology department. In addition, an information system was developed within the SERENDIPIA project that, on the one hand, covers the daily workflow of a pathology department (including the storage and management of the reports and their images) and, on the other hand, provides a web telepathology portal with collaborative tools such as second opinion. PMID:18673519
Levine, Betty A; Ingeholm, Mary Lou; Prior, Fred; Mun, Seong K; Freedman, Matthew; Weissman, David; Attfield, Michael; Wolfe, Anita; Petsonk, Edward
2009-01-01
To protect the health of active U.S. underground coal miners, the National Institute for Occupational Safety and Health (NIOSH) has a mandate to carry out surveillance for coal workers' pneumoconiosis, commonly known as Black Lung (PHS 2001). This is accomplished by reviewing chest x-ray films obtained from miners at approximately 5-year intervals in approved x-ray acquisition facilities around the country. Currently, digital chest images are not accepted. Because most chest x-rays are now obtained in digital format, NIOSH is redesigning the surveillance program to accept and manage digital x-rays. This paper highlights the functional and security requirements for a digital image management system for a surveillance program. It also identifies the operational differences between a digital imaging surveillance network and a clinical Picture Archiving Communication Systems (PACS) or teleradiology system.
A Medical Image Backup Architecture Based on a NoSQL Database and Cloud Computing Services.
Santos Simões de Almeida, Luan Henrique; Costa Oliveira, Marcelo
2015-01-01
The use of digital systems for storing medical images generates a huge volume of data. Digital images are commonly stored and managed on a Picture Archiving and Communication System (PACS), under the DICOM standard. However, PACS is limited because it is strongly dependent on the server's physical space. Alternatively, Cloud Computing arises as an extensive, low cost, and reconfigurable resource. However, medical images contain patient information that can not be made available in a public cloud. Therefore, a mechanism to anonymize these images is needed. This poster presents a solution for this issue by taking digital images from PACS, converting the information contained in each image file to a NoSQL database, and using cloud computing to store digital images.
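The anonymization step mentioned above might look like the following sketch, which strips a few obvious patient identifiers from a DICOM file with pydicom before any copy leaves the hospital; a real deployment would have to apply a complete de-identification profile rather than this short list.

```python
# Illustrative sketch of the anonymization step described above: strip a few
# obvious patient identifiers from a DICOM file with pydicom before any copy
# leaves the hospital. A real deployment must apply a complete
# de-identification profile, not just this short list of tags.
from pydicom import dcmread

IDENTIFYING_TAGS = ["PatientName", "PatientID", "PatientBirthDate",
                    "PatientAddress", "ReferringPhysicianName"]

def anonymize(in_path, out_path, replacement="ANONYMOUS"):
    ds = dcmread(in_path)
    for tag in IDENTIFYING_TAGS:
        if tag in ds:                 # pydicom supports membership by keyword
            ds.data_element(tag).value = replacement
    ds.remove_private_tags()          # drop vendor-specific private elements
    ds.save_as(out_path)

# anonymize("chest_ct.dcm", "chest_ct_anon.dcm")
```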
The Panchromatic STARBurst IRregular Dwarf Survey (STARBIRDS): Observations and Data Archive
NASA Astrophysics Data System (ADS)
McQuinn, Kristen B. W.; Mitchell, Noah P.; Skillman, Evan D.
2015-06-01
Understanding star formation in resolved low mass systems requires the integration of information obtained from observations at different wavelengths. We have combined new and archival multi-wavelength observations on a set of 20 nearby starburst and post-starburst dwarf galaxies to create a data archive of calibrated, homogeneously reduced images. Named the panchromatic “STARBurst IRregular Dwarf Survey” archive, the data are publicly accessible through the Mikulski Archive for Space Telescopes. This first release of the archive includes images from the Galaxy Evolution Explorer Telescope (GALEX), the Hubble Space Telescope (HST), and the Spitzer Space Telescope (Spitzer) Multiband Imaging Photometer instrument. The data sets include flux calibrated, background subtracted images, that are registered to the same world coordinate system. Additionally, a set of images are available that are all cropped to match the HST field of view. The GALEX and Spitzer images are available with foreground and background contamination masked. Larger GALEX images extending to 4 times the optical extent of the galaxies are also available. Finally, HST images convolved with a 5″ point spread function and rebinned to the larger pixel scale of the GALEX and Spitzer 24 μm images are provided. Future additions are planned that will include data at other wavelengths such as Spitzer IRAC, ground-based Hα, Chandra X-ray, and Green Bank Telescope H i imaging. Based on observations made with the NASA/ESA Hubble Space Telescope, and obtained from the Hubble Legacy Archive, which is a collaboration between the Space Telescope Science Institute (STScI/NASA), the Space Telescope European Coordinating Facility (ST-ECF/ESA), and the Canadian Astronomy Data Centre (CADC/NRC/CSA).
A centralized platform for geo-distributed PACS management.
Silva, Luís A Bastião; Pinho, Renato; Ribeiro, Luís S; Costa, Carlos; Oliveira, José Luís
2014-04-01
Picture Archive and Communication System (PACS) is a globally adopted concept and plays a fundamental role in patient care flow within healthcare institutions. However, the deployment of medical imaging repositories over multiple sites still brings several practical challenges namely related to operation and management (O&M). This paper describes a Web-based centralized console that provides remote monitoring, testing, and management over multiple geo-distributed PACS. The system allows the PACS administrator to define any kind of service or operation, reducing the need for local technicians and providing a 24/7 monitoring solution.
Medical image digital archive: a comparison of storage technologies
NASA Astrophysics Data System (ADS)
Chunn, Timothy; Hutchings, Matt
1998-07-01
A cost effective, high capacity digital archive system is one of the remaining key factors that will enable a radiology department to eliminate film as an archive medium. The ever increasing amount of digital image data is creating the need for huge archive systems that can reliably store and retrieve millions of images and hold from a few terabytes of data to possibly hundreds of terabytes. Selecting the right archive solution depends on a number of factors: capacity requirements, write and retrieval performance requirements, scalability in capacity and performance, conformance to open standards, archive availability and reliability, security, cost, achievable benefits and cost savings, investment protection, and more. This paper addresses many of these issues. It compares and positions optical disk and magnetic tape technologies, which are the predominant archive media today. New technologies will be discussed, such as DVD and high performance tape. Price and performance comparisons will be made at different archive capacities, plus the effect of file size on random and pre-fetch retrieval time will be analyzed. The concept of automated migration of images from high performance RAID disk storage devices to high capacity Nearline storage devices will be introduced as a viable way to minimize overall storage costs for an archive.
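The sketch below illustrates the kind of price/performance comparison discussed in the paper, combining cost per gigabyte with a latency-plus-transfer retrieval model; the dollar figures and timings are placeholders, not the paper's measurements.

```python
# Sketch of a price/performance comparison for archive media. The cost and
# timing figures are placeholders; the point is the shape of the calculation
# (cost per GB plus file-size-dependent retrieval time).
media = {
    #               $/GB,  mount/seek s,  MB/s transfer
    "RAID disk":    (0.50,   0.0,          80.0),
    "optical disk": (0.15,  12.0,           4.0),
    "tape library": (0.05,  60.0,          10.0),
}

def compare(archive_gb, file_mb):
    for name, (cost_gb, latency_s, rate_mb_s) in media.items():
        total_cost = archive_gb * cost_gb
        retrieval_s = latency_s + file_mb / rate_mb_s
        print(f"{name:13s} ${total_cost:>10,.0f}  ~{retrieval_s:5.1f} s per {file_mb} MB file")

compare(archive_gb=20_000, file_mb=10)   # a 20 TB archive, 10 MB CR image
```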
The Role of Archives and Records Management in National Information Systems: A RAMP Study.
ERIC Educational Resources Information Center
Rhoads, James B.
Produced as part of the United Nations Educational, Scientific, and Cultural Organization (UNESCO) Records and Archives Management Programme (RAMP), this publication provides information about the essential character and value of archives and about the procedures and programs that should govern the management of both archives and current records,…
Harrison, Arnell S.; Dadisman, Shawn V.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.
2009-01-01
From September 2 through 4, 2008, the U.S. Geological Survey and St. Johns River Water Management District (SJRWMD) conducted geophysical surveys in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, central Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, FACS logs, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Concepts for image management and communication system for space vehicle health management
NASA Astrophysics Data System (ADS)
Alsafadi, Yasser; Martinez, Ralph
On a space vehicle, the Crew Health Care System will handle minor accidents or illnesses immediately, thereby eliminating the necessity of early mission termination or emergency rescue. For practical reasons, only trained personnel with limited medical experience can be available on space vehicles to render preliminary health care. There is the need to communicate with medical experts at different locations on earth. Interplanetary Image Management and Communication System (IIMACS) will be a bridge between worlds and deliver medical images acquired in space to physicians at different medical centers on earth. This paper discusses the implementation of IIMACS by extending the Global Picture Archiving and Communication System (GPACS) being developed to interconnect medical centers on earth. Furthermore, this paper explores system requirements of IIMACS and different user scenarios. Our conclusion is that IIMACS is feasible using the maturing technology base of GPACS.
Enterprise-wide PACS: beyond radiology, an architecture to manage all medical images.
Bandon, David; Lovis, Christian; Geissbühler, Antoine; Vallée, Jean-Paul
2005-08-01
Picture archiving and communication systems (PACS) have the vocation to manage all medical images acquired within the hospital. To address the various situations encountered in the imaging specialties, the traditional architecture used for the radiology department has to evolve. We present our preliminary results toward an enterprise-wide PACS intended to support all kinds of image production in medicine, from biomolecular images to whole-body pictures. Our solution is based on an existing radiologic PACS from which images are distributed through an electronic patient record to all care facilities. This platform is enriched with a flexible integration framework supporting the digital imaging and communications in medicine (DICOM) and DICOM-XML formats. In addition, a highly customizable generic workflow engine is used to drive work processes. Echocardiology; hematology; ear, nose, and throat; and dermatology, including wound follow-up, are the first extensions implemented outside of radiology. We also propose a global strategy for further developments based on three possible architectures for an enterprise-wide PACS.
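The DICOM-to-XML bridge mentioned above can be approximated in a few lines of Python. The sketch below assumes the third-party pydicom package (not named in the paper) and renders the top-level data elements of a DICOM object as a simple XML document; it is a simplified illustration, not the authors' DICOM-XML integration framework.

```python
import xml.etree.ElementTree as ET

import pydicom  # third-party DICOM toolkit; an assumption, not named in the paper

def dataset_to_xml(path):
    """Render the top-level data elements of a DICOM file as a simple XML document."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    root = ET.Element("DicomDataset")
    for elem in ds:
        if elem.VR == "SQ":   # skip nested sequences in this minimal sketch
            continue
        node = ET.SubElement(root, "Element",
                             tag=str(elem.tag), keyword=elem.keyword or "unknown")
        node.text = str(elem.value)
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    print(dataset_to_xml("example.dcm"))  # hypothetical file name
```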
ERIC Educational Resources Information Center
Keough, Brian; Wolfe, Mark
2012-01-01
This article discusses integrated approaches to the management and preservation of born digital photography. It examines the changing practices among photographers, and the needed relationships between the photographers using digital technology and the archivists responsible for acquiring their born digital images. Special consideration is given…
Cancer Digital Slide Archive (CDSA) | Informatics Technology for Cancer Research (ITCR)
The CDSA is a web-based platform to support the sharing, management, and analysis of digital pathology data. The Emory instance currently hosts over 23,000 images from The Cancer Genome Atlas, and the software is being developed within the ITCR grant to be deployable as a digital pathology platform for other labs and cancer institutes.
Radiologic image communication and archive service: a secure, scalable, shared approach
NASA Astrophysics Data System (ADS)
Fellingham, Linda L.; Kohli, Jagdish C.
1995-11-01
The Radiologic Image Communication and Archive (RICA) service is designed to provide a shared archive for medical images to the widest possible audience of customers. Images are acquired from a number of different modalities, each available from many different vendors. Images are acquired digitally from those modalities that support direct digital output, and by digitizing films for projection x-ray exams. The RICA Central Archive receives standard DICOM 3.0 messages and data streams from the medical imaging devices at customer institutions over the public telecommunication network. RICA represents a completely scalable resource. Users pay only for what they use today, with the full assurance that, as the volume of image data they wish to send to the archive increases, the capacity will be there to accept it. Providing this seamless scalability imposes several requirements on the RICA architecture: (1) RICA must support the full array of transport services. (2) The Archive Interface must scale cost-effectively to support local networks that range from the very small (one x-ray digitizer in a medical clinic) to the very large and complex (a large hospital with several CTs, MRs, nuclear medicine devices, ultrasound machines, CRs, and x-ray digitizers). (3) The Archive Server must scale cost-effectively to support rapidly increasing demands for service, providing storage for and access to records of millions of patients and hundreds of millions of images. The architecture must support the incorporation of improved technology as it becomes available in order to maintain performance and remain cost-effective as demand rises.
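A present-day approximation of such an ingest interface, accepting standard DICOM storage requests over the network, can be sketched with the pynetdicom and pydicom packages. Both libraries, the AE title, and the port number are assumptions for illustration only; they are not the RICA implementation.

```python
import os

from pynetdicom import AE, evt, AllStoragePresentationContexts  # assumed toolkit

ARCHIVE_DIR = "archive"  # illustrative local spool directory

def handle_store(event):
    """Write each received composite object to disk, keyed by SOP Instance UID."""
    ds = event.dataset
    ds.file_meta = event.file_meta
    os.makedirs(ARCHIVE_DIR, exist_ok=True)
    ds.save_as(os.path.join(ARCHIVE_DIR, f"{ds.SOPInstanceUID}.dcm"),
               write_like_original=False)
    return 0x0000  # Success status

if __name__ == "__main__":
    ae = AE(ae_title="RICA_SCP")  # hypothetical AE title
    ae.supported_contexts = AllStoragePresentationContexts
    ae.start_server(("0.0.0.0", 11112), block=True,
                    evt_handlers=[(evt.EVT_C_STORE, handle_store)])
```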
NASA Astrophysics Data System (ADS)
Siegel, Eliot L.; Reiner, Bruce I.
2001-08-01
To date, the majority of Picture Archival and Communication Systems (PACS) have been utilized only for capture, storage, and display of radiology and, in some cases, nuclear medicine images. Medical images for other subspecialty areas are currently stored in local, independent systems, which typically are not accessible throughout the healthcare enterprise and do not communicate with other hospital information or image management systems. It is likely that during the next few years, healthcare centers will expand PACS capability to incorporate these multimedia data or, alternatively, hospital-wide electronic patient record systems will be able to provide this function.
Content standards for medical image metadata
NASA Astrophysics Data System (ADS)
d'Ornellas, Marcos C.; da Rocha, Rafael P.
2003-12-01
Medical images are at the heart of healthcare diagnostic procedures. They provide not only a noninvasive means to view anatomical cross-sections of internal organs but also a means for physicians to evaluate the patient's diagnosis and monitor the effects of treatment. For a medical center, the emphasis may shift from the generation of images to post-processing and data management, since the medical staff may generate even more processed images and other data from the original image after various analyses and post-processing. A medical image data repository for health care information systems is becoming a critical need. This repository would contain comprehensive patient records, including information such as clinical data, related diagnostic images, and post-processed images. Due to the large volume and complexity of the data as well as the diversified user access requirements, the implementation of a medical image archive system will be a complex and challenging task. This paper discusses content standards for medical image metadata. It also focuses on the evaluation of image metadata content and on metadata quality management.
Perceptual Image Compression in Telemedicine
NASA Technical Reports Server (NTRS)
Watson, Andrew B.; Ahumada, Albert J., Jr.; Eckstein, Miguel; Null, Cynthia H. (Technical Monitor)
1996-01-01
The next era of space exploration, especially the "Mission to Planet Earth", will generate immense quantities of image data. For example, the Earth Observing System (EOS) is expected to generate in excess of one terabyte/day. NASA confronts a major technical challenge in managing this great flow of imagery: in collection, pre-processing, transmission to earth, archiving, and distribution to scientists at remote locations. Expected requirements in most of these areas clearly exceed current technology. Part of the solution to this problem lies in efficient image compression techniques. For much of this imagery, the ultimate consumer is the human eye. In this case image compression should be designed to match the visual capacities of the human observer. We have developed three techniques for optimizing image compression for the human viewer. The first consists of a formula, developed jointly with IBM and based on psychophysical measurements, that computes a DCT quantization matrix for any specified combination of viewing distance, display resolution, and display brightness. This DCT quantization matrix is used in most recent standards for digital image compression (JPEG, MPEG, CCITT H.261). The second technique optimizes the DCT quantization matrix for each individual image, based on the contents of the image. This is accomplished by means of a model of visual sensitivity to compression artifacts. The third technique extends the first two techniques to the realm of wavelet compression. Together these techniques will allow systematic perceptual optimization of image compression in NASA imaging systems. Many of the image management challenges faced by NASA are mirrored in the field of telemedicine. Here too there are severe demands for transmission and archiving of large image databases, and the imagery is ultimately used primarily by human observers, such as radiologists. In this presentation I will describe some of our preliminary explorations of the applications of our technology to the special problems of telemedicine.
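For context, the sketch below shows how a DCT quantization matrix is commonly derived from a quality setting using the standard IJG-style scaling of the JPEG Annex K luminance table. This is a generic stand-in, not the psychophysical model described above, which additionally accounts for viewing distance, display resolution, and display brightness.

```python
import numpy as np

# Standard JPEG (Annex K) luminance quantization table.
BASE_LUMA_Q = np.array([
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99],
])

def scaled_quant_matrix(quality):
    """IJG-style scaling of the base table for a quality setting in 1..100."""
    quality = max(1, min(100, quality))
    scale = 5000 / quality if quality < 50 else 200 - 2 * quality
    q = np.floor((BASE_LUMA_Q * scale + 50) / 100).astype(int)
    return np.clip(q, 1, 255)

if __name__ == "__main__":
    print(scaled_quant_matrix(75))
```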
Context indexing of digital cardiac ultrasound records in PACS
NASA Astrophysics Data System (ADS)
Lobodzinski, S. Suave; Meszaros, Georg N.
1998-07-01
Recent wide adoption of the DICOM 3.0 standard by ultrasound equipment vendors created a need for practical clinical implementations of cardiac imaging study visualization, management, and archiving. DICOM 3.0 defines only a logical and physical format for exchanging image data (still images, video, patient and study demographics). All DICOM-compliant imaging studies must presently be archived on a 650 MB recordable compact disc. This is a severe limitation for ultrasound applications, where studies 3 to 10 minutes long are common practice. In addition, DICOM digital echocardiography objects require physiological signal indexing, content segmentation, and characterization. Since DICOM 3.0 is an interchange standard only, it does not define how to database composite video objects. The goal of this research was therefore to address the issues of efficient storage, retrieval, and management of DICOM-compliant cardiac video studies in a distributed PACS environment. Our Web-based implementation has the advantage of accommodating both DICOM-defined entity-relation modules (equipment data, patient data, video format, etc.) in standard relational database tables and digitally indexed video with its attributes in an object-relational database. The object-relational data model facilitates content indexing of full-motion cardiac imaging studies through bi-directional hyperlink generation that ties searchable video attributes and related objects to individual video frames in the temporal domain. Benefits realized from the use of bi-directionally hyperlinked data models in an object-relational database include: (1) real-time video indexing during image acquisition, (2) random access and frame-accurate instant playback of previously recorded full-motion imaging data, and (3) time savings from faster and more accurate access to data through multiple navigation mechanisms such as multidimensional queries on an index, queries on a hyperlink attribute, free search, and browsing.
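The relational side of the indexing scheme described above can be sketched as a small schema linking studies, video objects, and frame-level attribute entries. The sqlite3 module stands in for the object-relational database, and all table and column names are illustrative assumptions.

```python
import sqlite3

# Minimal sketch: DICOM study attributes in conventional tables, plus frame-level
# index entries that "hyperlink" searchable attributes to individual video frames.
SCHEMA = """
CREATE TABLE IF NOT EXISTS study (
    study_uid   TEXT PRIMARY KEY,
    patient_id  TEXT,
    modality    TEXT,
    study_date  TEXT
);
CREATE TABLE IF NOT EXISTS video_object (
    video_id    INTEGER PRIMARY KEY,
    study_uid   TEXT REFERENCES study(study_uid),
    uri         TEXT,
    frame_count INTEGER
);
CREATE TABLE IF NOT EXISTS frame_index (
    video_id    INTEGER REFERENCES video_object(video_id),
    frame_no    INTEGER,
    attribute   TEXT,   -- e.g. 'end-diastole', 'color-Doppler on'
    value       TEXT,
    PRIMARY KEY (video_id, frame_no, attribute)
);
"""

def open_catalog(path="cardiac_index.db"):
    con = sqlite3.connect(path)
    con.executescript(SCHEMA)
    return con

if __name__ == "__main__":
    con = open_catalog()
    con.execute("INSERT OR IGNORE INTO study VALUES (?, ?, ?, ?)",
                ("1.2.840.1", "PAT001", "US", "1998-07-01"))  # illustrative values
    con.commit()
```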
NASA Technical Reports Server (NTRS)
Callicott, William M.
1993-01-01
The NOAA archives contain 150 terabytes of data in digital form, most of which are the high-volume GOES satellite image data. There are 630 databases containing 2,350 environmental variables. There are 375 million film records and 90 million paper records in addition to the digital database. The current data accession rate is 10 percent per year, and the number of users is increasing at a 10 percent annual rate. NOAA publishes 5,000 publications and distributes over one million copies to almost 41,000 paying customers. Each year, over six million records are key-entered from manuscript documents, and about 13,000 computer tapes and 40,000 satellite hardcopy images are entered into the archive. Early digital data were stored on punched cards and open-reel computer tapes. In the late seventies, an advanced helical-scan technology (AMPEX TBM) was implemented. Now, punched cards have disappeared, the TBM system was abandoned, most data stored on open-reel tapes have been migrated to 3480 cartridges, many specialized data sets were distributed on CD-ROMs, special archives are being copied to 12-inch optical WORM disks, 5 1/4-inch magneto-optical disks were employed for workstation applications, and 8 mm EXABYTE tapes are planned for major data collection programs. The rapid expansion of new data sets, some of which constitute large volumes of data, coupled with the need for vastly improved access mechanisms, portability, and improved longevity, are factors that will influence NOAA's future systems approaches to data management.
Utilization of a multimedia PACS workstation for surgical planning of epilepsy
NASA Astrophysics Data System (ADS)
Soo Hoo, Kent; Wong, Stephen T.; Hawkins, Randall A.; Knowlton, Robert C.; Laxer, Kenneth D.; Rowley, Howard A.
1997-05-01
Surgical treatment of temporal lobe epilepsy requires the localization of the epileptogenic zone for surgical resection. Currently, clinicians utilize electroencephalography, various neuroimaging modalities, and psychological tests together to determine the location of this zone. We investigate how a multimedia neuroimaging workstation built on top of the UCSF Picture Archiving and Communication System can be used to aid surgical planning of epilepsy and related brain diseases. This usage demonstrates the ability of the workstation to retrieve image and textual data from PACS and other image sources, register multimodality images, visualize and render 3D data sets, analyze images, generate new image and text data from the analysis, and organize all data in a relational database management system.
A Robust, Low-Cost Virtual Archive for Science Data
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Vollmer, Bruce
2005-01-01
Despite their expense, tape silos are still often the only affordable option for petabyte-scale science data archives, particularly when other factors such as data reliability, floor space, power, and cooling load are accounted for. However, the complexity, management software, hardware reliability, and access latency of tape silos make online data storage ever more attractive. Drastic reductions in the cost of mass-market PC disk drives help to make this more affordable (approximately $1/GB), but such systems are challenging to scale to the petabyte range and of questionable reliability for archival use. On the other hand, if much of the science archive could be "virtualized", i.e., produced on demand when requested by users, we would need to store only a fraction of the data online, perhaps bringing an online-only system into an affordable range. Radiance data from the satellite-borne Moderate Resolution Imaging Spectroradiometer (MODIS) instrument provide a good opportunity for such a virtual archive: the raw data amount to 140 GB/day, which is small relative to the 550 GB/day making up the radiance products. These data are routinely processed as inputs for geophysical parameter products and then archived on tape at the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC) for distribution to users. Virtualizing them would bring an immediate and significant reduction in the amount of data being stored in the tape archives and provide more customizable products. A prototype of such a virtual archive is being developed to prove the concept and to develop ways of incorporating the robustness that a science data archive requires.
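The control flow of such a virtual archive can be sketched as a cache-or-regenerate lookup: serve a derived product from a small online cache if present, otherwise reproduce it from the raw data on demand. Function and directory names below are placeholders, not the GES DAAC's actual software.

```python
import os

CACHE_DIR = "radiance_cache"  # small online cache of derived products (illustrative)

def regenerate_product(granule_id, raw_store):
    """Placeholder for reprocessing raw MODIS data into a radiance product.

    In the real system this would rerun the Level-1B processing chain; here we
    only illustrate the control flow.
    """
    raise NotImplementedError(f"reprocess {granule_id} from {raw_store}")

def fetch_product(granule_id, raw_store="raw_modis"):
    """Return a product path, producing it on demand if it is not already cached."""
    path = os.path.join(CACHE_DIR, f"{granule_id}.hdf")
    if os.path.exists(path):
        return path                                        # served from the online cache
    os.makedirs(CACHE_DIR, exist_ok=True)
    return regenerate_product(granule_id, raw_store)       # virtual: made when requested
```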
NASA Astrophysics Data System (ADS)
Driscoll, Brandon; Jaffray, David; Coolens, Catherine
2014-03-01
Purpose: To provide clinicians and researchers participating in multi-centre clinical trials with a central repository for large-volume dynamic imaging data, as well as a set of tools providing end-to-end testing and image analysis standards of practice. Methods: There are three main pieces to the data archiving and analysis system: the PACS server, the data analysis computer(s), and the high-speed networks that connect them. Each clinical trial is anonymized using a customizable anonymizer and is stored on a PACS accessible only by AE-title access control. The remote analysis station consists of a single virtual machine per trial running on a powerful PC supporting multiple simultaneous instances. Imaging data management and analysis are performed within ClearCanvas Workstation® using custom-designed plug-ins for kinetic modelling (The DCE-Tool®), quality assurance (The DCE-QA Tool), and RECIST. Results: A framework has been set up, currently serving seven clinical trials spanning five hospitals, with three more trials to be added over the next six months. After initial rapid image transfer (over 2 MB/s), all data analysis is done server-side, making it robust and rapid. This has provided the ability to perform computationally expensive operations, such as voxel-wise kinetic modelling on very large data archives (over 20 GB and 50k images per patient), remotely with minimal end-user hardware. Conclusions: This system is currently in its proof-of-concept stage but has been used successfully to send and analyze data from remote hospitals. Next steps will involve scaling up the system with a more powerful PACS and multiple high-powered analysis machines, as well as adding real-time review capabilities.
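The customizable anonymizer mentioned in the methods can be illustrated with a minimal pydicom sketch that blanks a handful of identifying attributes and strips private tags. pydicom is an assumption here, and a clinical-grade anonymizer would follow the full DICOM confidentiality profiles rather than this short list.

```python
import pydicom  # assumed toolkit; the trial framework's own anonymizer is not specified

# Attributes blanked in this simplified sketch; a real profile covers many more.
IDENTIFYING_KEYWORDS = ["PatientName", "PatientID", "PatientBirthDate",
                        "PatientAddress", "InstitutionName", "ReferringPhysicianName"]

def anonymize(in_path, out_path):
    """Blank common identifying attributes and drop vendor-private elements."""
    ds = pydicom.dcmread(in_path)
    for keyword in IDENTIFYING_KEYWORDS:
        if keyword in ds:
            setattr(ds, keyword, "")   # a real anonymizer substitutes consistent pseudonyms
    ds.remove_private_tags()
    ds.save_as(out_path)
    return out_path

if __name__ == "__main__":
    anonymize("study.dcm", "study_anon.dcm")  # hypothetical file names
```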
Scalable Data Mining and Archiving for the Square Kilometre Array
NASA Astrophysics Data System (ADS)
Jones, D. L.; Mattmann, C. A.; Hart, A. F.; Lazio, J.; Bennett, T.; Wagstaff, K. L.; Thompson, D. R.; Preston, R.
2011-12-01
As the technologies for remote observation improve, the rapid increase in the frequency and fidelity of those observations translates into an avalanche of data that is already beginning to eclipse the resources, both human and technical, of the institutions and facilities charged with managing the information. Common data management tasks, like cataloging both the data itself and contextual metadata, creating and maintaining a scalable permanent archive, and making data available on demand for research, present significant software engineering challenges when considered at the scale of modern multi-national scientific enterprises such as the upcoming Square Kilometre Array project. The NASA Jet Propulsion Laboratory (JPL), leveraging internal research and technology development funding, has begun to explore ways to address the data archiving and distribution challenges through a number of parallel activities involving collaborations with the EVLA and ALMA teams at the National Radio Astronomy Observatory (NRAO) and members of the Square Kilometre Array South Africa team. To date, we have leveraged the Apache OODT Process Control System framework and its catalog and archive service components, which provide file management, workflow management, and resource management as core web services. A client crawler framework ingests upstream data (e.g., EVLA raw directory output), identifies its MIME type, and automatically extracts relevant metadata, including temporal bounds and job-relevant processing information. A remote content acquisition (push-pull) service is responsible for staging remote content and handing it off to the crawler framework. A science algorithm wrapper (called CAS-PGE) wraps underlying code, including CASApy programs for the EVLA such as continuum imaging and spectral line cube generation, executes the algorithm, and ingests its output (along with relevant extracted metadata). In addition to processing, the Process Control System has been leveraged to provide data curation and automatic ingestion for the MeerKAT/KAT-7 precursor instrument in South Africa, helping to catalog and archive correlator and sensor output from KAT-7 and to make the information available for downstream science analysis. These efforts, supported by the increasing availability of high-quality open source software, represent a concerted effort to seek a cost-conscious methodology for maintaining the integrity of observational data from the upstream instrument to the archive, while ensuring that the data, with its richly annotated catalog of metadata, remains a viable resource for research into the future.
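The crawler pattern described above (identify MIME type, extract basic metadata, hand off for ingestion) is language-agnostic; the Apache OODT components are Java, but the same idea can be sketched in a few lines of Python using only the standard library.

```python
import mimetypes
import os
from datetime import datetime, timezone

def extract_metadata(path):
    """Identify the MIME type and pull basic file-level metadata, crawler style."""
    mime, _ = mimetypes.guess_type(path)
    stat = os.stat(path)
    return {
        "file": os.path.basename(path),
        "mime_type": mime or "application/octet-stream",
        "size_bytes": stat.st_size,
        "modified_utc": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
    }

def crawl(staging_dir, ingest):
    """Walk a staging area and hand each file plus its metadata to an ingest callback."""
    for root, _dirs, files in os.walk(staging_dir):
        for name in files:
            path = os.path.join(root, name)
            ingest(path, extract_metadata(path))

if __name__ == "__main__":
    crawl("staging", lambda p, m: print(p, m))  # 'staging' is an illustrative directory
```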
Archival Information Management System.
1995-02-01
management system named Archival Information Management System (AIMS), designed to meet the audit trail requirement for studies completed under the...are to be archived to the extent that future reproducibility and interrogation of results will exist. This report presents a prototype information
Computers in imaging and health care: now and in the future.
Arenson, R L; Andriole, K P; Avrin, D E; Gould, R G
2000-11-01
Early picture archiving and communication systems (PACS) were characterized by the use of very expensive hardware devices, cumbersome display stations, duplication of database content, lack of interfaces to other clinical information systems, and immaturity in their understanding of the folder manager concepts and workflow reengineering. They were implemented historically at large academic medical centers by biomedical engineers and imaging informaticists. PACS were nonstandard, home-grown projects with mixed clinical acceptance. However, they clearly showed the great potential for PACS and filmless medical imaging. Filmless radiology is a reality today. The advent of efficient softcopy display of images provides a means for dealing with the ever-increasing number of studies and number of images per study. Computer power has increased, and archival storage cost has decreased to the extent that the economics of PACS is justifiable with respect to film. Network bandwidths have increased to allow large studies of many megabytes to arrive at display stations within seconds of examination completion. PACS vendors have recognized the need for efficient workflow and have built systems with intelligence in the management of patient data. Close integration with the hospital information system (HIS)-radiology information system (RIS) is critical for system functionality. Successful implementation of PACS requires integration or interoperation with hospital and radiology information systems. Besides the economic advantages, secure rapid access to all clinical information on patients, including imaging studies, anytime and anywhere, enhances the quality of patient care, although this benefit is difficult to quantify. Medical image management systems are maturing, providing access outside of the radiology department to images and clinical information throughout the hospital or the enterprise via the Internet. Small and medium-sized community hospitals, private practices, and outpatient centers in rural areas will begin realizing the benefits of PACS already realized by the large tertiary care academic medical centers and research institutions. Hand-held devices and the World Wide Web are going to change the way people communicate and do business. The impact on health care, including radiology, will be huge. Computer-aided diagnosis, decision support tools, virtual imaging, and guidance systems will transform our practice as value-added applications utilizing the technologies pushed by PACS development efforts. Outcomes data and the electronic medical record (EMR) will drive our interactions with referring physicians, and we expect the radiologist to become the informaticist, a new version of the medical management consultant.
Fault-tolerant back-up archive using an ASP model for disaster recovery
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Huang, H. K.; Cao, Fei; Documet, Luis; Sarti, Dennis A.
2002-05-01
A single point of failure in PACS during a disaster scenario is the main archive storage and server. When a major disaster occurs, it is possible to lose an entire hospital's PACS data. Few current PACS archives feature disaster recovery, and where they do, the design is limited at best. Drawbacks include the frequency with which the back-up is physically removed to an offsite facility, the operational costs associated with maintaining the back-up, the ease of use in performing the backup consistently and efficiently, and the ease of use in performing PACS image data recovery. This paper describes a novel approach towards a fault-tolerant solution for disaster recovery of short-term PACS image data using an Application Service Provider (ASP) model for service. The ASP back-up archive provides instantaneous, automatic backup of acquired PACS image data and instantaneous recovery of stored PACS image data, all at a low operational cost. A back-up archive server and RAID storage device are implemented offsite from the main PACS archive location. In the example of this particular hospital, it was determined that at least two months' worth of PACS image exams were needed for back-up. Clinical data from the hospital PACS are sent to this ASP storage server in parallel with the exams being archived in the main server. A disaster scenario was simulated and the PACS exams were sent from the offsite ASP storage server back to the hospital PACS. Initially, connectivity between the main archive and the ASP storage server is established via a T-1 connection. In the future, other more cost-effective means of connectivity, such as Internet2, will be researched. A disaster scenario was initiated, and the disaster recovery process using the ASP back-up archive server was successful in repopulating the clinical PACS within a short period of time. The ASP back-up archive was able to recover two months of PACS image data for comparison studies with no complex operational procedures. Furthermore, no image data loss was encountered during the recovery.
Development of a system for transferring images via a network: supporting a regional liaison.
Mihara, Naoki; Manabe, Shiro; Takeda, Toshihiro; Shinichirou, Kitamura; Junichi, Murakami; Kouji, Kiso; Matsumura, Yasushi
2013-01-01
We developed a system that transfers images via a network and began using it with our hospital's PACS (Picture Archiving and Communication Systems) in 2006. The system has since been redeveloped and is now running so that it can support a regional liaison in the future. It has become possible to automatically transfer images simply by selecting a destination hospital that is registered in advance at the relay server. The gateway of this system can send images to a multi-center relay management server, which receives the images and resends them. This system has the potential to be useful for image exchange and to serve as a regional medical liaison.
Optimisation of solar synoptic observations
NASA Astrophysics Data System (ADS)
Klvaña, Miroslav; Sobotka, Michal; Švanda, Michal
2012-09-01
The development of instrumental and computer technologies is connected with steadily increasing needs for archiving large data volumes. The current trend to meet this requirement includes data compression and growth of storage capacities. This approach, however, has technical and practical limits. A further reduction of the archived data volume can be achieved by means of an optimisation of the archiving that consists of data selection without losing the useful information. We describe a method of optimised archiving of solar images, based on the selection of images that contain new information. The new information content is evaluated by means of the analysis of changes detected in the images. We present characteristics of different kinds of image changes and divide them into fictitious changes, which have a disturbing effect, and real changes, which provide new information. In block diagrams describing the selection and archiving, we demonstrate the influence of clouds, the recording of images during an active event on the Sun (including a period before the event onset), and the archiving of the long-term history of solar activity. The described optimisation technique is not suitable for helioseismology, because it does not preserve a uniform time step in the archived sequence and removes the information about solar oscillations. In the case of long-term synoptic observations, optimised archiving can save a large amount of storage capacity. The actual capacity saving will depend on the setting of the change-detection sensitivity and on the capability to exclude the fictitious changes.
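A minimal sketch of the selection criterion, assuming the frames are available as NumPy arrays, is shown below: a frame is archived only when it differs sufficiently from the last archived frame. The mean-absolute-difference metric and the threshold value are illustrative assumptions; the paper's method additionally rejects fictitious changes such as clouds.

```python
import numpy as np

def has_new_information(previous, current, threshold=5.0):
    """Return True when the mean absolute pixel difference exceeds the threshold.

    `threshold` is an illustrative sensitivity setting; the real selection also
    filters out 'fictitious' changes (e.g., passing clouds) before archiving.
    """
    diff = np.abs(current.astype(np.float64) - previous.astype(np.float64))
    return float(diff.mean()) > threshold

def select_for_archive(frames, threshold=5.0):
    """Keep the first frame and every frame that differs enough from the last kept one."""
    kept = [frames[0]]
    for frame in frames[1:]:
        if has_new_information(kept[-1], frame, threshold):
            kept.append(frame)
    return kept
```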
Imaging and Data Acquisition in Clinical Trials for Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
FitzGerald, Thomas J., E-mail: Thomas.Fitzgerald@umassmed.edu; Bishop-Jodoin, Maryann; Followill, David S.
2016-02-01
Cancer treatment evolves through oncology clinical trials. Cancer trials are multimodal and complex. Assuring that high-quality data are available to answer not only study objectives but also questions not anticipated at study initiation is the role of quality assurance. The National Cancer Institute reorganized its cancer clinical trials program in 2014. The National Clinical Trials Network (NCTN) was formed, and within it was established a diagnostic imaging and radiation therapy quality assurance organization. This organization, the Imaging and Radiation Oncology Core Group, consists of 6 quality assurance centers that provide imaging and radiation therapy quality assurance for the NCTN. Sophisticated imaging is used for cancer diagnosis, treatment, and management, as well as for image-driven technologies to plan and execute radiation treatment. Integration of imaging and radiation oncology data acquisition, review, management, and archive strategies is essential for trial compliance and future research. Lessons learned from previous trials provide evidence to support diagnostic imaging and radiation therapy data acquisition in NCTN trials.
NASA Astrophysics Data System (ADS)
Conway, Esther; Waterfall, Alison; Pepler, Sam; Newey, Charles
2015-04-01
In this paper we describe a business process modelling approach to the integration of existing archival activities. We provide a high-level overview of existing practice and discuss how procedures can be extended and supported through the description of preservation state, the aim of which is to facilitate the dynamic, controlled management of scientific data through its lifecycle. The main types of archival processes considered are: • Management processes that govern the operation of an archive. These include archival governance (preservation state management, selection of archival candidates, and strategic management). • Operational processes that constitute the core activities of the archive and maintain the value of research assets. These are the acquisition, ingestion, deletion, metadata generation, and preservation activities. • Supporting processes, which include planning, risk analysis, and monitoring of the community/preservation environment. We then describe the feasibility testing of extended risk management and planning procedures that integrate current practices. This was done through the CEDA Archival Format Audit, which inspected the British Atmospheric Data Centre and National Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 PB of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We conclude by presenting a dynamic data management information model that is capable of describing the preservation state of archival holdings throughout the data lifecycle. We discuss the following core model entities and their relationships: • Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives. • Risk entities, which act as drivers for change within the data lifecycle; these include Acquisitional Risks, Technical Risks, Strategic Risks, and External Risks. • Plan entities, which detail the actions to bring about change within an archive; these include Acquisition Plans, Preservation Plans, and Monitoring Plans. • Result entities, which describe the successful outcomes of the executed plans; these include Acquisitions, Mitigations, and Accepted Risks.
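The core entities of the information model can be sketched as simple data classes to make their relationships concrete. Field names and types below are assumptions for illustration; they are not the CEDA model's actual schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PreservationObjective:
    description: str              # e.g. "keep data readable in an open format"

@dataclass
class DataEntity:
    name: str
    objectives: List[PreservationObjective] = field(default_factory=list)

@dataclass
class Risk:                       # acquisitional, technical, strategic, or external
    category: str
    description: str
    affects: DataEntity

@dataclass
class Plan:                       # acquisition, preservation, or monitoring plan
    kind: str
    mitigates: Risk
    actions: List[str] = field(default_factory=list)

@dataclass
class Result:                     # acquisition, mitigation, or accepted risk
    outcome: str
    from_plan: Plan
```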
The NAS Computational Aerosciences Archive
NASA Technical Reports Server (NTRS)
Miceli, Kristina D.; Globus, Al; Lasinski, T. A. (Technical Monitor)
1995-01-01
In order to further the state of the art in computational aerosciences (CAS) technology, researchers must be able to gather and understand existing work in the field. One aspect of this information gathering is studying published work available in scientific journals and conference proceedings. However, current scientific publications are very limited in the type and amount of information that they can disseminate. Information is typically restricted to text, a few images, and a bibliography list. Additional information that might be useful to the researcher, such as additional visual results, referenced papers, and datasets, is not available. New forms of electronic publication, such as the World Wide Web (WWW), limit publication size only by available disk space and data transmission bandwidth, both of which are improving rapidly. The Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is in the process of creating an archive of CAS information on the WWW. This archive will be based on the large amount of information produced by researchers associated with the NAS facility. The archive will contain technical summaries and reports of research performed on NAS supercomputers, visual results (images, animations, visualization system scripts), datasets, and any other supporting meta-information. This information will be available via the WWW through the NAS homepage, located at http://www.nas.nasa.gov/, fully indexed for searching. The main components of the archive are technical summaries and reports, visual results, and datasets. Technical summaries are gathered every year from researchers who have been allotted resources on NAS supercomputers. These summaries, together with supporting visual results and references, are browsable by interested researchers. Referenced papers made available by researchers can be accessed through hypertext links. Technical reports are in-depth accounts of tools and applications research projects performed by NAS staff members and collaborators. Visual results, which may be available in the form of images, animations, and/or visualization scripts, are generated by researchers with respect to a certain research project, depicting dataset features that the investigating researcher determined to be important. For example, script files for visualization systems (e.g., FAST, PLOT3D, AVS) are provided to create visualizations on the user's local workstation to elucidate the key points of the numerical study. Users can then interact with the data, starting where the investigator left off. Datasets are intended to give researchers an opportunity to understand previous work, 'mine' solutions for new information (for example, have you ever read a paper thinking "I wonder what the helicity density looks like?"), compare new techniques with older results, collaborate with remote colleagues, and perform validation. Supporting meta-information associated with the research projects is also important to provide additional context; this may include information such as the software used in the simulation (e.g., grid generators, flow solvers, visualization). In addition to serving the CAS research community, the information archive will also be helpful to students, visualization system developers and researchers, and management. Students (of any age) can use the data to study fluid dynamics, compare results from different flow solvers, learn about meshing techniques, etc., leading to better-informed individuals. For these users it is particularly important that visualization be integrated into dataset archives. Visualization researchers can use dataset archives to test algorithms and techniques, leading to better visualization systems. Management can use the data to figure out what is really going on behind the viewgraphs. All users will benefit from fast, easy, and convenient access to CFD datasets. The CAS information archive hopes to serve as a useful resource to those interested in the computational sciences. At present, only information that may be distributed internationally is made available via the archive. Studies are underway to determine security requirements and solutions to make additional information available. By providing access to the archive via the WWW, the process of information gathering can be more productive and fruitful due to ease of access and the ability to manage many different types of information. As the archive grows, additional resources from outside NAS will be added, providing a dynamic source of research results.
77 FR 20104 - Privacy Act of 1974, as Amended; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-03
... Archives and Records Administration regulations. System Manager and Address: Director, Enforcement and... retained in accordance with the OCC's records management policies and National Archives and Records Administration...
Using landsat time-series and lidar to inform aboveground carbon baseline estimation in Minnesota
Ram K. Deo; Grant M. Domke; Matthew B. Russell; Christopher W. Woodall; Michael J. Falkowski
2015-01-01
Landsat data has long been used to support forest monitoring and management decisions despite the limited success of passive optical remote sensing for accurate estimation of structural attributes such as aboveground biomass. The archive of publicly available Landsat images dating back to the 1970s can be used to predict historic forest biomass dynamics. In addition,...
Picture archiving and communication in radiology.
Napoli, Marzia; Nanni, Marinella; Cimarra, Stefania; Crisafulli, Letizia; Campioni, Paolo; Marano, Pasquale
2003-01-01
After over 80 years of exclusive archiving of radiologic films, digital archiving is now increasingly gaining ground in radiology. Digital archiving allows a considerable reduction in costs and saves space, but most importantly, it makes immediate or remote consultation of all examinations and reports feasible in the hospital's clinical wards. The RIS, in this case, is the starting point of the process of electronic archiving, which however is the task of the PACS. The latter can be used as a legally valid radiologic archive provided that it conforms to certain specifications, such as the use of optical long-term storage media or media with an electronic track of changes. The PACS archives, in a hierarchical system, all digital images produced by each diagnostic imaging modality. Images and patient data can be retrieved and used for consultation or remote consultation by the reporting radiologist, who requires images and reports of previous radiologic examinations, or by the referring physician of the ward. Modern PACS, owing to their Web servers, allow greatly simplified remote access to images and data while ensuring the due regulations and access protections. Since the PACS enables simpler data communication within the hospital, security and patient privacy must be protected. A secure and reliable PACS should be able to minimize the risk of accidental data destruction and should prevent unauthorized access to the archive with security measures that are adequate in relation to the acquired knowledge and the technological advances. Archiving the data produced by modern digital imaging is a problem now present even in small radiology services. The technology can readily solve problems that were extremely complex up to a few years ago, such as the connection between equipment and the archiving system, owing also to the universal adoption of the DICOM 3.0 standard. The evolution of communication networks and the use of standard protocols such as TCP/IP can minimize problems of remote transmission of data and images within the healthcare enterprise as well as across the territory. However, new problems are appearing, such as that of digital data security profiles and of the different systems that should ensure them. Among these, algorithms for electronic signatures should be mentioned; in Italy they are validated by law and can therefore be used in digital archives in accordance with the law.
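As a small illustration of the integrity side of this discussion, the sketch below computes a SHA-256 fingerprint and a keyed HMAC tag for an archived image file. This is only a tamper-evidence aid for internal bookkeeping, not a qualified electronic signature of the kind validated by Italian law, which relies on certified public-key infrastructure.

```python
import hashlib
import hmac

def file_digest(path, chunk_size=1 << 20):
    """SHA-256 fingerprint of an archived image file, computed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def integrity_tag(path, secret_key: bytes):
    """Keyed tag over the digest; NOT a qualified electronic signature, only a
    tamper-evidence aid for the archive's own bookkeeping."""
    return hmac.new(secret_key, file_digest(path).encode(), hashlib.sha256).hexdigest()

if __name__ == "__main__":
    print(integrity_tag("study.dcm", b"archive-secret"))  # hypothetical file and key
```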
NASA Astrophysics Data System (ADS)
Verma, R. V.
2018-04-01
The Archive Inventory Management System (AIMS) is a software package for understanding the distribution, characteristics, integrity, and nuances of files and directories in large file-based data archives on a continuous basis.
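The kind of continuous inventory pass AIMS implies can be sketched as a walk over the archive that records per-file size, extension, and modification time and summarises the extension mix. The CSV layout and field names are assumptions, not AIMS output formats.

```python
import csv
import os
from collections import Counter

def inventory(archive_root, out_csv="inventory.csv"):
    """One inventory pass: per-file records plus a summary of the extension mix."""
    extension_counts = Counter()
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["path", "extension", "size_bytes", "mtime"])
        for root, _dirs, files in os.walk(archive_root):
            for name in files:
                path = os.path.join(root, name)
                ext = os.path.splitext(name)[1].lower()
                stat = os.stat(path)
                extension_counts[ext] += 1
                writer.writerow([path, ext, stat.st_size, int(stat.st_mtime)])
    return extension_counts

if __name__ == "__main__":
    print(inventory("archive"))  # 'archive' is an illustrative root directory
```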
The European HST Science Data Archive. [and Data Management Facility (DMF)
NASA Technical Reports Server (NTRS)
Pasian, F.; Pirenne, B.; Albrecht, R.; Russo, G.
1993-01-01
The paper describes the European HST Science Data Archive. Particular attention is given to the flow from the HST spacecraft to the Science Data Archive at the Space Telescope European Coordinating Facility (ST-ECF); the archiving system at the ST-ECF, including the hardware and software system structure; the operations at the ST-ECF and differences with the Data Management Facility; and the current developments. A diagram of the logical structure and data flow of the system managing the European HST Science Data Archive is included.
NASA Astrophysics Data System (ADS)
Schneider, Uwe; Strack, Ruediger
1992-04-01
apART reflects the structure of an open, distributed environment. In line with the general trend in imaging, network-capable, general-purpose workstations with open-system image communication and image input capabilities are used. Several heterogeneous components, such as CCD cameras, slide scanners, and image archives, can be accessed. The system is driven by an object-oriented user interface in which devices (image sources and destinations), operators (derived from a commercial image processing library), and images (of different data types) are managed and presented uniformly to the user. Browsing mechanisms are used to traverse devices, operators, and images. An audit trail mechanism is offered to record interactive operations on low-resolution image derivatives. These operations are processed off-line on the original image. Thus, the processing of extremely high-resolution raster images is possible, and the performance of resolution-dependent operations is enhanced significantly during interaction. An object-oriented database system (APRIL), which can be browsed, is integrated into the system. Attribute retrieval is supported by the user interface. Other essential features of the system include: implementation on top of the X Window System (X11R4) and the OSF/Motif widget set; a SUN4 general-purpose workstation, including Ethernet, magneto-optical disc, etc., as the hardware platform for the user interface; complete graphical-interactive parametrization of all operators; support of different image interchange formats (GIF, TIFF, IIF, etc.); and consideration of current IPI standard activities within ISO/IEC for further refinement and extensions.
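The audit-trail idea (record interactive edits against a low-resolution derivative, then replay them on the original) can be sketched with NumPy as below. The operation names, parameter conventions, and coordinate scaling are assumptions for illustration; apART's actual operator set comes from its commercial image processing library.

```python
import numpy as np

def apply_op(image, op, params, scale=1.0):
    """Apply one recorded operation, rescaling pixel coordinates by `scale`."""
    if op == "rotate90":
        return np.rot90(image, k=params["k"])
    if op == "crop":
        top, left, height, width = (int(round(v * scale)) for v in
                                    (params["top"], params["left"],
                                     params["height"], params["width"]))
        return image[top:top + height, left:left + width]
    raise ValueError(f"unknown operation: {op}")

def replay(original, preview_shape, audit_trail):
    """Replay preview-resolution edits on the full-resolution original."""
    scale = original.shape[0] / preview_shape[0]   # assumes a uniform downscale
    result = original
    for op, params in audit_trail:
        result = apply_op(result, op, params, scale)
    return result

# Example: operations recorded interactively on a 256x256 preview of a 4096x4096 scan.
trail = [("crop", {"top": 10, "left": 20, "height": 100, "width": 120}),
         ("rotate90", {"k": 1})]
```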
Upper Klamath Basin Landsat Image for October 29, 2006: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for June 23, 2006: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for September 21, 2004: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for July 25, 2006: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for November 8, 2004: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for September 27, 2006: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for August 19, 2006: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for August 4, 2004: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for October 7, 2004: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for July 9, 2006: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for May 6, 2006: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for April 30, 2004: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for June 1, 2004: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for June 17, 2004: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-5 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Astronomical Archive at Tartu Observatory
NASA Astrophysics Data System (ADS)
Annuk, K.
2007-10-01
Archiving astronomical data is an important task not only at large observatories but also at small ones. Here we describe the astronomical archive at Tartu Observatory. The archive consists of old photographic plate images, photographic spectrograms, CCD direct images, and CCD spectroscopic data. The photographic plate digitizing project was started in 2005. An on-line database (based on MySQL) was created; it includes CCD data as well as photographic data. A PHP-MySQL interface was written to provide access to all the data.
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean
2004-04-01
An Application Service Provider (ASP) archive model for disaster recovery of Saint John's Health Center (SJHC) clinical PACS data has been implemented using a fault-tolerant archive server at the Image Processing and Informatics Laboratory, Marina del Rey, CA (IPIL) since mid-2002. The purpose of this paper is to provide clinical experiences with the implementation of an ASP model backup archive in conjunction with handheld wireless technologies for a particular disaster recovery scenario, an earthquake, in which the local PACS archive and the hospital are destroyed and the patients are moved from one hospital to another. The three sites involved are: (1) SJHC, the simulated disaster site; (2) IPIL, the ASP backup archive site; and (3) University of California, Los Angeles Medical Center (UCLA), the relocated patient site. An ASP backup archive has been established at IPIL to receive clinical PACS images daily using a T1 line from SJHC for backup and disaster recovery storage. Procedures were established to test the network connectivity and data integrity on a regular basis. In a given disaster scenario where the local PACS archive has been destroyed and the patients need to be moved to a second hospital, a wireless handheld device such as a Personal Digital Assistant (PDA) can be utilized to route images to the second hospital site with a PACS, where they can be reviewed by radiologists. To simulate this disaster scenario, a wireless network was implemented within the clinical environment at all three sites: SJHC, IPIL, and UCLA. Upon executing the disaster scenario, the SJHC PACS archive server simulates a downtime disaster event. Using the PDA, the radiologist at UCLA can query the ASP backup archive server at IPIL for PACS images and route them directly to UCLA. Implementation experiences integrating this solution within the three clinical environments, as well as the wireless performance, are discussed. A clinical downtime disaster scenario was implemented and successfully tested. Radiologists were able to successfully query PACS images from the ASP backup archive at IPIL utilizing a wireless handheld device and route the PACS images directly to a second clinical site at UCLA, where they and the patients were located at that time. In a disaster scenario, using a wireless device, radiologists at the disaster health care center can route PACS data from an ASP backup archive server to be reviewed in a live clinical PACS environment at a secondary site. This solution allows radiologists to use a wireless handheld device to control the image workflow and to review PACS images during a major disaster event in which patients must be moved to a secondary site.
Teaching Electronic Records Management in the Archival Curriculum
ERIC Educational Resources Information Center
Zhang, Jane
2016-01-01
Electronic records management has been incorporated into the archival curriculum in North America since the 1990s. The study reported in this paper provides a systematic analysis of the content of electronic records management (ERM) courses currently taught in archival education programs. Through the analysis of course combinations and their…
Video and LAN solutions for a digital OR: the Varese experience
NASA Astrophysics Data System (ADS)
Nocco, Umberto; Cocozza, Eugenio; Sivo, Monica; Peta, Giancarlo
2007-03-01
Purpose: to build 20 ORs equipped with independent video acquisition and broadcasting systems and powerful LAN connectivity. Methods: a digital PC-controlled video matrix was installed in each OR. The LAN connectivity was developed to provide data access within the OR and high-speed connectivity to a server and to broadcasting devices. Video signals are broadcast within the OR. Fixed inputs and five additional video inputs have been placed in the OR. Images can be stored locally on a high-capacity HDD and a DVD recorder. Images can also be stored in a central archive for future retrieval and reference. Ethernet plugs have been placed within the OR to acquire images and data from the Hospital LAN; the OR is connected to the server/archive using a dedicated optical fiber. Results: 20 independent digital ORs have been built. Each OR is "self-contained", and images can be digitally managed and broadcast. Security requirements concerning both image visualization and electrical safety have been fulfilled, and each OR is fully integrated into the Hospital LAN. Conclusions: the digital ORs were fully implemented; they fulfill surgeons' needs in terms of video acquisition and distribution and provide high-quality video for every kind of surgery in a major hospital.
PACS archive upgrade and data migration: clinical experiences
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Documet, Luis; Sarti, Dennis A.; Huang, H. K.; Donnelly, John
2002-05-01
Saint John's Health Center PACS data volumes have increased dramatically since the hospital became filmless in April of 1999. This is due in part to continuous image accumulation and the integration of a new multi-slice detector CT scanner into the PACS. The original PACS archive would not be able to handle the distribution and archiving load and capacity in the near future. Furthermore, there was no secondary copy backup of all the archived PACS image data for disaster recovery purposes. The purpose of this paper is to present a clinical and technical process template to upgrade and expand the PACS archive, migrate existing PACS image data to the new archive, and provide a backup and disaster recovery function not previously available. The technical and clinical pitfalls and challenges involved in this process are discussed as well. The server hardware configuration was upgraded and a secondary backup implemented for disaster recovery. The upgrade includes new software versions, database reconfiguration, and installation of a new tape jukebox to replace the existing MOD jukebox. Upon completion, all PACS image data from the original MOD jukebox was migrated to the new tape jukebox and verified. The migration was performed continuously in the background during clinical operation. Once the data migration was completed, the MOD jukebox was removed. All newly acquired PACS exams are now archived to the new tape jukebox. All PACS image data residing on the original MOD jukebox have been successfully migrated into the new archive. In addition, a secondary backup of all PACS image data has been implemented for disaster recovery and has been verified using disaster scenario testing. No PACS image data was lost during the entire process, and there was very little clinical impact during the upgrade and data migration. Some of the pitfalls and challenges during this upgrade process included hardware reconfiguration for the original archive server, clinical downtime involved with the upgrade, and data migration planning to minimize impact on clinical workflow. The impact was minimized with a downtime contingency plan.
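The migration described above ran continuously in the background during clinical operation, with every migrated study verified. The sketch below is a minimal, hypothetical illustration of that pattern in Python: copy each file from the old archive mount to the new one, verify it with a checksum, and throttle the loop so clinical traffic is not disturbed. The paths, file extension, and pacing are assumptions, not details of the system described in the paper.

import hashlib
import shutil
import time
from pathlib import Path

def md5(path: Path) -> str:
    """Checksum used to verify each file after it is copied."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def migrate(old_archive: Path, new_archive: Path, pause_s: float = 0.5) -> None:
    """Copy every image file from the old jukebox mount to the new one,
    verifying each copy and pausing between files so the migration can run
    in the background during clinical operation."""
    for src in sorted(old_archive.rglob("*.dcm")):
        dst = new_archive / src.relative_to(old_archive)
        dst.parent.mkdir(parents=True, exist_ok=True)
        if dst.exists() and md5(dst) == md5(src):
            continue                      # already migrated and verified
        shutil.copy2(src, dst)
        if md5(dst) != md5(src):
            raise IOError(f"verification failed for {src}")
        time.sleep(pause_s)               # throttle to limit clinical impact

# Usage (paths are hypothetical mount points):
# migrate(Path("/mnt/mod_jukebox"), Path("/mnt/tape_cache"))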
Using an image-extended relational database to support content-based image retrieval in a PACS.
Traina, Caetano; Traina, Agma J M; Araújo, Myrian R B; Bueno, Josiane M; Chino, Fabio J T; Razente, Humberto; Azevedo-Marques, Paulo M
2005-12-01
This paper presents a new Picture Archiving and Communication System (PACS), called cbPACS, which has content-based image retrieval capabilities. The cbPACS answers range and k-nearest-neighbor similarity queries, employing a relational database manager extended to support images. The images are compared through their features, which are extracted by an image-processing module and stored in the extended relational database. The database extensions were developed to answer similarity queries efficiently by taking advantage of specialized indexing methods. The main concept supporting the extensions is the definition, inside the relational manager, of distance functions based on features extracted from the images. An extension to the SQL language enables the construction of an interpreter that intercepts the extended commands and translates them to standard SQL, allowing any relational database server to be used. Currently, the system works on features based on the color distribution of the images, through normalized histograms as well as metric histograms. Metric histograms are invariant to scale, translation, and rotation of images, as well as to brightness transformations. The cbPACS is prepared to integrate new image features based on the texture and shape of the main objects in the image.
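To illustrate the kind of distance-function-based similarity query that the extended SQL is translated into, here is a small Python sketch of a k-nearest-neighbor search over normalized histograms using an L1 distance. It is an illustration of the general technique, not the cbPACS implementation; the feature choice and distance are simplifications of the metric histograms described in the paper.

import numpy as np

def normalized_histogram(image: np.ndarray, bins: int = 16) -> np.ndarray:
    """Normalized gray-level histogram used as the image feature vector."""
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    return hist / hist.sum()

def knn_query(query_feat: np.ndarray, archive: dict, k: int = 3) -> list:
    """k-nearest-neighbor similarity query: rank stored feature vectors by
    L1 (Manhattan) distance to the query features, smallest first."""
    scored = [(np.abs(feat - query_feat).sum(), image_id)
              for image_id, feat in archive.items()]
    return sorted(scored)[:k]

# Toy archive of pre-extracted features keyed by image identifier.
rng = np.random.default_rng(0)
archive = {f"img{i:03d}": normalized_histogram(rng.integers(0, 256, (64, 64)))
           for i in range(20)}
query = normalized_histogram(rng.integers(0, 256, (64, 64)))
for dist, image_id in knn_query(query, archive):
    print(f"{image_id}: distance {dist:.4f}")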
NASA Astrophysics Data System (ADS)
Angelhed, Jan-Erik; Carlsson, Goeran; Gustavsson, Staffan; Karlsson, Anders; Larsson, Lars E. G.; Svensson, Sune; Tylen, Ulf
1998-07-01
An Image Management And Communication (IMAC) system adapted to the X-ray department at Sahlgrenska University Hospital has been developed using standard components. Two user demands have been considered primary: rapid access to (display of) images and efficient worklist management. To fulfil these demands, a connection between the IMAC system and the existing Radiological Information System (RIS) has been implemented. The functional modules are: a check of information consistency in data exported from image sources, a (logically) central storage of image data, a viewing facility for high-speed, large-volume clinical work, and an efficient interface to the RIS. Also, an image-related database extension has been made to the RIS. The IMAC system has a strictly modular design with a simple structure. The image archive and short-term storage are logically the same and act as one huge disk. Through NFS, all image data is available to all the connected workstations. All patient selection for viewing is through worklists, which are created by selection criteria in the RIS, by the use of barcodes, or, in individual cases, by entering the patient ID by hand.
CruiseViewer: SIOExplorer Graphical Interface to Metadata and Archives.
NASA Astrophysics Data System (ADS)
Sutton, D. W.; Helly, J. J.; Miller, S. P.; Chase, A.; Clark, D.
2002-12-01
We are introducing "CruiseViewer" as a prototype graphical interface for the SIOExplorer digital library project, part of the overall NSF National Science Digital Library (NSDL) effort. When complete, CruiseViewer will provide access to nearly 800 cruises, as well as 100 years of documents and images from the archives of the Scripps Institution of Oceanography (SIO). The project emphasizes data object accessibility, a rich metadata format, efficient uploading methods, and interoperability with other digital libraries. The primary function of CruiseViewer is to provide a human interface to the metadata database and to storage systems filled with archival data. The system schema is based on the concept of an "arbitrary digital object" (ADO): arbitrary in the sense that if the object can be stored on a computer system, then SIOExplorer can manage it. Common examples are a multibeam swath bathymetry file, a .pdf cruise report, or a tar file containing all the processing scripts used on a cruise. We require a metadata file for every ADO in an ASCII "metadata interchange format" (MIF), which has proven to be highly useful for operability and extensibility. Bulk ADO storage is managed using the Storage Resource Broker (SRB), data-handling middleware developed at the San Diego Supercomputer Center that centralizes management of and access to distributed storage devices. MIF metadata are harvested from several sources and housed in a relational (Oracle) database. For CruiseViewer, CGI scripts resident on an Apache server are the primary communication and service-request handling tools. Along with the CruiseViewer Java application, users can query, access, and download objects via a separate method that operates through standard web browsers, http://sioexplorer.ucsd.edu. Both provide the functionality to query and view object metadata and to select and download ADOs. For the CruiseViewer application, Java 2D is used to add a geo-referencing feature that allows users to select basemap images and have vector shapes representing query results mapped over the basemap in the image panel. The two methods together address a wide range of user access needs and will allow for widespread use of SIOExplorer.
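Since every ADO must carry an ASCII MIF metadata file, a minimal parser conveys how such plain-text metadata can be harvested into a database. The sketch below is hypothetical: the field names and the simple key-value layout are invented for illustration and do not reproduce the actual MIF specification.

# A hypothetical MIF-style record: plain ASCII, one "key: value" pair per
# line, with '#' comments. The field names below are illustrative only.
sample_mif = """\
# metadata interchange format (illustrative)
object_id: NBP9901_multibeam_001
cruise: NBP9901
type: multibeam swath bathymetry
format: mb-system
size_bytes: 73400320
storage_url: srb://sioexplorer/NBP9901/mb/leg1.mb41
"""

def parse_mif(text: str) -> dict:
    """Parse key-value MIF-style text into a dictionary, ignoring comments."""
    record = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(":")
        record[key.strip()] = value.strip()
    return record

metadata = parse_mif(sample_mif)
print(metadata["object_id"], metadata["type"])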
Problem of data quality and the limitations of the infrastructure approach
NASA Astrophysics Data System (ADS)
Behlen, Fred M.; Sayre, Richard E.; Rackus, Edward; Ye, Dingzhong
1998-07-01
The 'Infrastructure Approach' is a PACS implementation methodology wherein the archive, network, and information systems interfaces are acquired first, and workstations are installed later. The approach allows building a history of archived image data, so that most prior examinations are available in digital form when workstations are deployed. A limitation of the Infrastructure Approach is that the deferred use of digital image data defeats many data quality management functions that are provided automatically by human mechanisms when data is immediately used for the completion of clinical tasks. If the digital data is used solely for archiving while reports are interpreted from film, the radiologist serves only as a check against lost films, and another person must be designated as responsible for the quality of the digital data. Data from the Radiology Information System and the PACS were analyzed to assess the nature and frequency of system and data quality errors. The error level was found to be acceptable if supported by auditing and error resolution procedures requiring additional staff time, and in any case was better than the loss rate of a hardcopy film archive. It is concluded that the problem of data quality compromises, but does not negate, the value of the Infrastructure Approach. The Infrastructure Approach is best employed only to a limited extent: any phased PACS implementation should have a substantial complement of workstations dedicated to softcopy interpretation for at least some applications, with full deployment following not long thereafter.
Remote sensing data acquisition, analysis and archival. Volume 1. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stringer, W.J.; Dean, K.G.; Groves, J.E.
1993-03-25
The project specialized in the acquisition and dissemination of satellite imagery and its utilization for case-specific and statistical analyses of offshore environmental conditions, particularly those involving sea ice. During the course of this contract, 854 Landsat Multispectral Scanner and 2 Landsat Thematic Mapper scenes, 8,576 Advanced Very High Resolution Radiometer images, and 31,000 European Earth Resources Satellite Synthetic Aperture Radar images were archived. Direct assistance was provided to eight Minerals Management Service (MMS)-sponsored studies, including analyses of Port Moller circulation; Bowhead whale migration, distribution, population, and behavioral studies; Beaufort Sea fisheries; oil spill trajectory model development; and Kasegaluk Lagoon environmental assessments. In addition, under this Cooperative Agreement several complete studies were undertaken based on analysis of satellite imagery. The topics included: Kasegaluk Lagoon transport, the effect of winter storms on arctic ice, the relationship between ice surface temperatures as measured by buoys and passive microwave imagery, unusual cloud forms following lead-openings, and analyses of Chukchi and Bering sea polynyas.
An implementation of wireless medical image transmission system on mobile devices.
Lee, SangBock; Lee, Taesoo; Jin, Gyehwan; Hong, Juhyun
2008-12-01
Advances in computing technology have been followed by rapid improvements in medical instrumentation and patient record management systems. Typical examples are the hospital information system (HIS) and the picture archiving and communication system (PACS), which computerized the management of medical records and images in hospitals. Because these systems are built and used within hospitals, doctors outside the hospital have difficulty accessing them immediately in emergent cases. To address this problem, this paper describes the realization of a system that transmits images acquired by medical imaging systems in the hospital to remote doctors' handheld PDAs over a CDMA cellular phone network. The system consists of a server and a PDA. The server was developed to manage the accounts of doctors and patients and to allocate patient images to each doctor. The PDA was developed to display patient images through a remote server connection. To authenticate the user, the remote data access (RDA) method was used when the PDA accesses the server database, and the file transfer protocol (FTP) was used to download patient images from the remote server. In laboratory experiments, transmitting thirty images, each with 832 x 488 resolution, 24-bit depth, and 0.37 Mb size, took ninety seconds. This result shows that the developed system allows remote doctors to receive and review patient images immediately in emergent cases.
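The PDA client in the paper downloads patient images from the server over FTP. The following Python sketch shows the corresponding client-side transfer using the standard library's ftplib; the host name, credentials, and directory layout are placeholders rather than details of the described system, which also performs RDA-based authentication not shown here.

from ftplib import FTP
from pathlib import Path

def download_patient_images(host: str, user: str, password: str,
                            remote_dir: str, local_dir: str) -> None:
    """Fetch all image files from one patient directory on the image
    server over FTP, as the handheld client in the paper does."""
    local = Path(local_dir)
    local.mkdir(parents=True, exist_ok=True)
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        ftp.cwd(remote_dir)
        for name in ftp.nlst():
            with (local / name).open("wb") as f:
                ftp.retrbinary(f"RETR {name}", f.write)

# Usage (all values are placeholders, not the system described in the paper):
# download_patient_images("imgserver.example.org", "doctor01", "secret",
#                         "/patients/12345", "./downloads/12345")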
PDS Archive Release of Apollo 11, Apollo 12, and Apollo 17 Lunar Rock Sample Images
NASA Technical Reports Server (NTRS)
Garcia, P. A.; Stefanov, W. L.; Lofgren, G. E.; Todd, N. S.; Gaddis, L. R.
2013-01-01
Scientists at the Johnson Space Center (JSC) Lunar Sample Laboratory, Information Resources Directorate, and Image Science & Analysis Laboratory have been working to digitize (scan) the original film negatives of Apollo Lunar Rock Sample photographs [1, 2]. The rock samples, and associated regolith and lunar core samples, were obtained during the Apollo 11, 12, 14, 15, 16 and 17 missions. The images allow scientists to view the individual rock samples in their original or subdivided state prior to requesting physical samples for their research. In cases where access to the actual physical samples is not practical, the images provide an alternate mechanism for study of the subject samples. As the negatives are being scanned, they have been formatted and documented for permanent archive in the NASA Planetary Data System (PDS). The Astromaterials Research and Exploration Science Directorate (which includes the Lunar Sample Laboratory and Image Science & Analysis Laboratory) at JSC is working collaboratively with the Imaging Node of the PDS on the archiving of these valuable data. The PDS Imaging Node is now pleased to announce the release of the image archives for Apollo missions 11, 12, and 17.
Content-based retrieval of historical Ottoman documents stored as textual images.
Saykol, Ediz; Sinop, Ali Kemal; Güdükbay, Ugur; Ulusoy, Ozgür; Cetin, A Enis
2004-03-01
There is an accelerating demand to access the visual content of documents stored in historical and cultural archives. The availability of electronic imaging tools and effective image processing techniques makes it feasible to process the multimedia data in large databases. In this paper, a framework for content-based retrieval of historical documents in the Ottoman Empire archives is presented. The documents are stored as textual images, which are compressed by constructing a library of symbols occurring in a document; the symbols in the original image are then replaced with pointers into the codebook to obtain a compressed representation of the image. Features in the wavelet and spatial domains, based on the angular and distance span of shapes, are used to extract the symbols. To perform content-based retrieval in historical archives, a query is specified as a rectangular region in an input image, and the same symbol-extraction process is applied to the query region. The queries are processed against the codebook of documents, and the query images are identified in the resulting documents using the pointers in the textual images. The querying process does not require decompression of the images. The new content-based retrieval framework is also applicable to many other document archives using different scripts.
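A small sketch can make the codebook idea concrete: distinct symbols are stored once in a library, occurrences in the page are replaced by pointers, and a query symbol is matched against the library alone, so the page never has to be decompressed. The Python below is purely illustrative; exact byte equality stands in for the wavelet- and shape-based matching used in the paper.

import numpy as np

def build_codebook(symbols: list) -> tuple:
    """Replace each symbol bitmap with a pointer into a library of distinct
    symbols. Exact byte equality stands in for the feature-based matching
    used in the paper to decide that two symbols are the same."""
    codebook, pointers, index = [], [], {}
    for sym in symbols:
        key = sym.tobytes()
        if key not in index:
            index[key] = len(codebook)
            codebook.append(sym)
        pointers.append(index[key])
    return codebook, pointers

def query(codebook: list, pointers: list, query_symbol: np.ndarray) -> list:
    """Find where a query symbol occurs in the compressed page by matching
    it against the codebook only; the page itself is never decompressed."""
    key = query_symbol.tobytes()
    matches = [i for i, sym in enumerate(codebook) if sym.tobytes() == key]
    return [pos for pos, ptr in enumerate(pointers) if ptr in matches]

# Toy page: three occurrences of two distinct 2x2 binary "symbols".
a = np.array([[1, 0], [0, 1]], dtype=np.uint8)
b = np.array([[1, 1], [0, 0]], dtype=np.uint8)
cb, ptrs = build_codebook([a, b, a])
print(len(cb), query(cb, ptrs, a))   # 2 distinct symbols; 'a' at positions 0 and 2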
Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A
2016-01-01
The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining portable batch scripting (PBS) grids and XNAT servers.
The Orthanc Ecosystem for Medical Imaging.
Jodogne, Sébastien
2018-05-03
This paper reviews the components of Orthanc, a free and open-source, highly versatile ecosystem for medical imaging. At the core of the Orthanc ecosystem, the Orthanc server is a lightweight vendor neutral archive that provides PACS managers with a powerful environment to automate and optimize the imaging flows that are very specific to each hospital. The Orthanc server can be extended with plugins that provide solutions for teleradiology, digital pathology, or enterprise-ready databases. It is shown how software developers and research engineers can easily develop external software or Web portals dealing with medical images, with minimal knowledge of the DICOM standard, thanks to the advanced programming interface of the Orthanc server. The paper concludes by introducing the Stone of Orthanc, an innovative toolkit for the cross-platform rendering of medical images.
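Orthanc's programming interface is exposed over HTTP/REST, so external software can talk to the archive with ordinary web requests. The sketch below, using Python's requests library, uploads a DICOM file and lists the studies in the archive; the base URL, default port 8042, credentials, and the example.dcm file are assumptions about a local test deployment rather than any specific installation.

import requests

# Assumptions: a local Orthanc test server on its default port 8042 with
# default HTTP credentials; adjust to the actual deployment.
BASE = "http://localhost:8042"
AUTH = ("orthanc", "orthanc")

# Upload a DICOM file: Orthanc accepts the raw DICOM body POSTed to /instances.
with open("example.dcm", "rb") as f:
    r = requests.post(f"{BASE}/instances", data=f.read(), auth=AUTH)
r.raise_for_status()
instance_id = r.json()["ID"]

# List the studies known to the archive and fetch a PNG preview of the
# uploaded instance.
studies = requests.get(f"{BASE}/studies", auth=AUTH).json()
print(f"{len(studies)} studies in the archive")
preview = requests.get(f"{BASE}/instances/{instance_id}/preview", auth=AUTH)
with open("preview.png", "wb") as out:
    out.write(preview.content)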
The Next Landsat Satellite: The Landsat Data Continuity Mission
NASA Technical Reports Server (NTRS)
Irons, James R.; Dwyer, John L.; Barsi, Julia A.
2012-01-01
The Landsat program is one of the longest running satellite programs for Earth observations from space. The program was initiated by the launch of Landsat 1 in 1972. Since then, a series of six more Landsat satellites has been launched, and at least one of those satellites has been in operation at all times to continuously collect images of the global land surface. The Department of Interior (DOI) U.S. Geological Survey (USGS) preserves data collected by all of the Landsat satellites at their Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. This 40-year data archive provides an unmatched record of the Earth's land surface that has undergone dramatic changes in recent decades due to the increasing pressure of a growing population and advancing technologies. EROS provides the ability for anyone to search the archive and order digital Landsat images over the internet for free. The Landsat data are a public resource for observing, characterizing, monitoring, trending, and predicting land use change over time, providing an invaluable tool for those addressing the profound consequences of those changes to society. The most recent launch of a Landsat satellite occurred in 1999 when Landsat 7 was placed in orbit. While Landsat 7 remains in operation, the National Aeronautics and Space Administration (NASA) and the DOI/USGS are building its successor satellite system, currently called the Landsat Data Continuity Mission (LDCM). NASA has the lead for building and launching the satellite that will carry two Earth-viewing instruments, the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). The OLI will take images that measure the amount of sunlight reflected by the land surface at nine wavelengths of light, with three of those wavelengths beyond the range of human vision. TIRS will collect coincident images that measure light emitted by the land surface as a function of surface temperature at two longer wavelengths well beyond the range of human vision. The DOI/USGS is developing the ground system that will command and control the LDCM satellite in orbit and manage the OLI and TIRS data transmitted by the satellite. DOI/USGS will thus operate the satellite and collect, archive, and distribute the image data as part of the EROS archive. DOI/USGS has committed to renaming LDCM as Landsat 8 following launch. By either name, the satellite and its sensors will extend the 40-year archive with images sufficiently consistent with data from earlier Landsat satellites to allow multi-decadal, broad-area studies of our dynamic landscapes. The next Landsat satellite and ground system are on schedule for a January 2013 launch.
ERIC Educational Resources Information Center
Krishnamurthy, Ramesh S.; Mead, Clifford S.
1995-01-01
Presents plan of Oregon State University Libraries to convert all paper documents from the Ava Helen and Linus Pauling archives to digital format. The scope, goals, tasks and objectives set by the project coordinators are outlined, and issues such as protection of equipment, access, copyright and management are discussed. (JKP)
NASA Astrophysics Data System (ADS)
Mun, Seong K.; Freedman, Matthew T.; Gelish, Anthony; de Treville, Robert E.; Sheehy, Monet R.; Hansen, Mark; Hill, Mac; Zacharia, Elisabeth; Sullivan, Michael J.; Sebera, C. Wayne
1993-01-01
An image management and communications (IMAC) network, also known as a picture archiving and communication system (PACS), consists of (1) digital image acquisition, (2) image review stations, (3) image storage device(s), (4) image reading workstations, and (5) communication capability. When these subsystems are integrated over a high-speed communication technology, there are numerous possibilities for improving the timeliness and quality of diagnostic services within a hospital or at remote clinical sites. A teleradiology system uses basically the same hardware configuration together with a long-distance communication capability. Functional characteristics of the components are highlighted. Many medical imaging systems are already in digital form; these digital images constitute approximately 30% of the total volume of images produced in a radiology department. The remaining 70% of images include conventional x-ray films of the chest, skeleton, abdomen, and GI tract. Unless one develops a method of handling these conventional film images, a global improvement in productivity in image management and radiology service throughout a hospital cannot be achieved. Currently, there are two methods of producing digital information from these conventional analog images for IMAC: film digitizers that scan the conventional films, and computed radiography (CR), which captures x-ray images using a storage phosphor plate that is subsequently scanned by a laser beam.
A comprehensive cost model for NASA data archiving
NASA Technical Reports Server (NTRS)
Green, J. L.; Klenk, K. F.; Treinish, L. A.
1990-01-01
A simple archive cost model has been developed to help predict NASA's archiving costs. The model covers data management activities from the beginning of the mission through launch, acquisition, and support of retrospective users by the long-term archive; it is capable of determining the life cycle costs for archived data depending on how the data need to be managed to meet user requirements. The model, which currently contains 48 equations with a menu-driven user interface, is available for use on an IBM PC or AT.
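The abstract does not reproduce the 48 equations, so the sketch below is emphatically not the NASA model; it is only an illustrative life-cycle cost aggregation over a few archive phases, with invented cost figures, to show the general shape of calculation such a menu-driven tool performs.

# Illustrative only: a toy life-cycle archive cost aggregation, not the
# 48-equation NASA model described in the abstract. All figures are invented.
PHASES = {
    "pre-launch data system development": 1_200_000.0,
    "acquisition and ingest (per TB)": 2_500.0,
    "long-term storage (per TB per year)": 400.0,
    "retrospective user support (per year)": 90_000.0,
}

def life_cycle_cost(volume_tb: float, archive_years: int) -> float:
    """Sum fixed and volume/time-dependent costs over the archive lifetime."""
    cost = PHASES["pre-launch data system development"]
    cost += PHASES["acquisition and ingest (per TB)"] * volume_tb
    cost += PHASES["long-term storage (per TB per year)"] * volume_tb * archive_years
    cost += PHASES["retrospective user support (per year)"] * archive_years
    return cost

print(f"Estimated life-cycle cost: ${life_cycle_cost(50.0, 10):,.0f}")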
Taking digital imaging to the next level: challenges and opportunities.
Hobbs, W Cecyl
2004-01-01
New medical imaging technologies, such as multi-detector computed tomography (CT) scanners and positron emission tomography (PET) scanners, are creating new possibilities for non-invasive diagnosis that are leading providers to invest heavily in these new technologies. The volume of data produced by such technology is so large that it cannot be "read" using traditional film-based methods, and once in digital form, it creates a massive data integration and archiving challenge. Despite the benefits of digital imaging and archiving, there are several key challenges that healthcare organizations should consider in planning, selecting, and implementing the information technology (IT) infrastructure to support digital imaging. Decisions about storage and image distribution are essentially questions of "where" and "how fast." When planning the digital archiving infrastructure, organizations should think about where they want to store and distribute their images. This is similar to decisions that organizations have to make in regard to physical film storage and distribution, except that the portability of images is even greater in a digital environment. The principle of "network effects" seems like a simple concept, yet the effect is not always considered when implementing a technology plan. To fully realize the benefits of digital imaging, the radiology department must integrate the archiving solutions throughout the department and, ultimately, with applications across other departments and enterprises. Medical institutions can derive a number of benefits from implementing digital imaging and archiving solutions like PACS. Hospitals and imaging centers can use the transition from film-based imaging as a foundational opportunity to reduce costs, increase competitive advantage, attract talent, and improve service to patients. The key factors in achieving these goals include attention to the means of data storage, distribution, and protection.
Cooperative observation data center for planets: starting with the Mars 2009-2010 observation
NASA Astrophysics Data System (ADS)
Nakakushi, T.; Okyudo, M.; Tomita, A.
2009-12-01
We propose in this paper a plan to construct a planetary image data center on the internet, which links professional researchers and amateur observers all over the world. Such data archive projects have worked, at least for Mars: since 2003, one of the authors (T. N.) has run a project to summarize Mars observations using such cooperative network observation data archives and to publish the summaries as professional research papers (Nakakushi et al., 2004, 2005, and 2008). Planetary atmospheres vary on various timescales, which requires temporally continuous observations. Cooperative observation in which amateur observers participate can keep the observations continuous and sustainable, so it can be a powerful means of revealing planetary climate and meteorology. For the outer planets in particular, synoptic "seasonal" variations are unknown because of their long periods of revolution, and steady, persistent effort is needed to accumulate observations. That is why amateur observers' high-level observation techniques are needed. We also need systems to provide (and reproduce) data for users in an appropriate manner. We start with Mars and our own new data archive website, because we have much experience with Mars. Next, we will expand the system to all the planets. Roughly speaking, there are three steps to expand the project to all the planets: (1) construct our own Mars cooperative observation data center, (2) link it with professional studies, and (3) construct a cooperative observation data center for all planets. There are four problems to tackle: (1) develop web interfaces for users to submit data, (2) develop interfaces for managers, (3) secure finances, and (4) secure professional researchers. 2009 and 2010 offer a good apparition for Mars observation. We will manage the Mars image data website, identify problems and solutions in detail, and search for ways to expand it to all the planets and to enable sustainable management.
Tobin, Kenneth W; Karnowski, Thomas P; Chaum, Edward
2013-08-06
A method for diagnosing diseases having retinal manifestations including retinal pathologies includes the steps of providing a CBIR system including an archive of stored digital retinal photography images and diagnosed patient data corresponding to the retinal photography images, the stored images each indexed in a CBIR database using a plurality of feature vectors, the feature vectors corresponding to distinct descriptive characteristics of the stored images. A query image of the retina of a patient is obtained. Using image processing, regions or structures in the query image are identified. The regions or structures are then described using the plurality of feature vectors. At least one relevant stored image from the archive based on similarity to the regions or structures is retrieved, and an eye disease or a disease having retinal manifestations in the patient is diagnosed based on the diagnosed patient data associated with the relevant stored image(s).
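The retrieval-then-diagnose step can be illustrated with a toy example: describe the query image's regions as a feature vector, rank the archived images by distance, and report the diagnosis that dominates among the nearest neighbors. The Python sketch below uses invented three-element feature vectors and a Euclidean distance; it is a schematic of the CBIR idea, not the patented method's actual features or indexing.

import numpy as np

# Toy archive: each stored image is reduced to a feature vector describing
# its segmented regions, with the diagnosis recorded for that patient.
# Vectors and labels are invented for illustration.
archive = [
    (np.array([0.80, 0.10, 0.30]), "diabetic retinopathy"),
    (np.array([0.20, 0.70, 0.50]), "normal"),
    (np.array([0.75, 0.20, 0.40]), "diabetic retinopathy"),
    (np.array([0.10, 0.90, 0.60]), "normal"),
]

def diagnose(query_features: np.ndarray, k: int = 3) -> str:
    """Retrieve the k most similar archived images (Euclidean distance over
    feature vectors) and return the majority diagnosis among them."""
    ranked = sorted(archive,
                    key=lambda item: float(np.linalg.norm(item[0] - query_features)))
    labels = [label for _, label in ranked[:k]]
    return max(set(labels), key=labels.count)

print(diagnose(np.array([0.78, 0.15, 0.35])))   # expected: diabetic retinopathy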
Liu, Yan-Lin; Shih, Cheng-Ting; Chang, Yuan-Jen; Chang, Shu-Jun; Wu, Jay
2014-01-01
The rapid development of picture archiving and communication systems (PACSs) has thoroughly changed the way medical information is communicated and managed. However, as the scale of a hospital's operations increases, the large amount of digital images transferred over the network inevitably decreases system efficiency. In this study, a server cluster consisting of two server nodes was constructed. Network load balancing (NLB), distributed file system (DFS), and structured query language (SQL) duplication services were installed. Between 1 and 16 workstations were used to transfer computed radiography (CR), computed tomography (CT), and magnetic resonance (MR) images simultaneously to simulate the clinical situation. The average transmission rate (ATR) was compared between the cluster and noncluster servers. In the download scenario, the ATRs of CR, CT, and MR images increased by 44.3%, 56.6%, and 100.9%, respectively, when using the server cluster, whereas the ATRs increased by 23.0%, 39.2%, and 24.9% in the upload scenario. In the mixed scenario, the transmission performance increased by 45.2% when using eight computer units. The fault tolerance mechanisms of the server cluster maintained system availability and image integrity. The server cluster can improve transmission efficiency while maintaining high reliability and continuous availability in a healthcare environment.
Larsen, Dana M.
1993-01-01
The EROS Data Center has managed the National Satellite Land Remote Sensing Data Archive's (NSLRSDA) Landsat data since 1972. The NSLRSDA includes Landsat MSS data from 1972 through 1991 and TM data from 1982 through 1993. In response to many requests from multi-disciplined users for enhanced insight into the availability and volume of Landsat data over specific worldwide land areas, numerous world plots and corresponding statistical overviews have been prepared. These presentations include information related to image quality, cloud cover, various types of data coverage (i.e., regions, countries, paths, rows), acquisition station coverage areas, various archive media formats (i.e., wide-band video tapes, computer compatible tapes, high density tapes, etc.), and acquisition time periods (i.e., years, seasons). Plans are to publish this information in a paper sample booklet at the Pecora 12 Symposium, in a USGS circular, and on a Landsat CD-ROM; the data will also be incorporated into GLIS.
Upper Klamath Basin Landsat Image for September 30, 2004: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for July 18, 2006: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for August 29, 2004: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for July 28, 2004: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for October 22, 2006: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for August 19, 2006: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for October 16, 2004: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for September 20, 2006: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for June 26, 2004: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for April 29, 2006: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for July 12, 2004: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for July 2, 2006: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for May 25, 2004: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for June 16, 2006: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for April 7, 2004: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-5 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-5 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-5 on March 1, 1984 marks the addition of the fifth satellite to the Landsat series. The Landsat-5 satellite carries the Thematic Mapper (TM) sensor. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
50 CFR 635.33 - Archival tags.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 12 2013-10-01 2013-10-01 false Archival tags. 635.33 Section 635.33 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE ATLANTIC HIGHLY MIGRATORY SPECIES Management Measures § 635.33 Archival tags. (a...
Landsat View: Chandler, Arizona
2017-12-08
Over the last 25 years, Chandler, Arizona has traded its grid of fields for a grid of streets. Founded in 1912 on cotton, grain, alfalfa, and ostrich farms, Chandler lies southeast of Phoenix in a region still dominated by brown and green irrigated fields in this 1985 natural color image taken by Landsat 5. By 2011, the blue-gray city streets in this Landsat 5 image have taken over. Chandler's economy has shifted from agriculture to manufacturing and electronics, and its population boomed from 30,000 people in 1980 to 236,000 in 2010. NASA and the U.S. Geological Survey (USGS) jointly manage Landsat, and the USGS preserves a 40-year archive of Landsat images that is freely available over the Internet. The next Landsat satellite, now known as the Landsat Data Continuity Mission (LDCM) and later to be called Landsat 8, is scheduled for launch in 2013. In honor of Landsat's 40th anniversary in July 2012, the USGS released the LandsatLook viewer, a quick, simple way to go forward and backward in time, pulling images of anywhere in the world out of the Landsat archive.
NASA Astrophysics Data System (ADS)
Georgiou, Mike F.; Sfakianakis, George N.; Johnson, Gary; Douligeris, Christos; Scandar, Silvia; Eisler, E.; Binkley, B.
1994-05-01
In an effort to improve patient care while considering cost-effectiveness, we developed a Picture Archiving and Communication System (PACS) which combines imaging cameras, computers, and other peripheral equipment from multiple nuclear medicine vendors. The PACS provides fully digital clinical operation, which includes acquisition and automatic organization of patient data, distribution of the data to all networked units inside the department and other remote locations, digital analysis and quantitation of images, digital diagnostic reading of image studies, and permanent data archival with the ability for fast retrieval. The PACS enabled us to significantly reduce the amount of film used, and we are currently proceeding with implementing a film-less laboratory. Hard copies are produced on paper or transparent sheets for parts of the hospital that are not digitally connected. The PACS provides fully digital operation which is faster, more reliable, better organized and managed, and overall more efficient than a conventional film-based operation. In this paper, the integration of the various PACS components from multiple vendors is reviewed, and the impact of PACS, with its advantages and limitations, on our clinical operation is analyzed.
NVST Data Archiving System Based On FastBit NoSQL Database
NASA Astrophysics Data System (ADS)
Liu, Ying-bo; Wang, Feng; Ji, Kai-fan; Deng, Hui; Dai, Wei; Liang, Bo
2014-06-01
The New Vacuum Solar Telescope (NVST) is a 1-meter vacuum solar telescope that aims to observe the fine structures of active regions on the Sun. The main tasks of the NVST are high-resolution imaging and spectral observations, including measurements of the solar magnetic field. The NVST has collected more than 20 million FITS files since it began routine observations in 2012 and produces up to 120 thousand observational files in a day. Given the large number of files, the effective archiving and retrieval of files becomes a critical and urgent problem. In this study, we implement a new data archiving system for the NVST based on the FastBit Not Only Structured Query Language (NoSQL) database. Compared to a relational database (i.e., MySQL; My Structured Query Language), the FastBit database offers distinct advantages in indexing and querying performance. In a large-scale database of 40 million records, the multi-field combined query response time of the FastBit database is about 15 times faster and fully meets the requirements of the NVST. Our study offers a new approach to massive astronomical data archiving and could contribute to the design of data management systems for other astronomical telescopes.
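FastBit's query speed comes largely from compressed bitmap indexes, under which a multi-field combined query reduces to bitwise operations on precomputed masks. The sketch below illustrates that idea in plain NumPy on a toy observation catalogue; it is a conceptual illustration, not FastBit's API or the NVST schema, and the column names are invented.

import numpy as np

# Toy observation catalogue: one row per FITS file (columns are hypothetical).
instrument = np.array(["HA", "TIO", "HA", "HA", "TIO", "HA"])
exposure_ms = np.array([20, 40, 20, 60, 40, 20])

def bitmap_index(column: np.ndarray) -> dict:
    """Build one boolean mask per distinct value of the column (bitmap index)."""
    return {value: column == value for value in np.unique(column)}

idx_instrument = bitmap_index(instrument)
idx_exposure = bitmap_index(exposure_ms)

# Multi-field combined query: instrument == 'HA' AND exposure_ms == 20.
# With bitmap indexes this reduces to a bitwise AND of precomputed masks,
# the kind of operation FastBit accelerates with compressed bitmaps.
hits = idx_instrument["HA"] & idx_exposure[20]
print(np.nonzero(hits)[0])   # matching row numbers: [0 2 5]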
Remotely Sensed Imagery from USGS: Update on Products and Portals
NASA Astrophysics Data System (ADS)
Lamb, R.; Lemig, K.
2016-12-01
The USGS Earth Resources Observation and Science (EROS) Center has recently implemented a number of additions and changes to its existing suite of products and user access systems. Together, these changes will enhance the accessibility, breadth, and usability of the remotely sensed image products and delivery mechanisms available from USGS. As of late 2016, several new image products are now available for public download at no charge from USGS/EROS Center. These new products include: (1) global Level 1T (precision terrain-corrected) products from Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), provided via NASA's Land Processes Distributed Active Archive Center (LP DAAC); and (2) Sentinel-2 Multispectral Instrument (MSI) products, available through a collaborative effort with the European Space Agency (ESA). Other new products are also planned to become available soon. In an effort to enable future scientific analysis of the full 40+ year Landsat archive, the USGS also introduced a new "Collection Management" strategy for all Landsat Level 1 products. This new archive and access schema involves quality-based tier designations that will support future time series analysis of the historic Landsat archive at the pixel level. Along with the quality tier designations, the USGS has also implemented a number of other Level 1 product improvements to support Landsat science applications, including: enhanced metadata, improved geometric processing, refined quality assessment information, and angle coefficient files. The full USGS Landsat archive is now being reprocessed in accordance with the new "Collection 1" specifications. Several USGS data access and visualization systems have also seen major upgrades. These user interfaces include a new version of the USGS LandsatLook Viewer which was released in Fall 2017 to provide enhanced functionality and Sentinel-2 visualization and access support. A beta release of the USGS Global Visualization Tool ("GloVis Next") was also released in Fall 2017, with many new features including data visualization at full resolution. The USGS also introduced a time-enabled web mapping service (WMS) to support time-based access to the existing LandsatLook "natural color" full-resolution browse image services.
Expanding the PACS archive to support clinical review, research, and education missions
NASA Astrophysics Data System (ADS)
Honeyman-Buck, Janice C.; Frost, Meryll M.; Drane, Walter E.
1999-07-01
Designing an image archive and retrieval system that supports multiple users with many different requirements and patterns of use, without compromising the performance and functionality required by diagnostic radiology, is an intellectual and technical challenge. A diagnostic archive, optimized for performance when retrieving diagnostic images for radiologists, needed to be expanded to support a growing clinical review network, the University of Florida Brain Institute's demands for neuro-imaging, Biomedical Engineering's imaging sciences, and an electronic teaching file. Each of the groups presented a different set of problems for the designers of the system. In addition, the radiologists did not want to see any loss of performance as new users were added.
Initial experience with a radiology imaging network to newborn and intensive care units.
Witt, R M; Cohen, M D; Appledorn, C R
1991-02-01
A digital image network has been installed in the James Whitcomb Riley Hospital for Children at the Indiana University Medical Center to create a limited all-digital imaging system. The system is composed of commercial components, the Philips/AT&T CommView system (Philips Medical Systems, Shelton, CT; AT&T Bell Laboratories, West Long Beach, NJ), and connects an existing Philips Computed Radiology (PCR) system to two remote workstations that reside in the intensive care unit and the newborn nursery. The purpose of the system is to display images obtained from the PCR system on the remote workstations for direct viewing by referring clinicians, and to reduce many of their visits to the radiology reading room three floors away. The design criteria include the ability to centrally control all image management functions on the remote workstations, relieving the clinicians of any image management tasks except recalling patient images. The principal components of the system are the Philips PCR system, the acquisition module (AM), and the PCR interface to the Data Management Module (DMM). Connected to the DMM are an Enhanced Graphics Display Workstation (EGDW), an optical disk drive, and a network gateway to an ethernet link. The ethernet network is the connection to the two Results Viewing Stations (RVS), and both RVSs are approximately 100 m from the gateway. The DMM acts as an image file server and an image archive device. The DMM manages the image database and can load images to the EGDW and the two RVSs. The system has met the initial design specifications and can successfully capture images from the PCR and direct them to the RVSs. (ABSTRACT TRUNCATED AT 250 WORDS)
Metadata and Buckets in the Smart Object, Dumb Archive (SODA) Model
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maly, Kurt; Croom, Delwin R., Jr.; Robbins, Steven W.
2004-01-01
We present the Smart Object, Dumb Archive (SODA) model for digital libraries (DLs), and discuss the role of metadata in SODA. The premise of the SODA model is to "push down" many of the functionalities generally associated with archives into the data objects themselves. Thus the data objects become "smarter", and the archives "dumber". In the SODA model, archives become primarily set managers, and the objects themselves negotiate and handle presentation, enforce terms and conditions, and perform data content management. Buckets are our implementation of smart objects, and da is our reference implementation for dumb archives. We also present our approach to metadata translation for buckets.
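A minimal sketch can make the smart-object/dumb-archive division concrete: the bucket carries its own metadata, enforces its own terms and conditions, and renders its own presentation, while the archive merely manages a set of buckets. The Python classes below are illustrative only and do not reproduce the buckets or da implementations.

class Bucket:
    """Minimal illustration of a SODA-style smart object: the bucket holds
    its own metadata and content, enforces its terms and conditions, and
    handles its own presentation."""
    def __init__(self, metadata: dict, content: str, allowed_users: set):
        self.metadata = metadata
        self._content = content
        self._allowed_users = allowed_users

    def present(self, user: str) -> str:
        if user not in self._allowed_users:          # terms and conditions
            raise PermissionError(f"{user} may not access this bucket")
        title = self.metadata.get("title", "untitled")
        return f"== {title} ==\n{self._content}"

class DumbArchive:
    """The archive is only a set manager: add, retrieve, and list buckets."""
    def __init__(self):
        self._buckets = {}
    def add(self, bucket_id: str, bucket: Bucket) -> None:
        self._buckets[bucket_id] = bucket
    def get(self, bucket_id: str) -> Bucket:
        return self._buckets[bucket_id]
    def list_ids(self) -> list:
        return sorted(self._buckets)

archive = DumbArchive()
archive.add("rep-001", Bucket({"title": "Technical Report 001"},
                              "report body ...", {"alice"}))
print(archive.list_ids())
print(archive.get("rep-001").present("alice"))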
PACS for Bhutan: a cost effective open source architecture for emerging countries.
Ratib, Osman; Roduit, Nicolas; Nidup, Dechen; De Geer, Gerard; Rosset, Antoine; Geissbuhler, Antoine
2016-10-01
This paper reports the design and implementation of an innovative and cost-effective imaging management infrastructure suitable for radiology centres in emerging countries. It was implemented in the main referring hospital of Bhutan, which is equipped with a CT scanner, an MRI scanner, digital radiology, and a suite of several ultrasound units. The hospital lacked the necessary informatics infrastructure for image archiving and interpretation and needed a system for distributing images to clinical wards. The solution developed for this project combines several open-source software platforms into a robust and versatile archiving and communication system connected to analysis workstations equipped with an FDA-certified version of a highly popular open-source software package. The whole system was implemented on standard off-the-shelf hardware. The system was installed in three days, and training of the radiologists as well as the technical and IT staff was provided onsite to ensure full ownership of the system by the local team. Radiologists were rapidly capable of reading and interpreting studies on the diagnostic workstations, which had a significant benefit on their workflow and their ability to perform diagnostic tasks more efficiently. Furthermore, images were also made available to several clinical units on standard desktop computers through a web-based viewer. • Open-source imaging informatics platforms can provide cost-effective alternatives for PACS. • A robust and cost-effective open architecture can provide adequate solutions for emerging countries. • Imaging informatics is often lacking in hospitals equipped with digital modalities.
A Survey of Videodisc Technology.
1985-12-01
store images and the microcomputer is used as an interactive and management tool, makes for a powerful teaching system. General Motors was the first… videodiscs are used for archival storage of documents. IBM uses videodisc in over 180 branch offices, where they are used both as a presentation tool and to provide reference material. IBM is also currently working on a videodisc project as a direct training tool for maintenance of their computers.
Kocna, P
1995-01-01
GastroBase, a clinical information system, incorporates patient identification, medical records, images, laboratory data, patient history, physical examination, and other patient-related information. Program modules are written in C; all data are processed using the Novell Btrieve data manager. The patient identification database represents the core of this information system. A graphic library developed in the past year and graphic modules with a special video card enable the storing, archiving, and linking of different images to the electronic patient medical record. GastroBase has been in daily routine use for more than four years, and the database contains more than 25,000 medical records and 1,500 images. This new version of GastroBase is now incorporated into the clinical information system of the University Clinic in Prague.
36 CFR 1220.10 - Who is responsible for records management?
Code of Federal Regulations, 2014 CFR
2014-07-01
... records management? 1220.10 Section 1220.10 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT FEDERAL RECORDS; GENERAL § 1220.10 Who is responsible for records management? (a) The National Archives and Records Administration (NARA) is responsible for...
36 CFR 1220.10 - Who is responsible for records management?
Code of Federal Regulations, 2012 CFR
2012-07-01
... records management? 1220.10 Section 1220.10 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT FEDERAL RECORDS; GENERAL § 1220.10 Who is responsible for records management? (a) The National Archives and Records Administration (NARA) is responsible for...
36 CFR 1220.10 - Who is responsible for records management?
Code of Federal Regulations, 2010 CFR
2010-07-01
... records management? 1220.10 Section 1220.10 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT FEDERAL RECORDS; GENERAL § 1220.10 Who is responsible for records management? (a) The National Archives and Records Administration (NARA) is responsible for...
36 CFR 1220.10 - Who is responsible for records management?
Code of Federal Regulations, 2011 CFR
2011-07-01
... records management? 1220.10 Section 1220.10 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT FEDERAL RECORDS; GENERAL § 1220.10 Who is responsible for records management? (a) The National Archives and Records Administration (NARA) is responsible for...
NASA Astrophysics Data System (ADS)
Miller, C. J.; Gasson, D.; Fuentes, E.
2007-10-01
The NOAO NVO Portal is a web application for one-stop discovery, analysis, and access to VO-compliant imaging data and services. The current release allows for GUI-based discovery of nearly a half million images from archives such as the NOAO Science Archive, the Hubble Space Telescope WFPC2 and ACS instruments, XMM-Newton, Chandra, and ESO's INT Wide-Field Survey, among others. The NOAO Portal allows users to view image metadata, footprint wire-frames, FITS image previews, and provides one-click access to science quality imaging data throughout the entire sky via the Firefox web browser (i.e., no applet or code to download). Users can stage images from multiple archives at the NOAO NVO Portal for quick and easy bulk downloads. The NOAO NVO Portal also provides simplified and direct access to VO analysis services, such as the WESIX catalog generation service. We highlight the features of the NOAO NVO Portal (http://nvo.noao.edu).
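Behind such a portal sits a set of VO-compliant services; the most basic is a Simple Image Access (SIA) query by sky position. The sketch below issues a standard SIA v1 request; the service URL is a placeholder, not the NOAO portal's actual endpoint.

```python
# Simple Image Access (SIA) query: returns a VOTable listing images
# covering the requested position. Endpoint URL is hypothetical.
import requests


def sia_query(service_url: str, ra_deg: float, dec_deg: float, size_deg: float = 0.25) -> str:
    params = {
        "POS": f"{ra_deg},{dec_deg}",   # standard SIA v1 parameters
        "SIZE": str(size_deg),
        "FORMAT": "image/fits",
    }
    response = requests.get(service_url, params=params, timeout=60)
    response.raise_for_status()
    return response.text                # parse with astropy.io.votable if desired


# Example with a hypothetical endpoint:
# votable_xml = sia_query("https://example.org/sia", ra_deg=150.1, dec_deg=2.2)
```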
ERIC Educational Resources Information Center
Chang, May
2000-01-01
Describes the development of electronic finding aids for archives at the University of Illinois, Urbana-Champaign, which used XML (extensible markup language) and EAD (encoded archival description) to enable more flexible information management and retrieval than MARC or a relational database management system. An EAD template is appended.…
Local area networks in an imaging environment.
Noz, M E; Maguire, G Q; Erdman, W A
1986-01-01
There is great interest at present in incorporating image-management systems, popularly referred to as picture archiving and communication systems (PACS), into imaging departments. This paper will describe various aspects of local area networks (LANs) for medical images and will give a definition of terms and a classification of devices by describing a possible system which links various digital image sources through a high-speed data link and a common image format, allows for viewing and processing of all images produced within the complex, and eliminates the transport of films. The status of standards governing LANs, and PACS in particular, along with a proposed image exchange format, will be given. Prototype systems, particularly a system for nuclear medicine images, will be presented, as well as the prospects for the immediate future in terms of installations started and commercial products available. A survey of the many questions that arise in the development of a PACS for medical images, and of the presently suggested or adopted answers, will also be given.
Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.
2009-01-01
In April and July of 1981, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Alabama-Mississippi-Louisiana Shelf in the northern Gulf of Mexico. Work was conducted onboard the Texas A&M University R/V Carancahua and the R/V Gyre to develop a geologic understanding of the study area and to locate potential hazards related to offshore oil and gas production. While the R/V Carancahua only collected boomer data, the R/V Gyre used a 400-Joule minisparker, 3.5-kilohertz (kHz) subbottom profiler, 12-kHz precision depth recorder, and two air guns. The authors selected the minisparker data set because, unlike with the boomer data, it provided the most complete record. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer and minisparker paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.
Toward Automatic Georeferencing of Archival Aerial Photogrammetric Surveys
NASA Astrophysics Data System (ADS)
Giordano, S.; Le Bris, A.; Mallet, C.
2018-05-01
Images from archival aerial photogrammetric surveys are a unique and relatively unexplored means to chronicle 3D land-cover changes over the past 100 years. They provide a relatively dense temporal sampling of the territories with very high spatial resolution. Such time-series image analysis is a mandatory baseline for a large variety of long-term environmental monitoring studies. The current bottleneck for accurate comparison between epochs is the fine georeferencing step. No fully automatic method has been proposed yet, and existing studies are rather limited in terms of area and number of dates. The state of the art shows that the major challenge is the identification of ground references: cartographic coordinates and their position in the archival images. This task is performed manually and is extremely time-consuming. This paper proposes to use a photogrammetric approach, and states that the 3D information that can be computed is the key to full automation. Its original idea lies in a two-step approach: (i) the computation of a coarse absolute image orientation; (ii) the use of the coarse Digital Surface Model (DSM) information for automatic absolute image orientation. It relies only on a recent orthoimage + DSM, used as the master reference for all epochs. The coarse orthoimage, compared with such a reference, allows the identification of dense ground references, and the coarse DSM provides their position in the archival images. Results on two areas and five dates show that this method is compatible with long and dense archival aerial image series. Satisfactory planimetric and altimetric accuracies are reported, with variations depending on the ground sampling distance of the images and the location of the Ground Control Points.
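To illustrate the automation idea (not the authors' photogrammetric pipeline), the sketch below matches an archival image against a recent orthoimage with ORB features and a RANSAC homography; the inlier correspondences play the role of automatically identified ground references for a coarse orientation.

```python
# Coarse matching of an archival image to a reference orthoimage (illustrative).
import cv2
import numpy as np


def coarse_match(archival_path: str, ortho_path: str):
    old = cv2.imread(archival_path, cv2.IMREAD_GRAYSCALE)
    ref = cv2.imread(ortho_path, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(5000)
    kp1, des1 = orb.detectAndCompute(old, None)
    kp2, des2 = orb.detectAndCompute(ref, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    homography, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    # Inliers = candidate ground references; the homography gives a coarse
    # mapping of the archival image into the georeferenced ortho frame.
    return homography, int(inliers.sum())
```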
36 CFR 1220.34 - What must an agency do to carry out its records management responsibilities?
Code of Federal Regulations, 2014 CFR
2014-07-01
...) Integrate records management and archival requirements into the design, development, and implementation of... carry out its records management responsibilities? 1220.34 Section 1220.34 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT FEDERAL RECORDS; GENERAL Agency...
36 CFR 1220.34 - What must an agency do to carry out its records management responsibilities?
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Integrate records management and archival requirements into the design, development, and implementation of... carry out its records management responsibilities? 1220.34 Section 1220.34 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT FEDERAL RECORDS; GENERAL Agency...
36 CFR 1220.34 - What must an agency do to carry out its records management responsibilities?
Code of Federal Regulations, 2012 CFR
2012-07-01
...) Integrate records management and archival requirements into the design, development, and implementation of... carry out its records management responsibilities? 1220.34 Section 1220.34 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT FEDERAL RECORDS; GENERAL Agency...
36 CFR 1220.34 - What must an agency do to carry out its records management responsibilities?
Code of Federal Regulations, 2011 CFR
2011-07-01
...) Integrate records management and archival requirements into the design, development, and implementation of... carry out its records management responsibilities? 1220.34 Section 1220.34 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT FEDERAL RECORDS; GENERAL Agency...
A Photo Album of Earth Scheduling Landsat 7 Mission Daily Activities
NASA Technical Reports Server (NTRS)
Potter, William; Gasch, John; Bauer, Cynthia
1998-01-01
Landsat7 is a member of a new generation of Earth observation satellites. Landsat7 will carry on the mission of the aging Landsat 5 spacecraft by acquiring high resolution, multi-spectral images of the Earth surface for strategic, environmental, commercial, agricultural and civil analysis and research. One of the primary mission goals of Landsat7 is to accumulate and seasonally refresh an archive of global images with full coverage of Earth's landmass, less the central portion of Antarctica. This archive will enable further research into seasonal, annual and long-range trending analysis in such diverse research areas as crop yields, deforestation, population growth, and pollution control, to name just a few. A secondary goal of Landsat7 is to fulfill imaging requests from our international partners in the mission. Landsat7 will transmit raw image data from the spacecraft to 25 ground stations in 20 subscribing countries. Whereas earlier Landsat missions were scheduled manually (as are the majority of current low-orbit satellite missions), the task of manually planning and scheduling Landsat7 mission activities would be overwhelmingly complex when considering the large volume of image requests, the limited resources available, spacecraft instrument limitations, and the limited ground image processing capacity, not to mention avoidance of foul weather systems. The Landsat7 Mission Operation Center (MOC) includes an image scheduler subsystem that is designed to automate the majority of mission planning and scheduling, including selection of the images to be acquired, managing the recording and playback of the images by the spacecraft, scheduling ground station contacts for downlink of images, and generating the spacecraft commands for controlling the imager, recorder, transmitters and antennas. The image scheduler subsystem autonomously generates 90% of the spacecraft commanding with minimal manual intervention. The image scheduler produces a conflict-free schedule for acquiring images of the "best" 250 scenes daily for refreshing the global archive. It then equitably distributes the remaining resources for acquiring up to 430 scenes to satisfy requests by international subscribers. The image scheduler selects candidate scenes based on priority and age of the requests, and predicted cloud cover and sun angle at each scene. It also selects these scenes to avoid instrument constraint violations and maximizes efficiency of resource usage by encouraging acquisition of scenes in clusters. Of particular interest to the mission planners, it produces the resulting schedule in a reasonable time, typically within 15 minutes.
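The scheduling logic described above (priority, request age, predicted cloud cover, sun angle, daily capacity) can be illustrated with a toy greedy scheduler. The weights, fields, and capacity handling below are assumptions for illustration, not the Landsat 7 MOC algorithm.

```python
# Toy scene scheduler: score candidates, then greedily fill a daily budget.
from dataclasses import dataclass


@dataclass
class Scene:
    scene_id: str
    priority: float          # higher is more important
    request_age_days: int
    cloud_fraction: float    # predicted, 0.0 (clear) .. 1.0 (overcast)
    sun_elevation_deg: float


def score(s: Scene) -> float:
    return (2.0 * s.priority
            + 0.1 * s.request_age_days
            - 3.0 * s.cloud_fraction
            + 0.02 * max(s.sun_elevation_deg, 0.0))


def daily_schedule(candidates: list[Scene], capacity: int = 250) -> list[Scene]:
    ranked = sorted(candidates, key=score, reverse=True)
    # A real scheduler also checks recorder capacity, ground contacts, and
    # instrument constraints before committing each scene.
    return ranked[:capacity]
```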
Automated extraction of metadata from remotely sensed satellite imagery
NASA Technical Reports Server (NTRS)
Cromp, Robert F.
1991-01-01
The paper discusses research in the Intelligent Data Management project at the NASA/Goddard Space Flight Center, with emphasis on recent improvements in low-level feature detection algorithms for performing real-time characterization of images. Images, including MSS and TM data, are characterized using neural networks and the interpretation of the neural network output by an expert system for subsequent archiving in an object-oriented data base. The data show the applicability of this approach to different arrangements of low-level remote sensing channels. The technique works well when the neural network is trained on data similar to the data used for testing.
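As a rough illustration of that pipeline's flavour, the sketch below computes low-level features per image tile and applies simple rules in place of the neural network and expert system, emitting metadata records for archiving. The thresholds and labels are invented for illustration only.

```python
# Characterize image tiles with low-level features and rule-based labels.
import numpy as np


def tile_features(image: np.ndarray, tile: int = 64):
    """Yield (row, col, mean, std, edge_density) for each tile of a 2-D array."""
    for r in range(0, image.shape[0] - tile + 1, tile):
        for c in range(0, image.shape[1] - tile + 1, tile):
            patch = image[r:r + tile, c:c + tile].astype(float)
            edges = np.abs(np.diff(patch, axis=0)).mean() + np.abs(np.diff(patch, axis=1)).mean()
            yield r, c, patch.mean(), patch.std(), edges


def characterize(image: np.ndarray) -> list[dict]:
    records = []
    for r, c, mean, std, edges in tile_features(image):
        label = "heterogeneous" if edges > 5.0 else ("bright" if mean > 128 else "dark")
        records.append({"row": r, "col": c, "mean": round(mean, 1), "label": label})
    return records   # records would be stored alongside the image in the archive catalog
```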
Landsat: A Global Land-Imaging Project
Headley, Rachel
2010-01-01
Across nearly four decades since 1972, Landsat satellites have continuously acquired space-based images of the Earth's land surface, coastal shallows, and coral reefs. The Landsat Program, a joint effort of the U.S. Geological Survey (USGS) and the National Aeronautics and Space Administration (NASA), was established to routinely gather land imagery from space. Under this program, NASA develops remote-sensing instruments and spacecraft, then launches and validates the satellites. The USGS then assumes ownership and operation of the satellites, in addition to managing all ground-data reception, archiving, product generation, and distribution. The result of this program is a visible, long-term record of natural and human-induced changes on the global landscape.
Landsat: a global land imaging program
Byrnes, Raymond A.
2012-01-01
Landsat satellites have continuously acquired space-based images of the Earth's land surface, coastal shallows, and coral reefs across four decades. The Landsat Program, a joint effort of the U.S. Geological Survey (USGS) and the National Aeronautics and Space Administration (NASA), was established to routinely gather land imagery from space. In practice, NASA develops remote-sensing instruments and spacecraft, launches satellites, and validates their performance. The USGS then assumes ownership and operation of the satellites, in addition to managing all ground-data reception, archiving, product generation, and distribution. The result of this program is a visible, long-term record of natural and human-induced changes on the global landscape.
Archive & Data Management Activities for ISRO Science Archives
NASA Astrophysics Data System (ADS)
Thakkar, Navita; Moorthi, Manthira; Gopala Krishna, Barla; Prashar, Ajay; Srinivasan, T. P.
2012-07-01
ISRO has kept a step ahead by extending remote sensing missions to planetary and astronomical exploration. It started with Chandrayaan-1, which successfully completed imaging of the Moon during its lifetime in orbit. ISRO is now planning to launch Chandrayaan-2 (the next Moon mission), a Mars mission, and the astronomical mission ASTROSAT. All these missions are characterized by the need to receive, process, archive, and disseminate the acquired science data to the user community for analysis and scientific use. The missions themselves will last for a few months to a few years, but the data received must be archived, interoperable, and seamlessly accessible to the user community into the future. ISRO has laid out definite plans to archive these data sets in specified standards and to develop relevant access tools to serve the user community. To achieve this goal, a data center called the Indian Space Science Data Center (ISSDC) has been set up at Bangalore. It is the custodian of all the data sets of the current and future science missions of ISRO. Chandrayaan-1 is the first among the planetary missions launched or to be launched by ISRO, and we took up the challenge of developing a system for archival and dissemination of the payload data received. For Chandrayaan-1, the data collected from all the instruments are processed and archived in the archive layer in the Planetary Data System (PDS 3.0) standard through an automated pipeline. But a dataset, once stored, is of no use unless it is made public, which requires a Web-based dissemination system accessible to all the planetary scientists and data users working in this field. Towards this, a Web-based Browse and Dissemination system has been developed, wherein users can register, search for their area of interest, and view the data archived for TMC and HYSI with relevant browse chips and metadata. Users can also order the data and have it delivered to their desktop in the PDS format. For the other AO payloads, users can view the metadata, and the data are available through an FTP site. This same archival and dissemination strategy will be extended to the next Moon mission, Chandrayaan-2. ASTROSAT will be the first multi-wavelength astronomical mission for which the data are archived at ISSDC. It consists of five astronomical payloads that allow simultaneous multi-wavelength observations of astronomical objects from X-ray to ultraviolet (UV). It is planned to archive these data sets in FITS format in the archive layer at ISSDC. The browse of the archive will be available through the ISDA (Indian Science Data Archive) web site and will be IVOA compliant, with a search mechanism using VOTable. The data will be available to users only on a request basis via an FTP site after the lock-in period is over. It is planned that the Level-2 pipeline software and the various modules for processing the data sets will also be available on the web site. This paper describes the archival procedure for Chandrayaan-1 and the archive plan for ASTROSAT, Chandrayaan-2, and other future ISRO missions, including a discussion of data management activities.
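One routine piece of such an archive layer is harvesting header keywords from FITS products into a browse catalog that users can search before ordering data. The sketch below shows that step under assumed keywords and schema; astropy is an assumption, and ISSDC's actual ingest pipeline is not described at this level in the paper.

```python
# Harvest FITS header keywords into a small browse catalog.
import sqlite3
from pathlib import Path

from astropy.io import fits  # pip install astropy


def harvest_fits_metadata(folder: str, db_path: str = "browse.db") -> None:
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS products "
        "(path TEXT PRIMARY KEY, object TEXT, date_obs TEXT, instrument TEXT)"
    )
    for path in Path(folder).rglob("*.fits"):
        with fits.open(path) as hdul:
            hdr = hdul[0].header
            con.execute(
                "INSERT OR REPLACE INTO products VALUES (?, ?, ?, ?)",
                (str(path), hdr.get("OBJECT", ""), hdr.get("DATE-OBS", ""), hdr.get("INSTRUME", "")),
            )
    con.commit()
    con.close()
```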
The imaging node for the Planetary Data System
Eliason, E.M.; LaVoie, S.K.; Soderblom, L.A.
1996-01-01
The Planetary Data System Imaging Node maintains and distributes the archives of planetary image data acquired from NASA's flight projects with the primary goal of enabling the science community to perform image processing and analysis on the data. The Node provides direct and easy access to the digital image archives through wide distribution of the data on CD-ROM media and on-line remote-access tools by way of Internet services. The Node provides digital image processing tools and the expertise and guidance necessary to understand the image collections. The data collections, now approaching one terabyte in volume, provide a foundation for remote sensing studies for virtually all the planetary systems in our solar system (except for Pluto). The Node is responsible for restoring data sets from past missions in danger of being lost. The Node works with active flight projects to assist in the creation of their archive products and to ensure that their products and data catalogs become an integral part of the Node's data collections.
Commission 5: Documentation and Astronomical Data
NASA Astrophysics Data System (ADS)
Norris, Raymond P.; Ohishi, Masatoshi; Genova, Françoise; Grothkopf, Uta; Malkov, Oleg Yu.; Pence, William D.; Schmitz, Marion; Hanisch, Robert J.; Zhou, Xu
IAU Commission 5 deals with data management issues, and its working groups and task groups deal specifically with information handling; with data centres and networks; with technical aspects of the collection, archiving, storage, and dissemination of data; with designations and classification of astronomical objects; with library services, editorial policies, computer communications, and ad hoc methodologies; and with various standards, reference frames, etc. FITS, astronomy's Flexible Image Transport System and the major data exchange format, is controlled, maintained, and updated by the FITS Working Group.
Radiometric calibration updates to the Landsat collection
Micijevic, Esad; Haque, Md. Obaidul; Mishra, Nischal
2016-01-01
The Landsat Project is planning to implement a new collection management strategy for Landsat products generated at the U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center. The goal of the initiative is to identify a collection of consistently geolocated and radiometrically calibrated images across the entire Landsat archive that is readily suitable for time-series analyses. In order to perform an accurate land change analysis, the data from all Landsat sensors must be on the same radiometric scale. Landsat 7 Enhanced Thematic Mapper Plus (ETM+) is calibrated to a radiance standard and all previous sensors are cross-calibrated to its radiometric scale. Landsat 8 Operational Land Imager (OLI) is calibrated to both radiance and reflectance standards independently. The Landsat 8 OLI reflectance calibration is considered to be most accurate. To improve radiometric calibration accuracy of historical data, Landsat 1-7 sensors also need to be cross-calibrated to the OLI reflectance scale. Results of that effort, as well as other calibration updates including the absolute and relative radiometric calibration and saturated pixel replacement for Landsat 8 OLI and absolute calibration for Landsat 4 and 5 Thematic Mappers (TM), will be implemented into Landsat products during the archive reprocessing campaign planned within the new collection management strategy. This paper reports on the planned radiometric calibration updates to the solar reflective bands of the new Landsat collection.
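A hedged sketch of the reflectance scaling that underlies such cross-calibration work is shown below: quantized pixel values are converted to top-of-atmosphere reflectance with per-band gain/offset coefficients and a sun-angle correction, in the form used for Landsat 8 OLI products. The coefficient values in the example are placeholders; real ones come from each scene's metadata file.

```python
# DN -> top-of-atmosphere reflectance (Landsat 8 OLI-style scaling).
import numpy as np


def toa_reflectance(dn: np.ndarray,
                    mult: float,            # reflectance multiplicative factor (REFLECTANCE_MULT_BAND_x)
                    add: float,             # reflectance additive factor (REFLECTANCE_ADD_BAND_x)
                    sun_elevation_deg: float) -> np.ndarray:
    rho = mult * dn.astype(float) + add                  # scaled reflectance, no sun correction
    return rho / np.sin(np.radians(sun_elevation_deg))   # correct for solar elevation


# Example with placeholder coefficients:
# rho = toa_reflectance(dn_band4, mult=2.0e-5, add=-0.1, sun_elevation_deg=55.0)
```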
Fault tolerance techniques to assure data integrity in high-volume PACS image archives
NASA Astrophysics Data System (ADS)
He, Yutao; Huang, Lu J.; Valentino, Daniel J.; Wingate, W. Keith; Avizienis, Algirdas
1995-05-01
Picture archiving and communication systems (PACS) perform the systematic acquisition, archiving, and presentation of large quantities of radiological image and text data. In the UCLA Radiology PACS, for example, the volume of image data archived currently exceeds 2500 gigabytes. Furthermore, the distributed, heterogeneous PACS is expected to have near-real-time response, be continuously available, and assure the integrity and privacy of patient data. The off-the-shelf subsystems that compose the current PACS cannot meet these expectations; therefore, fault tolerance techniques had to be incorporated into the system. This paper reports our first-step efforts towards that goal and is organized as follows: first we discuss data integrity and identify fault classes under the PACS operational environment; then we describe the auditing and accounting schemes developed for error detection and analyze the operational data collected. Finally, we outline plans for future research.
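One simple error-detection scheme in the spirit described above is checksum auditing: record a digest when an image is archived and re-verify it during periodic audits, flagging mismatches for fault analysis. The schema below is illustrative, not the UCLA PACS implementation.

```python
# Register archived files with a SHA-256 digest and audit them later.
import hashlib
import sqlite3
from pathlib import Path


def _sha256(path: Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def register(path: str, db: str = "audit.db") -> None:
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE IF NOT EXISTS checksums (path TEXT PRIMARY KEY, digest TEXT)")
    con.execute("INSERT OR REPLACE INTO checksums VALUES (?, ?)", (path, _sha256(Path(path))))
    con.commit()
    con.close()


def audit(db: str = "audit.db") -> list[str]:
    """Return paths whose current contents no longer match the archived digest."""
    con = sqlite3.connect(db)
    bad = [p for p, d in con.execute("SELECT path, digest FROM checksums")
           if not Path(p).exists() or _sha256(Path(p)) != d]
    con.close()
    return bad
```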
A pathologist-designed imaging system for anatomic pathology signout, teaching, and research.
Schubert, E; Gross, W; Siderits, R H; Deckenbaugh, L; He, F; Becich, M J
1994-11-01
Pathology images are derived from gross surgical specimens, light microscopy, immunofluorescence, electron microscopy, molecular diagnostic gels, flow cytometry, image analysis data, and clinical laboratory data in graphic form. We have implemented a network of desktop personal computers (PCs) that allow us to easily capture, store, and retrieve gross and microscopic, anatomic, and research pathology images. System architecture involves multiple image acquisition and retrieval sites and a central file server for storage. The digitized images are conveyed via a local area network to and from image capture or display stations. Acquisition sites consist of a high-resolution camera connected to a frame grabber card in a 486-type personal computer, equipped with 16 MB (Table 1) RAM, a 1.05-gigabyte hard drive, and a 32-bit ethernet card for access to our anatomic pathology reporting system. We have designed a push-button workstation for acquiring and indexing images that does not significantly interfere with surgical pathology sign-out. Advantages of the system include the following: (1) Improving patient care: the availability of gross images at time of microscopic sign-out, verification of recurrence of malignancy from archived images, monitoring of bone marrow engraftment and immunosuppressive intervention after bone marrow/solid organ transplantation on repeat biopsies, and ability to seek instantaneous consultation with any pathologist on the network; (2) enhancing the teaching environment: building a digital surgical pathology atlas, improving the availability of images for conference support, and sharing cases across the network; (3) enhancing research: case study compilation, metastudy analysis, and availability of digitized images for quantitative analysis and permanent/reusable image records for archival study; and (4) other practical and economic considerations: storing case requisition images and hand-drawn diagrams deters the spread of gross room contaminants and results in considerable cost savings in photographic media for conferences, improved quality assurance by porting control stains across the network, and a multiplicity of other advantages that enhance image and information management in pathology.
Operational environmental satellite archives in the 21st Century
NASA Astrophysics Data System (ADS)
Barkstrom, Bruce R.; Bates, John J.; Privette, Jeff; Vizbulis, Rick
2007-09-01
NASA, NOAA, and USGS collections of Earth science data are large, federated, and have active user communities. Our experience raises five categories of issues for long-term archival: (1) the organization of data in these collections is not well described by text-based categorization principles; (2) metadata organization for these data is not well described by Dublin Core and needs attention to data access and data use patterns; (3) long-term archival requires risk management approaches to deal with the threats to knowledge preservation that are specific to digital information; (4) long-term archival requires careful attention to archival cost management; and (5) professional data stewards for these collections may require special training. This paper suggests three mechanisms for improving the quality of long-term archival: (1) using a maturity model to assess the readiness of data for accession, for preservation, and for future data usefulness; (2) developing a risk management strategy for systematically dealing with threats of data loss; and (3) developing a life-cycle cost model for continuously evolving the collections and the data centers that house them.
36 CFR § 1220.34 - What must an agency do to carry out its records management responsibilities?
Code of Federal Regulations, 2013 CFR
2013-07-01
..., systems, and procedures; (e) Integrate records management and archival requirements into the design... carry out its records management responsibilities? § 1220.34 Section § 1220.34 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT FEDERAL RECORDS; GENERAL...
NASA Astrophysics Data System (ADS)
Penteado, Paulo F.; Trilling, David; Szalay, Alexander; Budavári, Tamás; Fuentes, César
2014-11-01
We are building the first system that will allow efficient data mining in the astronomical archives for observations of Solar System bodies. While the Virtual Observatory has enabled data-intensive research making use of large collections of observations across multiple archives, Planetary Science has largely been denied this opportunity: most astronomical data services are built based on sky positions, and moving objects are often filtered out. To identify serendipitous observations of Solar System objects, we ingest the archive metadata. The coverage of each image in an archive is a volume in a 3D space (RA, Dec, time), which we can represent efficiently through a hierarchical triangular mesh (HTM) for the spatial dimensions, plus a contiguous time interval. In this space, an asteroid occupies a curve, which we determine by integrating its orbit into the past. Thus when an asteroid trajectory intercepts the volume of an archived image, we have a possible observation of that body. Our pipeline then looks in the archive's catalog for a source with the corresponding coordinates, to retrieve its photometry. All these matches are stored in a database, which can be queried by object identifier. This database consists of archived observations of known Solar System objects. This means that it grows not only from the ingestion of new images, but also from the growth in the number of known objects. As new bodies are discovered, our pipeline can find archived observations where they could have been recorded, providing colors for these newly-found objects. This growth becomes more relevant with the new generation of wide-field surveys, particularly LSST. We also present one use case of our prototype archive: after ingesting the metadata for SDSS, 2MASS and GALEX, we were able to identify serendipitous observations of Solar System bodies in these 3 archives. Cross-matching these occurrences provided us with colors from the UV to the IR, a much wider spectral range than that commonly used for asteroid taxonomy. We present here archive-derived spectrophotometry from searching for 440 thousand asteroids, from 0.3 to 3 µm. In the future we will expand to other archives, including HST, Spitzer, WISE and Pan-STARRS.
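The footprint-versus-trajectory test can be illustrated in simplified form: treat an image as a sky circle plus a time interval, an asteroid as a time-sampled ephemeris, and report any ephemeris point that falls inside both. The real system uses an HTM index; this brute-force version is for illustration only.

```python
# Match a time-sampled asteroid ephemeris against image footprints.
import math
from dataclasses import dataclass


@dataclass
class ImageFootprint:
    image_id: str
    ra: float          # degrees, field centre
    dec: float
    radius: float      # degrees
    t_start: float     # e.g. MJD
    t_end: float


def angular_sep_deg(ra1, dec1, ra2, dec2) -> float:
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(dec1) * math.sin(dec2)
               + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))


def possible_observations(images, ephemeris):
    """ephemeris: iterable of (t, ra, dec) for one object, integrated into the past."""
    hits = []
    for t, ra, dec in ephemeris:
        for img in images:
            if img.t_start <= t <= img.t_end and \
               angular_sep_deg(ra, dec, img.ra, img.dec) <= img.radius:
                hits.append((img.image_id, t))
    return hits   # candidates, to be cross-matched against the archive's source catalog
```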
The GIK-Archive of sediment core radiographs with documentation
NASA Astrophysics Data System (ADS)
Grobe, Hannes; Winn, Kyaw; Werner, Friedrich; Driemel, Amelie; Schumacher, Stefanie; Sieger, Rainer
2017-12-01
The GIK-Archive of radiographs is a collection of X-ray negative and photographic images of sediment cores based on exposures taken since the early 1960s. During four decades of marine geological work at the University of Kiel, Germany, several thousand hours of sampling, careful preparation and X-raying were spent on producing a unique archive of sediment radiographs from several parts of the World Ocean. The archive consists of more than 18 500 exposures on chemical film that were digitized, geo-referenced, supplemented with metadata and archived in the data library PANGAEA®. With this publication, the images have become available open-access for use by the scientific community at https://doi.org/10.1594/PANGAEA.854841.
Calderon, Karynna; Dadisman, Shawn V.; Tihansky, Ann B.; Lewelling, Bill R.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.; Harrison, Arnell S.
2006-01-01
In October and November of 1995 and February of 1996, the U.S. Geological Survey, in cooperation with the Southwest Florida Water Management District, conducted geophysical surveys of the Peace River in west-central Florida from east of Bartow to west of Arcadia. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, observers' logbooks, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Automated extraction of radiation dose information for CT examinations.
Cook, Tessa S; Zimmerman, Stefan; Maidment, Andrew D A; Kim, Woojin; Boonn, William W
2010-11-01
Exposure to radiation as a result of medical imaging is currently in the spotlight, receiving attention from Congress as well as the lay press. Although scanner manufacturers are moving toward including effective dose information in the Digital Imaging and Communications in Medicine headers of imaging studies, there is a vast repository of retrospective CT data at every imaging center that stores dose information in an image-based dose sheet. As such, it is difficult for imaging centers to participate in the ACR's Dose Index Registry. The authors have designed an automated extraction system to query their PACS archive and parse CT examinations to extract the dose information stored in each dose sheet. First, an open-source optical character recognition program processes each dose sheet and converts the information to American Standard Code for Information Interchange (ASCII) text. Each text file is parsed, and radiation dose information is extracted and stored in a database which can be queried using an existing pathology and radiology enterprise search tool. Using this automated extraction pipeline, it is possible to perform dose analysis on the >800,000 CT examinations in the PACS archive and generate dose reports for all of these patients. It is also possible to more effectively educate technologists, radiologists, and referring physicians about exposure to radiation from CT by generating report cards for interpreted and performed studies. The automated extraction pipeline enables compliance with the ACR's reporting guidelines and greater awareness of radiation dose to patients, thus resulting in improved patient care and management. Copyright © 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.
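A hedged sketch of the extraction idea follows: OCR a dose-sheet image, pull CTDIvol and DLP values with a regular expression, and store them for querying. The paper names only "an open-source optical character recognition program"; tesseract/pytesseract and the regex below are assumptions for illustration.

```python
# OCR a CT dose sheet and store extracted dose values in a small database.
import re
import sqlite3

import pytesseract            # pip install pytesseract (requires the tesseract binary)
from PIL import Image

DOSE_PATTERN = re.compile(r"CTDIvol\D+([\d.]+)\D+DLP\D+([\d.]+)", re.IGNORECASE)


def extract_dose(dose_sheet_png: str, accession: str, db: str = "dose.db") -> None:
    text = pytesseract.image_to_string(Image.open(dose_sheet_png))
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE IF NOT EXISTS doses (accession TEXT, ctdivol REAL, dlp REAL)")
    for ctdivol, dlp in DOSE_PATTERN.findall(text):
        con.execute("INSERT INTO doses VALUES (?, ?, ?)", (accession, float(ctdivol), float(dlp)))
    con.commit()
    con.close()
```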
NASA Astrophysics Data System (ADS)
Oswald, Helmut; Mueller-Jones, Kay; Builtjes, Jan; Fleck, Eckart
1998-07-01
Developments in information technologies -- computer hardware, networking, and storage media -- have led to expectations that these advances make it possible to replace 35 mm film completely with digital techniques in the catheter laboratory. Besides its role as an archival medium, cine film is used as the major image review and exchange medium in cardiology. None of today's technologies can completely fulfill the requirements for replacing cine film. One of the major drawbacks of cine film is that it can be accessed at only a single time and location. For the four catheter laboratories in our institutions we have designed a complementary concept combining the CD-R, also called CD-medical, as a single-patient storage and exchange medium, with a digital archive for on-line access and image review of selected frames or short sequences on adequate medical workstations. The image data from various modalities, as well as all digital documents relating to a patient, are part of an electronic patient record. Access, processing, and display of documents are supported by an integrated medical application.
NASA Technical Reports Server (NTRS)
Green, James L.
1989-01-01
The National Space Science Data Center (NSSDC), established in 1966, is the largest archive for processed data from NASA's space and Earth science missions. The NSSDC manages over 120,000 data tapes with over 4,000 data sets. The size of the digital archive is approximately 6,000 gigabytes, with all of this data in its original uncompressed form. By 1995 the NSSDC digital archive is expected to more than quadruple in size, reaching over 28,000 gigabytes. The NSSDC is beginning several thrusts allowing it to better serve the scientific community and keep up with managing the ever-increasing volumes of data. These thrusts involve managing larger and larger amounts of information and data online, employing mass storage techniques, and using low-rate communications networks to move requested data to remote sites in the United States, Europe, and Canada. The success of these thrusts, combined with the tremendous volume of data expected to be archived at the NSSDC, clearly indicates that innovative storage and data management solutions must be sought and implemented. Although not presently used, data compression techniques may be a very important tool for managing a large fraction or all of the NSSDC archive in the future. Some future applications would consist of compressing online data in order to have more data readily available, compressing requested data that must be moved over low-rate ground networks, and compressing all the digital data in the NSSDC archive for a cost-effective backup that would be used only in the event of a disaster.
Intrahospital teleradiology from the emergency room
NASA Astrophysics Data System (ADS)
Fuhrman, Carl R.; Slasky, B. S.; Gur, David; Lattner, Stefanie; Herron, John M.; Plunkett, Michael B.; Towers, Jeffrey D.; Thaete, F. Leland
1993-09-01
Off-hour operation of the modern emergency room presents a challenge to conventional image management systems. To assess the utility of intrahospital teleradiology systems from the emergency room (ER), we installed a high-resolution film digitizer which was interfaced to a central archive and to a workstation at the main reading room. The system was designed to allow for digitization of images as soon as the films were processed. Digitized images were autorouted to both destinations, and digitized images could be laser printed (if desired). Almost real-time interpretations of nonselected cases were performed at both locations (conventional film in the ER and a workstation in the main reading room), and an analysis of disagreements was performed. Our results demonstrate that in spite of a 'significant' difference in reporting, 'clinically significant differences' were found in less than 5% of cases. Folder management issues, preprocessing, image orientation, and setting reasonable lookup tables for display were identified as the main limitations to the system's routine use in a busy environment. The main limitation of the conventional film was the identification of subtle abnormalities in the bright regions of the film. Once identified on either system (conventional film or soft display), all abnormalities were visible and detectable on both display modalities.
Lee, Jasper; Zhang, Jianguo; Park, Ryan; Dagliyan, Grant; Liu, Brent; Huang, H K
2012-07-01
A Molecular Imaging Data Grid (MIDG) was developed to address current informatics challenges in archival, sharing, search, and distribution of preclinical imaging studies between animal imaging facilities and investigator sites. This manuscript presents a 2nd generation MIDG replacing the Globus Toolkit with a new system architecture that implements the IHE XDS-i integration profile. Implementation and evaluation were conducted using a 3-site interdisciplinary test-bed at the University of Southern California. The 2nd generation MIDG design architecture replaces the initial design's Globus Toolkit with dedicated web services and XML-based messaging for dedicated management and delivery of multi-modality DICOM imaging datasets. The Cross-enterprise Document Sharing for Imaging (XDS-i) integration profile from the field of enterprise radiology informatics was adopted into the MIDG design because streamlined image registration, management, and distribution dataflow are likewise needed in preclinical imaging informatics systems as in enterprise PACS application. Implementation of the MIDG is demonstrated at the University of Southern California Molecular Imaging Center (MIC) and two other sites with specified hardware, software, and network bandwidth. Evaluation of the MIDG involves data upload, download, and fault-tolerance testing scenarios using multi-modality animal imaging datasets collected at the USC Molecular Imaging Center. The upload, download, and fault-tolerance tests of the MIDG were performed multiple times using 12 collected animal study datasets. Upload and download times demonstrated reproducibility and improved real-world performance. Fault-tolerance tests showed that automated failover between Grid Node Servers has minimal impact on normal download times. Building upon the 1st generation concepts and experiences, the 2nd generation MIDG system improves accessibility of disparate animal-model molecular imaging datasets to users outside a molecular imaging facility's LAN using a new architecture, dataflow, and dedicated DICOM-based management web services. Productivity and efficiency of preclinical research for translational sciences investigators has been further streamlined for multi-center study data registration, management, and distribution.
MODIS. Volume 1: MODIS level 1A software baseline requirements
NASA Technical Reports Server (NTRS)
Masuoka, Edward; Fleig, Albert; Ardanuy, Philip; Goff, Thomas; Carpenter, Lloyd; Solomon, Carl; Storey, James
1994-01-01
This document describes the level 1A software requirements for the moderate resolution imaging spectroradiometer (MODIS) instrument. This includes internal and external requirements. Internal requirements include functional, operational, and data processing as well as performance, quality, safety, and security engineering requirements. External requirements include those imposed by data archive and distribution systems (DADS); scheduling, control, monitoring, and accounting (SCMA); product management (PM) system; MODIS log; and product generation system (PGS). Implementation constraints and requirements for adapting the software to the physical environment are also included.
Web Archiving for the Rest of Us: How to Collect and Manage Websites Using Free and Easy Software
ERIC Educational Resources Information Center
Dunn, Katharine; Szydlowski, Nick
2009-01-01
Large-scale projects such as the Internet Archive (www.archive.org) send out crawlers to gather snapshots of much of the web. This massive collection of archived websites may include content of interest to one's patrons. But if librarians want to control exactly when and what is archived, relying on someone else to do the archiving is not ideal.…
Digital echocardiography 2002: now is the time
NASA Technical Reports Server (NTRS)
Thomas, James D.; Greenberg, Neil L.; Garcia, Mario J.
2002-01-01
The ability to acquire echocardiographic images digitally, store and transfer these data using the DICOM standard, and routinely analyze examinations exists today and allows the implementation of a digital echocardiography laboratory. The purpose of this review article is to outline the critical components of a digital echocardiography laboratory, discuss general strategies for implementation, and put forth some of the pitfalls that we have encountered in our own implementation. The major components of the digital laboratory include (1) digital echocardiography machines with network output, (2) a switched high-speed network, (3) a high throughput server with abundant local storage, (4) a reliable low-cost archive, (5) software to manage information, and (6) support mechanisms for software and hardware. Implementation strategies can vary from a complete vendor solution providing all components (hardware, software, support), to a strategy similar to our own where standard computer and networking hardware are used with specialized software for management of image and measurement information.
Integration of XNAT/PACS, DICOM, and Research Software for Automated Multi-modal Image Analysis.
Gao, Yurui; Burns, Scott S; Lauzon, Carolyn B; Fong, Andrew E; James, Terry A; Lubar, Joel F; Thatcher, Robert W; Twillie, David A; Wirt, Michael D; Zola, Marc A; Logan, Bret W; Anderson, Adam W; Landman, Bennett A
2013-03-29
Traumatic brain injury (TBI) is an increasingly important public health concern. While there are several promising avenues of intervention, clinical assessments are relatively coarse and comparative quantitative analysis is an emerging field. Imaging data provide potentially useful information for evaluating TBI across functional, structural, and microstructural phenotypes. Integration and management of disparate data types are major obstacles. In a multi-institution collaboration, we are collecting electroencephalography (EEG), structural MRI, diffusion tensor MRI (DTI), and single photon emission computed tomography (SPECT) from a large cohort of US Army service members exposed to mild or moderate TBI who are undergoing experimental treatment. We have constructed a robust informatics backbone for this project centered on the DICOM standard and the eXtensible Neuroimaging Archive Toolkit (XNAT) server. Herein, we discuss (1) optimization of data transmission, validation and storage, (2) quality assurance and workflow management, and (3) integration of high performance computing with research software.
Integration of XNAT/PACS, DICOM, and research software for automated multi-modal image analysis
NASA Astrophysics Data System (ADS)
Gao, Yurui; Burns, Scott S.; Lauzon, Carolyn B.; Fong, Andrew E.; James, Terry A.; Lubar, Joel F.; Thatcher, Robert W.; Twillie, David A.; Wirt, Michael D.; Zola, Marc A.; Logan, Bret W.; Anderson, Adam W.; Landman, Bennett A.
2013-03-01
Traumatic brain injury (TBI) is an increasingly important public health concern. While there are several promising avenues of intervention, clinical assessments are relatively coarse and comparative quantitative analysis is an emerging field. Imaging data provide potentially useful information for evaluating TBI across functional, structural, and microstructural phenotypes. Integration and management of disparate data types are major obstacles. In a multi-institution collaboration, we are collecting electroencephalography (EEG), structural MRI, diffusion tensor MRI (DTI), and single photon emission computed tomography (SPECT) from a large cohort of US Army service members exposed to mild or moderate TBI who are undergoing experimental treatment. We have constructed a robust informatics backbone for this project centered on the DICOM standard and the eXtensible Neuroimaging Archive Toolkit (XNAT) server. Herein, we discuss (1) optimization of data transmission, validation and storage, (2) quality assurance and workflow management, and (3) integration of high performance computing with research software.
Integration of XNAT/PACS, DICOM, and Research Software for Automated Multi-modal Image Analysis
Gao, Yurui; Burns, Scott S.; Lauzon, Carolyn B.; Fong, Andrew E.; James, Terry A.; Lubar, Joel F.; Thatcher, Robert W.; Twillie, David A.; Wirt, Michael D.; Zola, Marc A.; Logan, Bret W.; Anderson, Adam W.; Landman, Bennett A.
2013-01-01
Traumatic brain injury (TBI) is an increasingly important public health concern. While there are several promising avenues of intervention, clinical assessments are relatively coarse and comparative quantitative analysis is an emerging field. Imaging data provide potentially useful information for evaluating TBI across functional, structural, and microstructural phenotypes. Integration and management of disparate data types are major obstacles. In a multi-institution collaboration, we are collecting electroencephalography (EEG), structural MRI, diffusion tensor MRI (DTI), and single photon emission computed tomography (SPECT) from a large cohort of US Army service members exposed to mild or moderate TBI who are undergoing experimental treatment. We have constructed a robust informatics backbone for this project centered on the DICOM standard and the eXtensible Neuroimaging Archive Toolkit (XNAT) server. Herein, we discuss (1) optimization of data transmission, validation and storage, (2) quality assurance and workflow management, and (3) integration of high performance computing with research software. PMID:24386548
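Much of the plumbing such an informatics backbone automates is straightforward REST traffic against the XNAT server. The sketch below enumerates projects and imaging sessions via XNAT's generic REST layout; the URL, credentials, and JSON envelope shown are assumptions for illustration, not this project's configuration.

```python
# List projects and experiments from an XNAT server via its REST interface.
import requests


def list_projects(base_url: str, user: str, password: str) -> list[dict]:
    resp = requests.get(f"{base_url}/data/projects",
                        params={"format": "json"},
                        auth=(user, password), timeout=60)
    resp.raise_for_status()
    return resp.json()["ResultSet"]["Result"]   # typical XNAT JSON envelope


def list_experiments(base_url: str, user: str, password: str, project: str) -> list[dict]:
    resp = requests.get(f"{base_url}/data/projects/{project}/experiments",
                        params={"format": "json"},
                        auth=(user, password), timeout=60)
    resp.raise_for_status()
    return resp.json()["ResultSet"]["Result"]
```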
Landsat View: Pearl River Delta, China
2017-12-08
In 1979, China established two special economic zones around the Pearl River Delta, north of Hong Kong. This image, taken by Landsat 3 on October 19, 1973, shows that the region was rural when the zone was established. Plant-covered land, which is red in this false-color image, dominates the scene. Square grids are agriculture. By January 10, 2003, when Landsat 7 took this image, the Pearl River Delta was a densely populated urban corridor with several large cities. The urban areas are gray in this image. The region is a major manufacturing center with an economy the size of Taiwan's. As of 2010, the Pearl River Economic Zone had a population of 36 million people. NASA and the U.S. Department of the Interior through the U.S. Geological Survey (USGS) jointly manage Landsat, and the USGS preserves a 40-year archive of Landsat images that is freely available over the Internet. The next Landsat satellite, now known as the Landsat Data Continuity Mission (LDCM) and later to be called Landsat 8, is scheduled for launch in 2013. In honor of Landsat's 40th anniversary in July 2012, the USGS released the LandsatLook viewer, a quick, simple way to go forward and backward in time, pulling images of anywhere in the world out of the Landsat archive.
NASA Astrophysics Data System (ADS)
Dlesk, A.; Raeva, P.; Vach, K.
2018-05-01
Processing of analog photogrammetric negatives using current methods brings new challenges and possibilities, for example the creation of a 3D model from archival images, which enables comparison of the historical and current states of cultural heritage objects. The main purpose of this paper is to present the possibilities of processing archival analog images captured by the photogrammetric camera Rollei 6006 metric. In 1994, the Czech company EuroGV s.r.o. carried out photogrammetric measurements of the former limestone quarry the Great America, located in the Central Bohemian Region of the Czech Republic. All the negatives of the photogrammetric images, the complete documentation, the coordinates of geodetically measured ground control points, the calibration reports, and the external orientation of the images calculated in the Combined Adjustment Program are preserved and were available for the current processing. The negatives were scanned and processed using the structure-from-motion (SfM) method. The result of the research is a statement of what accuracy can be expected when Rollei metric images originally obtained for terrestrial intersection photogrammetry are processed according to the proposed methodology.
Diagnostic report acquisition unit for the Mayo/IBM PACS project
NASA Astrophysics Data System (ADS)
Brooks, Everett G.; Rothman, Melvyn L.
1991-07-01
The Mayo Clinic and IBM Rochester have jointly developed a picture archive and control system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. One of the challenges of developing a useful PACS involves integrating the diagnostic reports with the electronic images so they can be displayed simultaneously. By the time a diagnostic report is generated for a particular case, its images have already been captured and archived by the PACS. To integrate the report with the images, the authors have developed an IBM Personal System/2 computer (PS/2) based diagnostic report acquisition unit (RAU). A typed copy of the report is transmitted via facsimile to the RAU where it is stacked electronically with other reports that have been sent previously but not yet processed. By processing these reports at the RAU, the information they contain is integrated with the image database and a copy of the report is archived electronically on an IBM Application System/400 computer (AS/400). When a user requests a set of images for viewing, the report is automatically integrated with the image data. By using a hot key, the user can toggle on/off the report on the display screen. This report describes process, hardware, and software employed to integrate the diagnostic report information into the PACS, including how the report images are captured, transmitted, and entered into the AS/400 database. Also described is how the archived reports and their associated medical images are located and merged for retrieval and display. The methods used to detect and process error conditions are also discussed.
Visual analytics for semantic queries of TerraSAR-X image content
NASA Astrophysics Data System (ADS)
Espinoza-Molina, Daniela; Alonso, Kevin; Datcu, Mihai
2015-10-01
With the continuous image product acquisition of satellite missions, the size of image archives is increasing considerably every day, as are the variety and complexity of their content, surpassing the end-user's capacity to analyse and exploit them. Advances in the image retrieval field have contributed to the development of tools for interactive exploration and extraction of images from huge archives using different parameters such as metadata, keywords, and basic image descriptors. Even though we now have more powerful tools for automated image retrieval and data analysis, we still face the problem of understanding and analyzing the results. Thus, a systematic computational analysis of these results is required in order to provide the end-user with a summary of the archive content in comprehensible terms. In this context, visual analytics combines automated analysis with interactive visualization techniques for effective understanding, reasoning, and decision making on the basis of very large and complex datasets. Moreover, several current research efforts focus on associating the content of images with semantic definitions that describe the data in a form easily understood by the end-user. In this paper, we present our approach for computing visual analytics and semantically querying the TerraSAR-X archive. Our approach is composed of four main steps: (1) the generation of a data model that explains the information contained in a TerraSAR-X product, formed by primitive descriptors and metadata entries; (2) the storage of this model in a database system; (3) the semantic definition of the image content based on machine learning algorithms and relevance feedback; and (4) querying the image archive using semantic descriptors as query parameters and computing the statistical analysis of the query results. The experimental results show that, with the help of visual analytics and semantic definitions, we are able to explain the image content using semantic terms and the relations between them, answering questions such as "What is the percentage of urban area in a region?" or "What is the distribution of water bodies in a city?"
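Once image patches carry semantic labels, archive-level questions of the kind quoted above reduce to simple aggregations. The sketch below answers the urban-area question over an assumed patch table; the table layout and labels are invented for illustration and are not the system's actual storage model.

```python
# Percentage of patches in a region carrying a given semantic label.
import sqlite3


def class_percentage(db: str, region: str, label: str = "urban") -> float:
    con = sqlite3.connect(db)
    total, = con.execute(
        "SELECT COUNT(*) FROM patches WHERE region = ?", (region,)).fetchone()
    labelled, = con.execute(
        "SELECT COUNT(*) FROM patches WHERE region = ? AND label = ?", (region, label)).fetchone()
    con.close()
    return 100.0 * labelled / total if total else 0.0


# e.g. class_percentage("terrasar_index.db", region="Munich", label="urban")
```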
Use of archive aerial photography for monitoring black mangrove populations
USDA-ARS?s Scientific Manuscript database
A study was conducted on the south Texas Gulf Coast to evaluate archive aerial color-infrared (CIR) photography combined with supervised image analysis techniques to quantify changes in black mangrove [Avicennia germinans (L.) L.] populations over a 26-year period. Archive CIR film from two study si...
The Archival Photograph and Its Meaning: Formalisms for Modeling Images
ERIC Educational Resources Information Center
Benson, Allen C.
2009-01-01
This article explores ontological principles and their potential applications in the formal description of archival photographs. Current archival descriptive practices are reviewed and the larger question is addressed: do archivists who are engaged in describing photographs need a more formalized system of representation, or do existing encoding…
NASA Astrophysics Data System (ADS)
Lyons, M. B.; Phinn, S. R.; Roelfsema, C. M.
2011-12-01
Long term global archives of high-moderate spatial resolution, multi-spectral satellite imagery are now readily accessible, but are not being fully utilised by management agencies due to the lack of appropriate methods to consistently produce accurate and timely management ready information. This work developed an object-based approach to map land cover and seagrass distribution in an Australian coastal environment for a 38 year Landsat image time-series archive. Landsat Multi-Spectral Scanner (MSS), Thematic Mapper (TM) and Enhanced Thematic Mapper (ETM+) imagery were used without in-situ field data input to produce land and seagrass cover maps every year data was available, resulting in over 60 individual map products over the 38 year archive. Land cover was mapped annually and included several vegetation, bare ground, urban and agricultural classes. Seagrass distribution was also mapped annually, and in some years monthly, via horizontal projective foliage cover classes, sand and deepwater. Land cover products were validated using aerial photography and seagrass was validated with field survey data, producing several measures of accuracy. An average overall accuracy of 65% and 81% was reported for seagrass and land cover respectively, which is consistent with other studies in the area. This study is the first to show moderate spatial resolution, long term annual changes in land cover and seagrass in an Australian environment, without the use of in-situ data; and only one of a few similar studies globally. The land cover products identify several long term trends; such as significant increases in South East Queensland's urban density, vegetation clearing in rural and rural-residential areas, and inter-annual variation in dry vegetation types in western South East Queensland. The seagrass cover products show that there has been a minimal overall change in seagrass extent, but that seagrass cover level distribution is extremely dynamic; evidenced by large scale migrations of higher seagrass cover levels and several events of sudden, significant changes in cover level. These mapping products will allow management agencies to build a baseline assessment of their resources, understand past changes and help inform implementation and planning of management policy to address potential future changes.
NASA Astrophysics Data System (ADS)
Lyons, Mitchell B.; Phinn, Stuart R.; Roelfsema, Chris M.
2012-07-01
Long term global archives of high-moderate spatial resolution, multi-spectral satellite imagery are now readily accessible, but are not being fully utilised by management agencies due to the lack of appropriate methods to consistently produce accurate and timely management ready information. This work developed an object-based remote sensing approach to map land cover and seagrass distribution in an Australian coastal environment for a 38 year Landsat image time-series archive (1972-2010). Landsat Multi-Spectral Scanner (MSS), Thematic Mapper (TM) and Enhanced Thematic Mapper (ETM+) imagery were used without in situ field data input (but still using field knowledge) to produce land and seagrass cover maps every year data were available, resulting in over 60 map products over the 38 year archive. Land cover was mapped annually using vegetation, bare ground, urban and agricultural classes. Seagrass distribution was also mapped annually, and in some years monthly, via horizontal projected foliage cover classes, sand and deep water. Land cover products were validated using aerial photography and seagrass maps were validated with field survey data, producing several measures of accuracy. An average overall accuracy of 65% and 80% was reported for seagrass and land cover products respectively, which is consistent with other studies in the area. This study is the first to show moderate spatial resolution, long term annual changes in land cover and seagrass in an Australian environment, created without the use of in situ data; and only one of a few similar studies globally. The land cover products identify several long term trends; such as significant increases in South East Queensland's urban density and extent, vegetation clearing in rural and rural-residential areas, and inter-annual variation in dry vegetation types in western South East Queensland. The seagrass cover products show that there has been a minimal overall change in seagrass extent, but that seagrass cover level distribution is extremely dynamic; evidenced by large scale migrations of higher seagrass cover levels and several sudden and significant changes in cover level. These mapping products will allow management agencies to build a baseline assessment of their resources, understand past changes and help inform implementation and planning of management policy to address potential future changes.
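As an illustration of the overall-accuracy measure reported above, the following minimal sketch computes it as the trace of a map-versus-reference confusion matrix divided by the total number of validation samples; the matrix values are invented.

# Minimal sketch of one reported accuracy measure: overall accuracy is the
# trace of the confusion matrix divided by the number of validation samples.
import numpy as np

# rows = mapped class, columns = reference (field/photo) class; invented counts
confusion = np.array([
    [50,  5,  2],
    [ 8, 40,  6],
    [ 3,  4, 32],
])

overall_accuracy = np.trace(confusion) / confusion.sum()
print(f"overall accuracy = {overall_accuracy:.1%}")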
Performance of the Mayo-IBM PAC system
NASA Astrophysics Data System (ADS)
Persons, Kenneth R.; Reardon, Frank J.; Gehring, Dale G.; Hangiandreou, Nicholas J.
1994-05-01
The Mayo Clinic and IBM (at Rochester, Minnesota) have jointly developed a picture archiving system for use with Mayo's MRI and CT imaging modalities. This PACS is made up of over 50 computers that work cooperatively to provide archival, retrieval and image distribution services for Mayo's Department of Radiology. This paper will examine the performance characteristics of the system.
Landsat: A global land-imaging mission
2012-01-01
Across four decades since 1972, Landsat satellites have continuously acquired space-based images of the Earth's land surface, coastal shallows, and coral reefs. The Landsat Program, a joint effort of the U.S. Geological Survey (USGS) and the National Aeronautics and Space Administration (NASA), was established to routinely gather land imagery from space. NASA develops remote-sensing instruments and spacecraft, then launches and validates the performance of the instruments and satellites. The USGS then assumes ownership and operation of the satellites, in addition to managing all ground reception, data archiving, product generation, and distribution. The result of this program is a long-term record of natural and human-induced changes on the global landscape.
How to make deposition of images a reality
Guss, J. Mitchell; McMahon, Brian
2014-01-01
The IUCr Diffraction Data Deposition Working Group is investigating the rationale and policies for routine deposition of diffraction images (and other primary experimental data sets). An information-management framework is described that should inform policy directions, and some of the technical and other issues that need to be addressed in an effort to achieve such a goal are analysed. In the near future, routine data deposition could be encouraged at one of the growing number of institutional repositories that accept data sets or at a generic data-publishing web repository service. To realise all of the potential benefits of depositing diffraction data, specialized archives would be preferable. Funding such an initiative will be challenging. PMID:25286838
Social Science Data Archives and Libraries: A View to the Future.
ERIC Educational Resources Information Center
Clark, Barton M.
1982-01-01
Discusses factors militating against integration of social science data archives and libraries in the near future, noting usage of materials, access, requisite skills of librarians, economic stability of archives, and existing structures which manage social science data archives. Role of librarians, data access tools, and cataloging of machine-readable…
Technologically Enhanced Archival Collections: Using the Buddy System
ERIC Educational Resources Information Center
Holz, Dayna
2006-01-01
Based in the context of challenges faced by archives when managing digital projects, this article explores options of looking outside the existing expertise of archives staff to find collaborative partners. In teaming up with other departments and organizations, the potential scope of traditional archival digitization projects is expanded beyond…
Resources for Archives: Developing Collections, Constituents, Colleagues, and Capital
ERIC Educational Resources Information Center
Primer, Ben
2009-01-01
The essential element for archival success is to be found in the quality of management decisions made and public services provided. Archivists can develop first-class archives operations through understanding the organizational context; planning; hiring, retaining, and developing staff; meeting archival standards for storage and access; and…
Kajimura, Junko; Ito, Reiko; Manley, Nancy R; Hale, Laura P
2016-02-01
Performance of immunofluorescence staining on archival formalin-fixed paraffin-embedded human tissues is generally not considered to be feasible, primarily due to problems with tissue quality and autofluorescence. We report the development and application of procedures that allowed for the study of a unique archive of thymus tissues derived from autopsies of individuals exposed to atomic bomb radiation in Hiroshima, Japan in 1945. Multiple independent treatments were used to minimize autofluorescence and maximize fluorescent antibody signals. Treatments with NH3/EtOH and Sudan Black B were particularly useful in decreasing autofluorescent moieties present in the tissue. Deconvolution microscopy was used to further enhance the signal-to-noise ratios. Together, these techniques provide high-quality single- and dual-color fluorescent images with low background and high contrast from paraffin blocks of thymus tissue that were prepared up to 60 years ago. The resulting high-quality images allow the application of a variety of image analyses to thymus tissues that previously were not accessible. Whereas the procedures presented remain to be tested for other tissue types and archival conditions, the approach described may facilitate greater utilization of older paraffin block archives for modern immunofluorescence studies. © 2016 The Histochemical Society.
Challenges of archiving science data from long duration missions: the Rosetta case
NASA Astrophysics Data System (ADS)
Heather, David
2016-07-01
Rosetta is the first mission designed to orbit and land on a comet. It consists of an orbiter, carrying 11 science experiments, and a lander, called 'Philae', carrying 10 additional instruments. Rosetta was launched on 2 March 2004, and arrived at the comet 67P/Churyumov-Gerasimenko on 6 August 2014. During its long journey, Rosetta has completed flybys of the Earth and Mars, and made two excursions to the main asteroid belt to observe (2867) Steins and (21) Lutetia. On 12 November 2014, the Philae probe soft landed on comet 67P/Churyumov-Gerasimenko, the first time in history that such an extraordinary feat has been achieved. After the landing, the Rosetta orbiter followed the comet through its perihelion in August 2015, and will continue to accompany 67P/Churyumov-Gerasimenko as it recedes from the Sun until the end of the mission. There are significant challenges in managing the science archive of a mission such as Rosetta. The first data were returned from Rosetta more than 10 years ago, and there have been flybys of several planetary bodies, including two asteroids from which significant science data were returned by many of the instruments. The scientific applications for these flyby data can be very different from those for the data taken during the main science phase at the comet, but there are severe limitations on the changes that can be applied to the data pipelines managed by the various science teams as resources are scarce. The priority is clearly on maximising the potential science from the comet phase, so data formats and pipelines have been designed with that in mind, and changes limited to managing issues found during official archiving authority and independent science reviews. In addition, in the time that Rosetta has been operating, the archiving standards themselves have evolved. All Rosetta data are archived following version 3 of NASA's Planetary Data System (PDS) Standards. Currently, new and upcoming planetary science missions are delivering data following the new 'PDS4' standards, which use a very different format and require significant changes to the archive itself to manage. There are no plans at ESA to convert the data to PDS4 formats, but the community may need this to be completed in the long term if we are to realise the full scientific potential of the mission. There is a Memorandum of Understanding between ESA and NASA that commits to there being a full copy of the Rosetta science data holdings both within the Planetary Science Archive (PSA) at ESA and with NASA's Planetary Data System, at the Small Bodies Node (SBN) in Maryland. The requirements from each archiving authority place sometimes contradictory restrictions on the formatting and structure of the data content, and there has also been a significant evolution of the archives on both sides of the Atlantic. The SBN have themselves expressed a desire to 'convert' the Rosetta data to PDS4 formats, so this will need to be carefully managed between the archiving authorities to ensure consistency in the Rosetta archive overall. Validation of the returned data to ensure full compliance with both the PSA and the PDS archives has required the development of a specific tool (DVal) that can be configured to manage the specificities of each instrument team's science data. Unlike the PDS, which comprises an affiliation of 'nodes', each specialising in a planetary science discipline, the PSA is a single archive designed to host data from all of ESA's planetary science missions.
There have been significant challenges in evolving the archive to meet Rosetta's needs as a long-term project, without compromising the service provided to the other ongoing missions. Partly in response to this, the PSA is currently implementing a number of significant changes, both to its web-based interface to the scientific community, and to its database structure. The newly designed PSA will aim to provide easier and more direct access to the Rosetta data (and all of ESA's planetary science data holdings), and will help to soften the impact of some of the issues that have arisen with managing missions such as Rosetta in the existing framework. Conclusions: Development and management of the Rosetta science archive has been a significant challenge, due in part to the long duration of the mission and the corresponding need for development of the archive infrastructure and of the archiving process to manage these changes. The definition of a single set of conventions to manage the diverse suite of instruments, targets and indeed archiving authorities on Rosetta over this time has been a major issue, as has the need to evolve the validation processes that allow the data to be fully ingested and released to the community. This presentation will discuss the many issues faced by the PSA in the archiving of data from Rosetta, and the approach taken to resolve them. Lessons learned will be presented along with recommendations for other archiving authorities who will in future have the need to design and operate a science archive for long duration and international missions.
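A minimal sketch of the kind of label-level check a validation tool performs (not ESA's DVal itself): PDS3 labels are plain-text KEYWORD = VALUE records, so basic compliance can be tested by parsing the label and confirming that a configured set of required keywords is present. The label text and required-keyword list are illustrative only.

# Minimal sketch of a PDS3-style label check; keywords and label are examples.
REQUIRED = {"PDS_VERSION_ID", "PRODUCT_ID", "TARGET_NAME", "START_TIME"}

label_text = """\
PDS_VERSION_ID = PDS3
PRODUCT_ID     = "EXAMPLE_IMG_001"
TARGET_NAME    = "67P/CHURYUMOV-GERASIMENKO 1 (1969 R1)"
START_TIME     = 2014-08-06T10:00:00
END
"""

def parse_label(text):
    """Return a dict of keyword -> value for simple KEYWORD = VALUE lines."""
    keywords = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            keywords[key.strip()] = value.strip().strip('"')
    return keywords

keywords = parse_label(label_text)
missing = REQUIRED - set(keywords)
print("label OK" if not missing else f"missing keywords: {sorted(missing)}")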
Determining the Completeness of the Nimbus Meteorological Data Archive
NASA Technical Reports Server (NTRS)
Johnson, James; Moses, John; Kempler, Steven; Zamkoff, Emily; Al-Jazrawi, Atheer; Gerasimov, Irina; Trivedi, Bhagirath
2011-01-01
NASA launched the Nimbus series of meteorological satellites in the 1960s and 70s. These satellites carried instruments for making observations of the Earth in the visible, infrared, ultraviolet, and microwave wavelengths. The original data archive consisted of a combination of digital data written to 7-track computer tapes and on various film media. Many of these data sets are now being migrated from the old media to the GES DISC modern online archive. The process involves recovering the digital data files from tape as well as scanning images of the data from film strips. Some of the challenges of archiving the Nimbus data include the lack of any metadata from these old data sets. Metadata standards and self-describing data files did not exist at that time, and files were written on now obsolete hardware systems and outdated file formats. This requires creating metadata by reading the contents of the old data files. Some digital data files were corrupted over time, or were possibly improperly copied at the time of creation. Thus there are data gaps in the collections. The film strips were stored in boxes and are now being scanned as JPEG-2000 images. The only information describing these images is what was written on them when they were originally created, and sometimes this information is incomplete or missing. We have the ability to cross-reference the scanned images against the digital data files to determine which of these best represents the data set from the various missions, or to see how complete the data sets are. In this presentation we compared data files and scanned images from the Nimbus-2 High-Resolution Infrared Radiometer (HRIR) for September 1966 to determine whether the data and images are properly archived with correct metadata.
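The cross-referencing idea can be sketched as simple set arithmetic over the observation dates recovered from the two media; the date lists below are invented stand-ins for the Nimbus-2 HRIR holdings.

# Minimal sketch: compare coverage recovered from digital tape files with
# coverage recovered from scanned film to locate remaining gaps.
from datetime import date, timedelta

start = date(1966, 9, 1)
digital_days = {start + timedelta(days=d) for d in range(30) if d not in (4, 5, 17)}
film_days    = {start + timedelta(days=d) for d in range(30) if d != 17}

only_on_film       = sorted(film_days - digital_days)   # recoverable from scans only
only_in_digital    = sorted(digital_days - film_days)
missing_everywhere = sorted({start + timedelta(days=d) for d in range(30)}
                            - (film_days | digital_days))

print(f"{len(only_on_film)} days only on film, "
      f"{len(only_in_digital)} days only in digital data, "
      f"{len(missing_everywhere)} days with no coverage")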
User Driven Image Stacking for ODI Data and Beyond via a Highly Customizable Web Interface
NASA Astrophysics Data System (ADS)
Hayashi, S.; Gopu, A.; Young, M. D.; Kotulla, R.
2015-09-01
While some astronomical archives have begun serving standard calibrated data products, the process of producing stacked images remains a challenge left to the end-user. The benefits of astronomical image stacking are well established, and dither patterns are recommended for almost all observing targets. Some archives automatically produce stacks of limited scientific usefulness without any fine-grained user or operator configurability. In this paper, we present PPA Stack, a web based stacking framework within the ODI - Portal, Pipeline, and Archive system. PPA Stack offers a web user interface with built-in heuristics (based on pointing, filter, and other metadata information) to pre-sort images into a set of likely stacks while still allowing the user or operator complete control over the images and parameters for each of the stacks they wish to produce. The user interface, designed using AngularJS, provides multiple views of the input dataset and parameters, all of which are synchronized in real time. A backend consisting of a Python application optimized for ODI data, wrapped around the SWarp software, handles the execution of stacking workflow jobs on Indiana University's Big Red II supercomputer, and the subsequent ingestion of the combined images back into the PPA archive. PPA Stack is designed to enable seamless integration of other stacking applications in the future, so users can select the most appropriate option for their science.
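The pre-sorting heuristics mentioned above can be illustrated with a small sketch that groups exposures sharing a filter and pointing within a matching radius; the exposure records and the 0.1-degree radius are assumptions for illustration, not the PPA Stack implementation.

# Minimal sketch of metadata-based stack grouping (filter + pointing proximity).
import math

exposures = [  # (name, ra_deg, dec_deg, filter) -- invented records
    ("img1", 150.10, 2.20, "odi_g"), ("img2", 150.11, 2.21, "odi_g"),
    ("img3", 150.10, 2.20, "odi_r"), ("img4", 188.50, 12.40, "odi_g"),
]

def group_into_stacks(exposures, radius_deg=0.1):
    """Greedily group exposures that share a filter and point within radius_deg."""
    stacks = []
    for name, ra, dec, filt in exposures:
        for stack in stacks:
            _, s_ra, s_dec, s_filt = stack[0]
            if filt == s_filt and math.hypot((ra - s_ra) * math.cos(math.radians(dec)),
                                             dec - s_dec) < radius_deg:
                stack.append((name, ra, dec, filt))
                break
        else:
            stacks.append([(name, ra, dec, filt)])
    return stacks

for i, stack in enumerate(group_into_stacks(exposures), 1):
    print(f"stack {i}: {[e[0] for e in stack]}")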
Restoration and PDS Archive of Apollo Lunar Rock Sample Data
NASA Technical Reports Server (NTRS)
Garcia, P. A.; Todd, N. S.; Lofgren, G. E.; Stefanov, W. L.; Runco, S. K.; LaBasse, D.; Gaddis, L. R.
2011-01-01
In 2008, scientists at the Johnson Space Center (JSC) Lunar Sample Laboratory and Image Science & Analysis Laboratory (under the auspices of the Astromaterials Research and Exploration Science Directorate or ARES) began work on a 4-year project to digitize the original film negatives of Apollo Lunar Rock Sample photographs. These rock samples together with lunar regolith and core samples were collected as part of the lander missions for Apollos 11, 12, 14, 15, 16 and 17. The original film negatives are stored at JSC under cryogenic conditions. This effort is data restoration in the truest sense. The images represent the only record available to scientists which allows them to view the rock samples when making a sample request. As the negatives are being scanned, they are also being formatted and documented for permanent archive in the NASA Planetary Data System (PDS) archive. The ARES group is working collaboratively with the Imaging Node of the PDS on the archiving.
ALICE Data Release: A Revaluation of HST-NICMOS Coronagraphic Images
NASA Astrophysics Data System (ADS)
Hagan, J. Brendan; Choquet, Élodie; Soummer, Rémi; Vigan, Arthur
2018-04-01
The Hubble Space Telescope NICMOS instrument was used from 1997 to 2008 to perform coronagraphic observations of about 400 targets. Most of them were part of surveys looking for substellar companions or resolved circumstellar disks to young nearby stars, making the NICMOS coronagraphic archive a valuable database for exoplanet and disk studies. As part of the Archival Legacy Investigations of Circumstellar Environments program, we have consistently reprocessed a large fraction of the NICMOS coronagraphic archive using advanced starlight subtraction methods. We present here the high-level science products of these re-analyzed data, which we delivered back to the community through the Mikulski Archive for Space Telescopes: doi:10.17909/T9W89V. We also present the second version of the HCI-FITS format (for High-Contrast Imaging FITS format), which we developed as a standard format for data exchange of reduced imaging science products. These re-analyzed products are openly available for population statistics studies, characterization of specific targets, or detected point-source identification.
The ISO Data Archive and Interoperability with Other Archives
NASA Astrophysics Data System (ADS)
Salama, Alberto; Arviset, Christophe; Hernández, José; Dowson, John; Osuna, Pedro
ESA's Infrared Space Observatory (ISO), an unprecedented observatory for infrared astronomy launched in November 1995, successfully made nearly 30,000 scientific observations in its 2.5-year mission. The ISO data can be retrieved from the ISO Data Archive (IDA), which comprises about 150,000 observations, including parallel and serendipity mode observations. A user-friendly Java interface permits queries to the database and data retrieval. The interface currently offers a wide variety of links to other archives, such as name resolution with NED and SIMBAD, access to electronic articles from ADS and CDS/VizieR, and access to IRAS data. In the past year development has been focused on improving the IDA interoperability with other astronomical archives, either by accessing other relevant archives or by providing direct access to the ISO data for external services. A mechanism of information transfer has been developed, allowing direct query to the IDA via a Java Server Page, returning quick-look ISO images and relevant, observation-specific information embedded in an HTML page. This method has been used to link from the CDS/VizieR Data Centre and ADS, and work with IPAC to allow access to the ISO Archive from IRSA, including display capabilities of the observed sky regions onto other mission images, is in progress. Prospects for further links to and from other archives and databases are also addressed.
Semiautomated Workflow for Clinically Streamlined Glioma Parametric Response Mapping
Keith, Lauren; Ross, Brian D.; Galbán, Craig J.; Luker, Gary D.; Galbán, Stefanie; Zhao, Binsheng; Guo, Xiaotao; Chenevert, Thomas L.; Hoff, Benjamin A.
2017-01-01
Management of glioblastoma multiforme remains a challenging problem despite recent advances in targeted therapies. Timely assessment of therapeutic agents is hindered by the lack of standard quantitative imaging protocols for determining targeted response. Clinical response assessment for brain tumors is determined by volumetric changes assessed at 10 weeks post-treatment initiation. Further, current clinical criteria fail to use advanced quantitative imaging approaches, such as diffusion and perfusion magnetic resonance imaging. Development of parametric response mapping (PRM) applied to diffusion-weighted magnetic resonance imaging has provided a sensitive and early biomarker of successful cytotoxic therapy in brain tumors while maintaining a spatial context within the tumor. Although PRM provides an earlier readout than volumetry and sometimes greater sensitivity compared with traditional whole-tumor diffusion statistics, it is not routinely used for patient management; automated and standardized software for performing the analysis and generating a clinical report document is required for this. We present a semiautomated and seamless workflow for image coregistration, segmentation, and PRM classification of glioblastoma multiforme diffusion-weighted magnetic resonance imaging scans. The software solution can be integrated using local hardware or performed remotely in the cloud while providing connectivity to existing picture archive and communication systems. This is an important step toward implementing PRM analysis of solid tumors in routine clinical practice. PMID:28286871
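A minimal sketch of the PRM idea, not the authors' software: after coregistration, each tumor voxel is classified by whether its diffusion value increased, decreased, or stayed within a threshold between baseline and follow-up. The threshold and arrays below are placeholders, not clinically validated values.

# Minimal voxel-wise PRM classification sketch with placeholder data.
import numpy as np

rng = np.random.default_rng(0)
baseline  = rng.normal(1.0, 0.2, size=(4, 4))   # stand-in diffusion maps, arbitrary units
followup  = baseline + rng.normal(0.0, 0.3, size=(4, 4))
tumour    = np.ones_like(baseline, dtype=bool)  # stand-in tumour segmentation
threshold = 0.25                                # assumed significance threshold

delta = followup - baseline
prm_increased = (delta >  threshold) & tumour
prm_decreased = (delta < -threshold) & tumour
prm_unchanged = tumour & ~prm_increased & ~prm_decreased

n = tumour.sum()
print(f"PRM: increased {prm_increased.sum()/n:.0%}, decreased {prm_decreased.sum()/n:.0%}, "
      f"unchanged {prm_unchanged.sum()/n:.0%}")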
A Model Curriculum for the Education and Training of Archivists in Automation: A RAMP Study.
ERIC Educational Resources Information Center
Fishbein, M. H.
This RAMP (Records and Archives Management Programme) study is intended for people involved in planning and conducting archival and records management training; for individual archivists and records managers interested in professional development through continuing education programs; and for all information professionals interested in learning of…
NASA Technical Reports Server (NTRS)
Moore, Reagan W.
2004-01-01
The long-term preservation of digital entities requires mechanisms to manage the authenticity of massive data collections that are written to archival storage systems. Preservation environments impose authenticity constraints and manage the evolution of the storage system technology by building infrastructure-independent solutions. This seeming paradox, the need for large archives while avoiding dependence upon vendor-specific solutions, is resolved through the use of data grid technology. Data grids provide the storage repository abstractions that make it possible to migrate collections between vendor-specific products, while ensuring the authenticity of the archived data. Data grids provide the software infrastructure that interfaces vendor-specific storage archives to preservation environments.
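One authenticity mechanism consistent with the abstract can be sketched as checksum (fixity) verification across a migration: record a checksum at ingest and verify it again after the collection moves to a different store. The paths and registry structure are illustrative; actual data grids manage this through their own metadata catalogues.

# Minimal fixity-check sketch: record a checksum at ingest, verify after migration.
import hashlib, shutil, tempfile, os

def sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

src_dir, dst_dir = tempfile.mkdtemp(), tempfile.mkdtemp()
src = os.path.join(src_dir, "record.dat")
with open(src, "wb") as f:
    f.write(b"archived digital entity")

registry = {os.path.basename(src): sha256(src)}        # fixity recorded at ingest

dst = shutil.copy(src, dst_dir)                        # "migration" to a new store
assert sha256(dst) == registry[os.path.basename(dst)]  # authenticity preserved
print("migration verified against recorded checksum")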
[Information management in multicenter studies: the Brazilian longitudinal study for adult health].
Duncan, Bruce Bartholow; Vigo, Álvaro; Hernandez, Émerson; Luft, Vivian Cristine; Ahlert, Hubert; Bergmann, Kaiser; Mota, Eduardo
2013-06-01
Information management in large multicenter studies requires a specialized approach. The Estudo Longitudinal da Saúde do Adulto (ELSA-Brasil - Brazilian Longitudinal Study for Adult Health) has created a Datacenter to enter and manage its data system. The aim of this paper is to describe the steps involved, including the information entry, transmission and management methods. A web system was developed in order to allow, in a safe and confidential way, online data entry, checking and editing, as well as the incorporation of data collected on paper. Additionally, a Picture Archiving and Communication System was implemented and customized for echocardiography and retinography. It stores the images received from the Investigation Centers and makes them available at the Reading Centers. Finally, data extraction and cleaning processes were developed to create databases in formats that enable analyses in multiple statistical packages.
NASA/IPAC Infrared Archive's General Image Cutouts Service
NASA Astrophysics Data System (ADS)
Alexov, A.; Good, J. C.
2006-07-01
The NASA/IPAC Infrared Archive (IRSA) "Cutouts" Service (http://irsa.ipac.caltech.edu/applications/Cutouts) is a general tool for creating small "cutout" FITS images and JPEGs from collections of data archived at IRSA. This service is a companion to IRSA's Atlas tool (http://irsa.ipac.caltech.edu/applications/Atlas/), which currently serves over 25 different data collections of various sizes and complexity and returns entire images for a user-defined region of the sky. The Cutouts Service sits on top of Atlas and extends the Atlas functionality by generating subimages at locations and sizes requested by the user from images already identified by Atlas. These results can be downloaded individually, in batch mode (using the program wget), or as a tar file. Cutouts re-uses IRSA's software architecture along with the publicly available Montage mosaicking tools. The advantages and disadvantages of this approach to generic cutout serving will be discussed.
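The cutout operation itself can be illustrated with Astropy's Cutout2D (the IRSA service is built on its own architecture plus Montage); the synthetic image and pixel position below stand in for an archived FITS image and a user-requested sky position.

# Minimal illustration of the cutout concept; not the IRSA implementation.
import numpy as np
from astropy.nddata import Cutout2D

full_image = np.arange(100 * 100, dtype=float).reshape(100, 100)  # stand-in archive image
position = (42.0, 57.0)   # (x, y) in pixels; a real service would convert RA/Dec via WCS
size = (21, 21)           # cutout height and width in pixels

cutout = Cutout2D(full_image, position, size)
print(cutout.data.shape, cutout.bbox_original)   # 21x21 subimage and its footprint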
36 CFR 1206.32 - What type of proposal is eligible for a records grant?
Code of Federal Regulations, 2010 CFR
2010-07-01
... institutions and organizations in archival and records management; (3) Improving the knowledge, performance... individuals for: (1) Advancing the state of the art in archival and records management and in the long-term...
Upper Klamath Basin Landsat Image for May 30, 2006: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-7 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-7 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-7 on April 15, 1999 marks the addition of the latest satellite to the Landsat series. The Landsat-7 satellite carries the Enhanced Thematic Mapper Plus (ETM+) sensor. A mechanical failure of the ETM+ Scan Line Corrector (SLC) occurred on May 31, 2003, with the result that all Landsat 7 scenes acquired from July 14, 2003 to present have been collected in 'SLC-off' mode. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for April 28, 2006: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-7 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-7 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-7 on April 15, 1999 marks the addition of the latest satellite to the Landsat series. The Landsat-7 satellite carries the Enhanced Thematic Mapper Plus (ETM+) sensor. A mechanical failure of the ETM+ Scan Line Corrector (SLC) occurred on May 31, 2003, with the result that all Landsat 7 scenes acquired from July 14, 2003 to present have been collected in 'SLC-off' mode. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for July 11, 2004: Path 45 Rows 30 and 31
Snyder, Daniel T.
2012-01-01
This image is a mosaic of Landsat-7 images of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-7 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-7 on April 15, 1999 marks the addition of the latest satellite to the Landsat series. The Landsat-7 satellite carries the Enhanced Thematic Mapper Plus (ETM+) sensor. A mechanical failure of the ETM+ Scan Line Corrector (SLC) occurred on May 31, 2003, with the result that all Landsat 7 scenes acquired from July 14, 2003 to present have been collected in 'SLC-off' mode. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Development of public science archive system of Subaru Telescope. 2
NASA Astrophysics Data System (ADS)
Yamamoto, Naotaka; Noda, Sachiyo; Taga, Masatoshi; Ozawa, Tomohiko; Horaguchi, Toshihiro; Okumura, Shin-Ichiro; Furusho, Reiko; Baba, Hajime; Yagi, Masafumi; Yasuda, Naoki; Takata, Tadafumi; Ichikawa, Shin-Ichi
2003-09-01
We report various improvements in a public science archive system, SMOKA (Subaru-Mitaka-Okayama-Kiso Archive system). We have developed a new interface to search observational data of minor bodies in the solar system. In addition, other improvements are also summarized: (1) searching frames by specifying wavelength directly, (2) finding the matching calibration data set automatically, (3) browsing data on weather, humidity, and temperature, which provide information on image quality, (4) providing quick-look images of OHS/CISCO and IRCS, and (5) including the data from OAO HIDES (HIgh Dispersion Echelle Spectrograph).
The global Landsat archive: Status, consolidation, and direction
Wulder, Michael A.; White, Joanne C.; Loveland, Thomas; Woodcock, Curtis; Belward, Alan; Cohen, Warren B.; Fosnight, Eugene A.; Shaw, Jerad; Masek, Jeffery G.; Roy, David P.
2016-01-01
New and previously unimaginable Landsat applications have been fostered by a policy change in 2008 that made analysis-ready Landsat data free and open access. Since 1972, Landsat has been collecting images of the Earth, with the early years of the program constrained by onboard satellite and ground systems, as well as limitations across the range of required computing, networking, and storage capabilities. Rather than robust on-satellite storage for transmission via high-bandwidth downlink to a centralized storage and distribution facility as with Landsat-8, a network of receiving stations, one operated by the U.S. government, the other operated by a community of International Cooperators (ICs), was utilized. ICs paid a fee for the right to receive and distribute Landsat data, and over time more Landsat data was held outside the archive of the United States Geological Survey (USGS) than was held inside, much of it unique. Recognizing the critical value of these data, the USGS began a Landsat Global Archive Consolidation (LGAC) initiative in 2010 to bring these data into a single, universally accessible, centralized global archive, housed at the Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. The primary LGAC goals are to inventory the data held by ICs, acquire the data, and ingest and apply standard ground station processing to generate an L1T analysis-ready product. As of January 1, 2015 there were 5,532,454 images in the USGS archive. LGAC has contributed approximately 3.2 million of those images, more than doubling the original USGS archive holdings. Moreover, an additional 2.3 million images have been identified to date through the LGAC initiative and are in the process of being added to the archive. The impact of LGAC is significant and, in terms of images in the collection, analogous to that of having had two additional Landsat-5 missions. As a result of LGAC, there are regions of the globe that now have markedly improved Landsat data coverage, resulting in an enhanced capacity for mapping, monitoring change, and capturing historic conditions. Although future missions can be planned and implemented, the past cannot be revisited, underscoring the value and enhanced significance of historical Landsat data and the LGAC initiative. The aim of this paper is to report the current status of the global USGS Landsat archive, document the existing and anticipated contributions of LGAC to the archive, and characterize the current acquisitions of Landsat-7 and Landsat-8. Landsat-8 is adding data to the archive at an unprecedented rate as nearly all terrestrial images are now collected. We also offer key lessons learned so far from the LGAC initiative, plus insights regarding other critical elements of the Landsat program looking forward, such as acquisition, continuity, temporal revisit, and the importance of continuing to operationalize the Landsat program.
The State of the Art of Medical Imaging Technology: from Creation to Archive and Back
Gao, Xiaohong W; Qian, Yu; Hui, Rui
2011-01-01
Medical imaging has become deeply embedded in modern medicine and has revolutionized the medical industry in the last 30 years. Stemming from the discovery of the X-ray by Nobel laureate Wilhelm Roentgen, radiology was born, leading to the creation of large quantities of digital images as opposed to film-based media. While this rich supply of images provides immeasurable information that would otherwise not be possible to obtain, medical images pose great challenges in archiving them safe from corruption, loss and misuse, keeping them retrievable from databases of huge sizes with varying forms of metadata, and keeping them reusable when new tools for data mining and new media for data storage become available. This paper provides a summative account of the creation of medical imaging tomography, the development of image archiving systems and the innovation from the existing acquired image data pools. The focus of this paper is on content-based image retrieval (CBIR), in particular for 3D images, which is exemplified by our online e-learning system, MIRAGE, home to a repository of medical images with a variety of domains and dimensions. In terms of novelties, the facilities of CBIR for 3D images, coupled with fully automatic image annotation, have been developed and implemented in the system, resonating with future versatile, flexible and sustainable medical image databases that can reap new innovations. PMID:21915232
Landsat View: Las Vegas, Nevada
2017-12-08
Over the years of the Landsat program, the desert city of Las Vegas has gone through a massive growth spurt. The outward expansion of the city over the last quarter of a century is shown here with two false-color Landsat 5 images (August 3, 1984, and November 2, 2011). The dark purple grid of city streets and the green of irrigated vegetation grow out in every direction into the surrounding desert. These images were created using reflected light from the shortwave infrared, near-infrared, and green portions of the electromagnetic spectrum (Landsat 5 TM bands 7,4,2). NASA and the U.S. Department of the Interior through the U.S. Geological Survey (USGS) jointly manage Landsat, and the USGS preserves a 40-year archive of Landsat images that is freely available over the Internet. The next Landsat satellite, now known as the Landsat Data Continuity Mission (LDCM) and later to be called Landsat 8, is scheduled for launch in 2013. In honor of Landsat's 40th anniversary in July 2012, the USGS released the LandsatLook viewer, a quick, simple way to go forward and backward in time, pulling images of anywhere in the world out of the Landsat archive.
2017-12-08
Between 1985 and 2009, the population of Tehran, Iran, grew from six million to just over seven million. The city's growth was spurred largely by migration from other parts of the country. In addition to being the hub of government and associated public sector jobs, Tehran houses more than half of Iran's industry. Landsat 5 acquired these false-color images of Tehran on August 2, 1985, and July 19, 2009. The city is a web of dark purple lines, vegetation is green and bare ground is pink and tan. The images were created using both infrared and visible light (band combination 7, 4, and 2) to distinguish urban areas from the surrounding desert. NASA and the U.S. Department of the Interior through the U.S. Geological Survey (USGS) jointly manage Landsat, and the USGS preserves a 40-year archive of Landsat images that is freely available over the Internet. The next Landsat satellite, now known as the Landsat Data Continuity Mission (LDCM) and later to be called Landsat 8, is scheduled for launch in 2013. In honor of Landsat's 40th anniversary in July 2012, the USGS released the LandsatLook viewer, a quick, simple way to go forward and backward in time, pulling images of anywhere in the world out of the Landsat archive.
Reiterating "Asylum Archive": Documenting Direct Provision in Ireland
ERIC Educational Resources Information Center
Nedeljkovic, Vukasin
2018-01-01
Originally a coping mechanism for an artist housed in a Direct Provision Centre while seeking asylum in Ireland, "Asylum Archive" has become much more than that. In 2018, it is now a collaborative archive, interactive and intermedial online document, and a scholarly research project. This iteration includes five new images of Railway…
Copple, Susan S.; Jaskowski, Troy D.; Giles, Rashelle; Hill, Harry R.
2014-01-01
Objective. To evaluate NOVA View with a focus on reading archived images versus microscope-based manual interpretation of ANA HEp-2 slides by an experienced, certified medical technologist. Methods. 369 well-defined sera from 44 rheumatoid arthritis, 50 systemic lupus erythematosus, 35 scleroderma, 19 Sjögren's syndrome, and 10 polymyositis patients as well as 99 healthy controls were examined. In addition, 12 defined sera from the Centers for Disease Control and 100 random patient sera sent to ARUP Laboratories for ANA HEp-2 IIF testing were included. Samples were read using the archived images on NOVA View and compared to results obtained from manual reading. Results. At a 1:40/1:80 dilution, the resulting comparison demonstrated 94.8%/92.9% positive, 97.4%/97.4% negative, and 96.5%/96.2% total agreements between manual IIF and NOVA View archived images. Agreement of identifiable patterns between methods was 97%, with PCNA and mixed patterns undetermined. Conclusion. Excellent agreements were obtained between reading archived images on NOVA View and manually on a fluorescent microscope. In addition, workflow benefits were observed which need to be analyzed in future studies. PMID:24741573
NASA Astrophysics Data System (ADS)
Choi, Sang-Hwa; Kim, Sung Dae; Park, Hyuk Min; Lee, SeungHa
2016-04-01
We established and have operated an integrated data system for managing, archiving and sharing marine geology and geophysical data around Korea produced from various research projects and programs at the Korea Institute of Ocean Science & Technology (KIOST). First of all, to keep the data system consistent through continuous data updates, we set up standard operating procedures (SOPs) for data archiving, data processing and conversion, data quality control, data uploading, DB maintenance, etc. The system comprises two databases, ARCHIVE DB and GIS DB. ARCHIVE DB stores archived data in the original forms and formats received from data providers, and GIS DB manages all other compiled, processed and reproduced data and information for data services and GIS application services. A relational database management system, Oracle 11g, was adopted as the DBMS, and open-source GIS technologies were applied for GIS services: OpenLayers for the user interface, GeoServer for the application server, and PostGIS with PostgreSQL for the GIS database. For convenient use of geophysical data in SEG-Y format, a viewer program was developed and embedded in the system. Users can search data through the GIS user interface and save the results as a report.
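For the SEG-Y data mentioned above, a minimal header reader can be sketched using the standard layout (a 3200-byte textual header followed by a 400-byte big-endian binary header, with sample interval, samples per trace and format code at bytes 3217-3218, 3221-3222 and 3225-3226); this is a generic illustration, not the KIOST viewer, and the tiny synthetic file exists only for demonstration.

# Minimal SEG-Y binary-header reader sketch with a synthetic in-memory file.
import struct, io

def read_binary_header(f):
    f.seek(3200)                      # skip the 3200-byte EBCDIC textual header
    binary_header = f.read(400)
    sample_interval_us = struct.unpack(">h", binary_header[16:18])[0]
    samples_per_trace  = struct.unpack(">h", binary_header[20:22])[0]
    format_code        = struct.unpack(">h", binary_header[24:26])[0]
    return sample_interval_us, samples_per_trace, format_code

# Build a minimal stand-in for a SEG-Y file header and read it back.
header = bytearray(3600)
header[3216:3218] = struct.pack(">h", 2000)   # 2000-microsecond sample interval
header[3220:3222] = struct.pack(">h", 1500)   # 1500 samples per trace
header[3224:3226] = struct.pack(">h", 5)      # format code 5 = IEEE float
print(read_binary_header(io.BytesIO(bytes(header))))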
Redundant array of independent disks: practical on-line archiving of nuclear medicine image data.
Lear, J L; Pratt, J P; Trujillo, N
1996-02-01
While various methods for long-term archiving of nuclear medicine image data exist, none support rapid on-line search and retrieval of information. We assembled a 90-Gbyte redundant array of independent disks (RAID) system using ten 9-Gbyte disk drives. The system was connected to a personal computer and software was used to partition the array into 4-Gbyte sections. All studies (50,000) acquired over a 7-year period were archived in the system. Based on patient name/number and study date, information could be located within 20 seconds and retrieved for display and analysis in less than 5 seconds. RAID offers a practical, redundant method for long-term archiving of nuclear medicine studies that supports rapid on-line retrieval.
The challenge of archiving and preserving remotely sensed data
Faundeen, John L.
2003-01-01
Few would question the need to archive the scientific and technical (S&T) data generated by researchers. At a minimum, the data are needed for change analysis. Likewise, most people would value efforts to ensure the preservation of the archived S&T data. Future generations will use analysis techniques not even considered today. Until recently, archiving and preserving these data were usually accomplished within existing infrastructures and budgets. As the volume of archived data increases, however, organizations charged with archiving S&T data will be increasingly challenged (U.S. General Accounting Office, 2002). The U.S. Geological Survey has had experience in this area and has developed strategies to deal with the mountain of land remote sensing data currently being managed and the tidal wave of expected new data. The Agency has dealt with archiving issues, such as selection criteria, purging, advisory panels, and data access, and has met with preservation challenges involving photographic and digital media. That experience has allowed the USGS to develop management approaches, which this paper outlines.
Archived data management systems : a cross-cutting study : linking operations and planning data
DOT National Transportation Integrated Search
2005-12-01
This report examines five transportation agencies that have established and are operating successful ADMSs (Archived Data Management Systems), and one that is on the verge of becoming fully operational. This study discusses the design choices, operat...
Contrast in Terahertz Images of Archival Documents—Part II: Influence of Topographic Features
NASA Astrophysics Data System (ADS)
Bardon, Tiphaine; May, Robert K.; Taday, Philip F.; Strlič, Matija
2017-04-01
We investigate the potential of terahertz time-domain imaging in reflection mode to reveal archival information in documents in a non-invasive way. In particular, this study explores the parameters and signal processing tools that can be used to produce well-contrasted terahertz images of topographic features commonly found in archival documents, such as indentations left by a writing tool, as well as sieve lines. While the amplitude of the waveforms at a specific time delay can provide the most contrasted and legible images of topographic features on flat paper or parchment sheets, this parameter may not be suitable for documents that have a highly irregular surface, such as water- or fire-damaged documents. For analysis of such documents, cross-correlation of the time-domain signals can instead yield images with good contrast. Analysis of the frequency-domain representation of terahertz waveforms can also provide well-contrasted images of topographic features, with improved spatial resolution when utilising high-frequency content. Finally, we point out some of the limitations of these means of analysis for extracting information relating to topographic features of interest from documents.
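The cross-correlation contrast described above can be sketched per pixel: each terahertz time-domain waveform is cross-correlated with a reference waveform and the peak value becomes the image intensity. The synthetic waveform cube below stands in for measured data.

# Minimal per-pixel cross-correlation contrast sketch with synthetic waveforms.
import numpy as np

rng = np.random.default_rng(1)
n_t = 256
t = np.arange(n_t)
reference = np.exp(-((t - 60) / 5.0) ** 2)          # reference reflection pulse

# Synthetic 32x32 scan: pulses with pixel-dependent delay plus noise.
delays = rng.integers(55, 75, size=(32, 32))
cube = np.stack([[np.exp(-((t - d) / 5.0) ** 2) for d in row] for row in delays])
cube += 0.05 * rng.standard_normal(cube.shape)

contrast = np.empty(cube.shape[:2])
for i in range(cube.shape[0]):
    for j in range(cube.shape[1]):
        contrast[i, j] = np.correlate(cube[i, j], reference, mode="full").max()

print(contrast.shape, float(contrast.min()), float(contrast.max()))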
Implementation of an ASP model offsite backup archive for clinical images utilizing Internet 2
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Chao, Sander S.; Documet, Jorge; Lee, Jasper; Lee, Michael; Topic, Ian; Williams, Lanita
2005-04-01
With the development of PACS technology and an increasing demand by medical facilities to become filmless, there is a need for a fast and efficient method of providing data backup for disaster recovery and downtime scenarios. At the Image Processing Informatics Lab (IPI), an ASP Backup Archive was developed using a fault-tolerant server with a T1 connection to serve the PACS at the St. John's Health Center (SJHC) in Santa Monica, California. The ASP archive server has been in clinical operation for more than 18 months, and its performance was presented at this SPIE Conference last year. This paper extends the ASP Backup Archive to serve the PACS at the USC Healthcare Consultation Center II (HCC2) utilizing an Internet2 connection. HCC2 is a new outpatient facility that recently opened in April 2004. The Internet2 connectivity between USC's HCC2 and IPI has been established for over one year. There are two novelties of the current ASP model: 1) the use of Internet2 for daily clinical operation, and 2) the modification of the existing backup archive to handle two sites in the ASP model. This paper presents the evaluation of the ASP Backup Archive based on the following two criteria: 1) the reliability and performance of the Internet2 connection between HCC2 and IPI using DICOM image transfer in a clinical environment, and 2) the ability of the ASP fault-tolerant backup archive to support two separate clinical PACS sites simultaneously. The performance of the T1 and Internet2 connections at the two different sites is also compared.
NASA Astrophysics Data System (ADS)
Civera Lorenzo, Tamara
2017-10-01
Brief presentation about the J-PLUS EDR data access web portal (http://archive.cefca.es/catalogues/jplus-edr), where the different services available to retrieve images and catalogue data are presented. The J-PLUS Early Data Release (EDR) archive includes two types of data: images, and dual and single catalogue data that include parameters measured from the images. The J-PLUS web portal offers catalogue data and images through several different online data access tools or services, each suited to a particular need. The services offered are: coverage map, sky navigator, object visualization, image search, cone search, object list search, and Virtual Observatory services (Simple Cone Search, Simple Image Access Protocol, Simple Spectral Access Protocol, and Table Access Protocol).
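Access through the Simple Cone Search service can be sketched with a plain HTTP query; per the SCS standard the parameters are RA, DEC and SR in degrees and the response is a VOTable. The endpoint URL below is a placeholder, not the actual J-PLUS service address.

# Minimal Simple Cone Search (SCS) query sketch with a placeholder endpoint.
import requests

SCS_ENDPOINT = "https://example.org/jplus-edr/scs"   # placeholder endpoint URL

params = {"RA": 150.0, "DEC": 2.2, "SR": 0.05}       # position and 0.05-degree radius
response = requests.get(SCS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
print(response.headers.get("Content-Type"))          # expect a VOTable (XML) payload
print(response.text[:200])                           # first part of the VOTable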
Integration experiments and performance studies of a COTS parallel archive system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Hsing-bung; Scott, Cody; Grider, Gary
2010-06-16
Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interface, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds such as more caching and less robust semantics. Currently the number of extremely scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner machine, and demonstrated its capability to address requirements of future archival storage systems.
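The notion of moving one large file in parallel pieces can be illustrated generically (the LANL/IBM solution itself drives many tape drives through a global parallel file system and a commercial backup product); chunk size and worker count below are arbitrary demonstration values.

# Generic parallel chunked-copy sketch, not the production archive software.
import os, tempfile
from concurrent.futures import ThreadPoolExecutor

CHUNK = 1 << 20  # 1 MiB chunks

def copy_chunk(src, dst, offset, length):
    with open(src, "rb") as fin, open(dst, "r+b") as fout:
        fin.seek(offset)
        fout.seek(offset)
        fout.write(fin.read(length))

def parallel_copy(src, dst, workers=4):
    size = os.path.getsize(src)
    with open(dst, "wb") as f:        # pre-allocate the destination file
        f.truncate(size)
    offsets = range(0, size, CHUNK)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        pool.map(lambda off: copy_chunk(src, dst, off, min(CHUNK, size - off)), offsets)

src = tempfile.NamedTemporaryFile(delete=False)
src.write(os.urandom(5 * CHUNK + 123)); src.close()
dst = src.name + ".copy"
parallel_copy(src.name, dst)
print(os.path.getsize(dst) == os.path.getsize(src.name))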
NASA Astrophysics Data System (ADS)
Hostache, Renaud; Chini, Marco; Matgen, Patrick; Giustarini, Laura
2013-04-01
There is a clear need for developing innovative processing chains based on Earth observation (EO) data to generate products supporting emergency response and flood management at a global scale. Here an automatic flood mapping application is introduced. It is currently hosted on the Grid Processing on Demand (G-POD) Fast Access to Imagery (Faire) environment of the European Space Agency. The main objective of the online application is to deliver flooded areas using both recent and historical acquisitions of SAR data in an operational framework. The method can be applied to both medium- and high-resolution SAR images. The flood mapping application consists of two main blocks: 1) a set of query tools for selecting the "crisis image" and the optimal corresponding pre-flood "reference image" from the G-POD archive; 2) an algorithm for extracting flooded areas using the previously selected "crisis image" and "reference image". The proposed method is a hybrid methodology that combines histogram thresholding, region growing and change detection to enable automatic, objective and reliable flood extent extraction from SAR images. The method is based on the calibration of a statistical distribution of "open water" backscatter values inferred from SAR images of floods. Change detection with respect to a pre-flood reference image helps reduce over-detection of inundated areas. The algorithms are computationally efficient and operate with minimum data requirements, taking only a flood image and a reference image as input. Stakeholders in flood management and service providers are able to log onto the flood mapping application to get support for retrieving, from the rolling archive, the most appropriate pre-flood reference image. Potential users will also be able to apply the implemented flood delineation algorithm. Case studies of several recent high-magnitude flooding events (e.g. the July 2007 Severn River flood, UK, and the March 2010 Red River flood, US) observed by high-resolution SAR sensors as well as airborne photography highlight advantages and limitations of the online application. A mid-term target is the exploitation of ESA Sentinel-1 SAR data streams. In the long term, a potential extension of the application is foreseen for systematically extracting flooded areas from all SAR images acquired on a daily, weekly or monthly basis. On-going research activities investigate the usefulness of the method for mapping flood hazard at the global scale using databases of historic SAR remote sensing-derived flood inundation maps.
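The abstract's hybrid method combines histogram thresholding of low-backscatter "open water" pixels with change detection against the pre-flood reference image. A heavily simplified sketch of that combination (the thresholds and array names are illustrative placeholders, not the calibrated statistics the paper derives) could look like:

```python
# Simplified illustration of SAR flood mapping by thresholding plus change detection.
# The threshold values are placeholders, not the calibrated "open water" backscatter
# distribution described in the paper.
import numpy as np

def flood_mask(crisis_db, reference_db, water_threshold=-15.0, min_decrease=3.0):
    """Return a boolean flood mask from crisis and pre-flood reference backscatter (dB)."""
    low_backscatter = crisis_db < water_threshold          # dark pixels, candidate open water
    changed = (reference_db - crisis_db) > min_decrease    # backscatter dropped vs. reference
    return low_backscatter & changed

# Example with synthetic 100 x 100 images
rng = np.random.default_rng(0)
reference = rng.normal(-8.0, 2.0, (100, 100))
crisis = reference.copy()
crisis[40:60, 40:60] -= 10.0                               # simulate an inundated patch
print(flood_mask(crisis, reference).sum(), "pixels flagged as flooded")
```

The actual application also applies region growing and calibrates the water threshold from a fitted backscatter distribution, which this toy example omits.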
The Starchive: An open access, open source archive of nearby and young stars and their planets
NASA Astrophysics Data System (ADS)
Tanner, Angelle; Gelino, Chris; Elfeki, Mario
2015-12-01
Historically, astronomers have utilized a piecemeal set of archives such as SIMBAD, the Washington Double Star Catalog, various exoplanet encyclopedias and electronic tables from the literature to cobble together stellar and exo-planetary parameters in the absence of corresponding images and spectra. As the search for planets around young stars through direct imaging, transits and infrared/optical radial velocity surveys blossoms, there is a void in the resources available to create comprehensive lists of the stellar parameters of nearby stars, especially for important parameters such as metallicity and stellar activity indicators. For direct imaging surveys, we need better resources for downloading existing high-contrast images to help confirm new discoveries and find ideal target stars. Once we have discovered new planets, we need a uniform database of stellar and planetary parameters from which to look for correlations, to better understand the formation and evolution of these systems. As a solution to these issues, we are developing the Starchive - an open access stellar archive in the spirit of the open exoplanet catalog, the Kepler Community Follow-up Program and many others. The archive will allow users to download various datasets, upload new images, spectra and metadata, and will contain multiple plotting tools to use in presentations and data interpretations. While we will highly regulate and constantly validate the data being placed into our archive, the open nature of its design is intended to allow the database to be expanded efficiently and to have the level of versatility that is necessary in today's fast-moving, big-data community. Finally, the front-end scripts will be placed on github and users will be encouraged to contribute new plotting tools. Here, I will introduce the community to the content and expected capabilities of the archive and query the audience for community feedback.
Advanced digital image archival system using MPEG technologies
NASA Astrophysics Data System (ADS)
Chang, Wo
2009-08-01
Digital information and records are vital to the human race regardless of the nationalities and eras in which they were produced. Digital image content is produced at a rapid pace: cultural heritage is digitized, scientific and experimental data come from high-speed imaging sensors, governments produce national defense satellite images, hospitals generate medical and healthcare imaging records, and individuals amass personal photo collections from digital cameras. With these massive amounts of precious and irreplaceable data and knowledge, what standard technologies can be applied to preserve them and yet provide an interoperable framework for accessing the data across a variety of systems and devices? This paper presents an advanced digital image archival system that applies the international MPEG standards to preserve digital image content.
Conversion of a traditional image archive into an image resource on compact disc.
Andrew, S M; Benbow, E W
1997-01-01
A traditional archive of pathology images organised on 35 mm slides was converted into a database of images stored on compact disc (CD-ROM), and textual descriptions were added to each image record. Students on a didactic pathology course found this resource useful as an aid to revision, despite relative computer illiteracy, and it is anticipated that students on a new problem-based learning course, which incorporates experience with information technology, will benefit even more readily when they use the database as an educational resource. A text and image database on CD-ROM can be updated repeatedly, and the content manipulated to reflect the content and style of the courses it supports. PMID:9306931
Influence of imaging resolution on color fidelity in digital archiving.
Zhang, Pengchang; Toque, Jay Arre; Ide-Ektessabi, Ari
2015-11-01
Color fidelity is of paramount importance in digital archiving. In this paper, the relationship between color fidelity and imaging resolution was explored by calculating the color difference of an IT8.7/2 color chart with a CIELAB color difference formula for scanning and simulation images. Microscopic spatial sampling was used in selecting the image pixels for the calculations to highlight the loss of color information. A ratio, called the relative imaging definition (RID), was defined to express the correlation between image resolution and color fidelity. The results show that in order for color differences to remain unrecognizable, the imaging resolution should be at least 10 times higher than the physical dimension of the smallest feature in the object being studied.
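For context, the simplest (CIE76) form of the CIELAB color difference used in this kind of comparison is the Euclidean distance in L*a*b* space; the paper does not state which ΔE variant it applies, so the sketch below assumes CIE76:

```python
# Minimal CIE76 color-difference calculation; a sketch of the kind of CIELAB
# Delta E computation the study relies on (the paper may use a different variant).
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two CIELAB colors given as (L*, a*, b*) tuples."""
    return math.sqrt(sum((c2 - c1) ** 2 for c1, c2 in zip(lab1, lab2)))

# A Delta E of roughly 2.3 is often cited as a just-noticeable difference.
print(delta_e_cie76((52.0, 10.0, -3.0), (50.0, 12.0, -1.0)))
```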
Data archiving and serving system implementation in CLEP's GRAS Core System
NASA Astrophysics Data System (ADS)
Zuo, Wei; Zeng, Xingguo; Zhang, Zhoubin; Geng, Liang; Li, Chunlai
2017-04-01
The Ground Research & Applications System (GRAS) is one of the five systems of China's Lunar Exploration Project (CLEP). It is responsible for data acquisition, processing, management and application, and it is also the operations control center for satellite in-orbit and payload operation management. Chang'E-1, Chang'E-2 and Chang'E-3 have collected abundant lunar exploration data. The aim of this work is to present the implementation of data archiving and serving in CLEP's GRAS Core System software. This first approach provides a client-side API and server-side software allowing the creation of a simplified version of the CLEPDB data archiving software, and implements all elements required to complete the data archiving flow from data acquisition to persistent storage. The client side includes all necessary components that run on devices that acquire or produce data, distributing and streaming them to configured remote archiving servers. The server side comprises an archiving service that stores all received data into PDS files. The archiving solution aims at storing data coming from the Data Acquisition Subsystem, the Operation Management Subsystem, the Data Preprocessing Subsystem and the Scientific Application & Research Subsystem. The serving solution aims at serving data to the various business systems, scientific researchers and public users. Data-driven and component-clustering methods were adopted in this system; the former is used to provide real-time data archiving and data persistence services, while the latter is used to keep archiving and serving continuously able to support new data from the Chang'E missions. Meanwhile, this approach saves software development cost as well.
NASA Astrophysics Data System (ADS)
Petitjean, Gilles; de Hauteclocque, Bertrand
2004-06-01
EADS Defence and Security Systems (EADS DS SA) has developed expertise as an integrator of archive management systems for both commercial and defence customers (ESA, CNES, EC, EUMETSAT, French MOD, US DOD, etc.), especially in the Earth Observation and Meteorology fields. The concern of valuable data owners is both the long-term preservation of their data and the integration of the archive into their information system, in particular with efficient access to archived data for their user community. The system integrator answers this requirement with a methodology combining understanding of user needs, exhaustive knowledge of existing hardware and software solutions, and development and integration ability. The system integrator completes the facility development with support activities. The long-term preservation of archived data obviously involves a pertinent selection of storage media and archive library. This selection relies on a storage technology survey, but the selection criteria depend on the analysis of the user needs. Thanks to its knowledge of, and independence from, the storage market, and through analysis of the user requirements, the system integrator will recommend the best compromise for implementing an archive management facility and will provide a solution able to evolve to take advantage of progress in storage technology. But preserving data for the long term is not only a question of storage technology. Some functions are required to secure the archive management system against contingency situations: maintaining multiple data set copies using operational procedures, active quality control of the archived data, and a migration policy that optimises the cost of ownership.
Archives: New Horizons in Astronomy
NASA Astrophysics Data System (ADS)
Bobis, L.; Laurenceau, A.
2010-10-01
The scientific archives in the Paris Observatory's library date back to the XVIIth century. In addition to the preservation and the valorisation of these historic archives, the library is also responsible for the efficient and timely management of contemporary documents to ensure their optimum conservation and identification once they become historical. Oral, iconographic and electronic documents complement these paper archives.
ERIC Educational Resources Information Center
Bratslavsky, Lauren Michelle
2013-01-01
The dissertation offers a historical inquiry about how television's material traces entered archival spaces. Material traces refer to both the moving image products and the assortment of documentation about the processes of television as industrial and creative endeavors. By identifying the development of television-specific archives and…
The Internet as a Medium of Training for Picture Archival and Communication Systems (PACS).
ERIC Educational Resources Information Center
Majid, Shaheen; Misra, Ramesh Kumar
2002-01-01
Explores the potential of Web-based training for PACS (Picture Archival and Communication Systems) used in radiology departments for the storage and archiving of patients' medical images. Reports results of studies in three hospitals in Malaysia, Singapore and the Philippines that showed that the Internet can be used effectively for training.…
Carneggie, David M.; Metz, Gary G.; Draeger, William C.; Thompson, Ralph J.
1991-01-01
The U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center, the national archive for Landsat data, has 20 years of experience in acquiring, archiving, processing, and distributing Landsat and earth science data. The Center is expanding its satellite and earth science data management activities to support the U.S. Global Change Research Program and the National Aeronautics and Space Administration (NASA) Earth Observing System Program. The Center's current and future data management activities focus on land data and include: satellite and earth science data set acquisition, development and archiving; data set preservation, maintenance and conversion to more durable and accessible archive medium; development of an advanced Land Data Information System; development of enhanced data packaging and distribution mechanisms; and data processing, reprocessing, and product generation systems.
USNO Image and Catalog Archive Server - Naval Oceanography Portal
NASA Astrophysics Data System (ADS)
Giampa', Vincenzo; Pasqua, A. Aurora; Petrucci, Olga
2015-04-01
The paper first presents the historical archive of the Cosenza IRPI Section and the historical database that has been built from the data it contains. Then, an application of these data to Catanzaro, the administrative center of the Calabria region (Southern Italy), is presented. The gathering of historical data on past floods and landslides at the Cosenza IRPI Section began in 1996 and is still in progress. In 2005, donations from regional and municipal Public Works offices greatly increased the documental corpus and required more systematic classification and management, which led us to organize the documents into a true historical archive. Documents were sorted according to the municipalities they concern. In this way, a set of documents, maps and images is available for each of the 409 municipalities of Calabria. The collected documents mainly concern damage caused, since the 19th century, by phenomena such as floods, flash floods and landslides triggered by extreme meteorological events, as well as damage caused by strong earthquakes. At the beginning of 2014, the central office of IRPI (Perugia) funded a project aimed at the digitization of the archive and its subsequent publication on a web platform. In this paper, the procedure adopted to build the archive and implement the database is described. Then, the elaboration of the historical series of data on the town of Catanzaro, which has been frequently damaged by rainfall-induced landslides and floods, is also presented. Based on the documents coming from the archive of the Ministry of Public Works and stored in our historical archive, an assessment of the costs of damage to the town's houses during the 20th century has been performed. The research identified the types of phenomena causing the most damage, the municipal sectors most frequently damaged, and the evolution of the damaged areas over the years in relation to increasing urbanization.
Enterprise utilization of "always on-line" diagnostic study archive.
McEnery, Kevin W; Suitor, Charles T; Thompson, Stephen K; Shepard, Jeffrey S; Murphy, William A
2002-01-01
To meet demands for enterprise image distribution, an "always on-line" image storage archive architecture was implemented before soft-copy interpretation. It was presumed that instant availability of historical diagnostic studies would elicit substantial utilization. Beginning November 1, 2000, an enterprise distribution archive was activated (Stentor, San Francisco, CA). As of August 8, 2001, 83,052 studies were available for immediate access without the need for retrieval from long-term archive. Image storage and retrieval logs for the period from June 12, 2001 to August 8, 2001 were analyzed. A total of 41,337 retrieval requests were noted for the 83,052 studies available as of August 8, 2001. Computed radiography represented 16.8% of retrieval requests; digital radiography, 16.9%; computed tomography (CT), 44.5%; magnetic resonance (MR), 19.2%; and ultrasonography, 2.6%. A total of 51.5% of study retrievals were for studies less than 72 hours old. Requests for studies more than 100 days old represented 9.9% of all accessions, 9.7% of CT accessions, and 15.4% of MR accessions. Utilization of the archive indicates that a substantial proportion of study retrievals occur less than 72 hours after study completion. However, significant interest in historical CT and MR examinations was also shown.
Autosophy: an alternative vision for satellite communication, compression, and archiving
NASA Astrophysics Data System (ADS)
Holtz, Klaus; Holtz, Eric; Kalienky, Diana
2006-08-01
Satellite communication and archiving systems are now designed according to an outdated Shannon information theory in which all data is transmitted in meaningless bit streams. Video bit rates, for example, are determined by screen size, color resolution, and scanning rates. The video "content" is irrelevant, so that totally random images require the same bit rates as blank images. An alternative system design, based on the newer Autosophy information theory, is now evolving, which transmits data "content" or "meaning" in a universally compatible 64-bit format. This would allow mixing all multimedia transmissions in the Internet's packet stream. The new system design uses self-assembling data structures, which grow like data crystals or data trees in electronic memories, for both communication and archiving. The advantages for satellite communication and archiving may include: very high lossless image and video compression, unbreakable encryption, resistance to transmission errors, universally compatible data formats, self-organizing error-proof mass memories, immunity to the Internet's Quality of Service problems, and error-proof secure communication protocols. Legacy data transmission formats can be converted by simple software patches or integrated chipsets to be forwarded through any medium - satellites, radio, Internet, cable - without needing to be reformatted. This may result in orders-of-magnitude improvements for all communication and archiving systems.
Building a DAM To Last: Archiving Digital Assets.
ERIC Educational Resources Information Center
Zeichick, Alan
2003-01-01
Discusses archiving digital information and the need for organizations to develop policies regarding digital asset management (DAM) and storage. Topics include determining the value of digital assets; formats of digital information; use of stored information; and system architecture, including hardware and asset management software. (LRW)
Martinez, R; Cole, C; Rozenblit, J; Cook, J F; Chacko, A K
2000-05-01
The US Army Great Plains Regional Medical Command (GPRMC) has a requirement to conform to Department of Defense (DoD) and Army security policies for the Virtual Radiology Environment (VRE) Project. Within the DoD, security policy is defined as the set of laws, rules, and practices that regulate how an organization manages, protects, and distributes sensitive information. Security policy in the DoD is described by the Trusted Computer System Evaluation Criteria (TCSEC), Army Regulation (AR) 380-19, Defense Information Infrastructure Common Operating Environment (DII COE), Military Health Services System Automated Information Systems Security Policy Manual, and National Computer Security Center-TG-005, "Trusted Network Interpretation." These documents were used to develop a security policy that defines information protection requirements that are made with respect to those laws, rules, and practices that are required to protect the information stored and processed in the VRE Project. The goal of the security policy is to provide for a C2-level of information protection while also satisfying the functional needs of the GPRMC's user community. This report summarizes the security policy for the VRE and defines the CORBA security services that satisfy the policy. In the VRE, the information to be protected is embedded into three major information components: (1) Patient information consists of Digital Imaging and Communications in Medicine (DICOM)-formatted fields. The patient information resides in the digital imaging network picture archiving and communication system (DIN-PACS) networks in the database archive systems and includes (a) patient demographics; (b) patient images from x-ray, computed tomography (CT), magnetic resonance imaging (MRI), and ultrasound (US); and (c) prior patient images and related patient history. (2) Meta-Manager information to be protected consists of several data objects. This information is distributed to the Meta-Manager nodes and includes (a) radiologist schedules; (b) modality worklists; (c) routed case information; (d) DIN-PACS and Composite Health Care system (CHCS) messages, and Meta-Manager administrative and security information; and (e) patient case information. (3) Access control and communications security is required in the VRE to control who uses the VRE and Meta-Manager facilities and to secure the messages between VRE components. The CORBA Security Service Specification version 1.5 is designed to allow up to TCSEC's B2-level security for distributed objects. The CORBA Security Service Specification defines the functionality of several security features: identification and authentication, authorization and access control, security auditing, communication security, nonrepudiation, and security administration. This report describes the enhanced security features for the VRE and their implementation using commercial CORBA Security Service software products.
NASA space and Earth science data on CD-ROM
NASA Technical Reports Server (NTRS)
Towheed, Syed S.
1993-01-01
The National Space Science Data Center (NSSDC) is very interested in facilitating the widest possible use of the scientific data acquired through NASA spaceflight missions. Therefore, NSSDC has participated with projects and data management elements throughout the NASA science environment in the creation, archiving, and dissemination of data using Compact Disc-Read Only Memory (CD-ROM). This CD-ROM technology has the potential to enable the dissemination of very large data volumes at very low prices to a great many researchers, students and their teachers, and others. This catalog identifies and describes the scientific CD-ROMs now available from NSSDC, including the following data sets: Einstein Observatory CD-ROM, Galileo Cruise Imaging on CD-ROM, International Halley Watch, IRAS Sky Survey Atlas, Infrared Thermal Mapper (IRTM), Magellan (MIDR), Magellan (ARCDR's), Magellan (GxDR's), Mars Digital Image Map (MDIM), Outer Planets Fields & Particles Data, Pre-Magellan, Selected Astronomical Catalogs, TOMS Gridded Ozone Data, TOMS Ozone Image Data, TOMS Update, Viking Orbiter Images of Mars, and Voyager Image.
Diagnosis and prediction of neuroendocrine liver metastases: a protocol of six systematic reviews.
Arigoni, Stephan; Ignjatovic, Stefan; Sager, Patrizia; Betschart, Jonas; Buerge, Tobias; Wachtl, Josephine; Tschuor, Christoph; Limani, Perparim; Puhan, Milo A; Lesurtel, Mickael; Raptis, Dimitri A; Breitenstein, Stefan
2013-12-23
Patients with hepatic metastases from neuroendocrine tumors (NETs) benefit from an early diagnosis, which is crucial for the optimal therapy and management. Diagnostic procedures include morphological and functional imaging, identification of biomarkers, and biopsy. The aim of six systematic reviews discussed in this study is to assess the predictive value of Ki67 index and other biomarkers, to compare the diagnostic accuracy of morphological and functional imaging, and to define the role of biopsy in the diagnosis and prediction of neuroendocrine tumor liver metastases. An objective group of librarians will provide an electronic search strategy to examine the following databases: MEDLINE, EMBASE and The Cochrane Library (Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials (CENTRAL), Database of Abstracts of Reviews of Effects). There will be no restriction concerning language and publication date. The qualitative and quantitative synthesis of the systematic review will be conducted with randomized controlled trials (RCT), prospective and retrospective comparative cohort studies, and case-control studies. Case series will be collected in a separate database and only used for descriptive purposes. This study is ongoing and presents a protocol of six systematic reviews to elucidate the role of histopathological and biochemical markers, biopsies of the primary tumor and the metastases as well as morphological and functional imaging modalities for the diagnosis and prediction of neuroendocrine liver metastases. These systematic reviews will assess the value and accuracy of several diagnostic modalities in patients with NET liver metastases, and will provide a basis for the development of clinical practice guidelines. The systematic reviews have been prospectively registered with the International Prospective Register of Systematic Reviews (PROSPERO): CRD42012002644; http://www.metaxis.com/prospero/full_doc.asp?RecordID=2644 (Archived by WebCite at http://www.webcitation.org/6LzCLd5sF), CRD42012002647; http://www.metaxis.com/prospero/full_doc.asp?RecordID=2647 (Archived by WebCite at http://www.webcitation.org/6LzCRnZnO), CRD42012002648; http://www.metaxis.com/prospero/full_doc.asp?RecordID=2648 (Archived by WebCite at http://www.webcitation.org/6LzCVeuVR), CRD42012002649; http://www.metaxis.com/prospero/full_doc.asp?RecordID=2649 (Archived by WebCite at http://www.webcitation.org/6LzCZzZWU), CRD42012002650; http://www.metaxis.com/prospero/full_doc.asp?RecordID=2650 (Archived by WebCite at http://www.webcitation.org/6LzDPhGb8), CRD42012002651; http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42012002651#.UrMglPRDuVo (Archived by WebCite at http://www.webcitation.org/6LzClCNff).
Biparametric MRI of the prostate.
Scialpi, Michele; D'Andrea, Alfredo; Martorana, Eugenio; Malaspina, Corrado Maria; Aisa, Maria Cristina; Napoletano, Maria; Orlandi, Emanuele; Rondoni, Valeria; Scialpi, Pietro; Pacchiarini, Diamante; Palladino, Diego; Dragone, Michele; Di Renzo, Giancarlo; Simeone, Annalisa; Bianchi, Giampaolo; Brunese, Luca
2017-12-01
Biparametric magnetic resonance imaging (bpMRI) of the prostate, combining morphologic T2-weighted imaging (T2WI) and diffusion-weighted imaging (DWI), is emerging as an alternative to multiparametric MRI (mpMRI) to detect, localize and guide targeted prostate biopsy in patients with suspected prostate cancer (PCa). BpMRI overcomes some limitations of mpMRI, such as the costs, the time required to perform the study, the use of gadolinium-based contrast agents, and the lack of guidance for the management of score 3 lesions equivocal for significant PCa. In our experience, the optimal and similar clinical results of bpMRI in comparison to mpMRI are essentially related to DWI, which we consider the dominant sequence for detecting suspicious PCa in both the transition and the peripheral zone. In clinical practice, the adoption of a standardized bpMRI scoring system, indicating the likelihood of a clinically significant PCa and establishing the management of each suspicious category (from 1 to 4), could represent the rationale to simplify and improve the current interpretation of mpMRI based on the Prostate Imaging Reporting and Data System version 2 (PI-RADS v2). In this review article we report and describe the current knowledge about bpMRI for the detection of suspicious PCa and a simplified PI-RADS based on bpMRI for the management of each suspicious PCa category, to facilitate communication between radiologists and urologists.
Commission 5: Documentation and Astronomical Data
NASA Astrophysics Data System (ADS)
Ohishi, Masatoshi; Hanisch, Robert J.; Norris, Ray P.; Andernach, Heinz; Bishop, Marsha; Griffin, Elizabeth; Kembhavi, Ajit; Murphy, Tara; Pasian, Fabio
2012-04-01
IAU Commission 5 (http://www.nao.ac.jp/IAU/Com5/) deals with data management issues, and its working groups and task group deal specifically with information handling, with data centers and networks, with technical aspects of collection, archiving, storage and dissemination of data, with designations and classification of astronomical objects, with library services, editorial policies, computer communications, ad hoc methodologies, and with various standards, reference frames, etc. FITS (Flexible Image Transport System), the major data exchange format in astronomy, has been standardized, maintained and updated by the FITS working group under Commission 5.
The DICOM Standard: A Brief Overview
NASA Astrophysics Data System (ADS)
Gibaud, Bernard
The DICOM standard has now become the uncontested standard for the exchange and management of biomedical images. Everyone acknowledges its prominent role in the emergence of multi-vendor Picture Archiving and Communication Systems (PACS), and their successful integration with Hospital Information Systems and Radiology Information Systems, thanks to the Integrating the Healthcare Enterprise (IHE) initiative. We introduce here the basic concepts retained for the definition of objects and services in DICOM, with the hope that it will help the reader to find his or her way in the vast DICOM documentation available on the web.
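As a concrete illustration of working with DICOM objects (not part of the chapter itself, and assuming the widely used pydicom library and a hypothetical file name), reading a few standard attributes looks roughly like:

```python
# Illustrative only: inspecting a DICOM object's attributes with pydicom.
# The file name is hypothetical; pydicom is not mentioned in the chapter itself.
import pydicom

ds = pydicom.dcmread("example_ct_slice.dcm")   # hypothetical DICOM Part 10 file
print(ds.SOPClassUID)                          # identifies the service-object pair class
print(ds.PatientName, ds.Modality, ds.StudyInstanceUID)
print(ds.pixel_array.shape)                    # decoded image pixel data (requires numpy)
```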
Charting the Course: Life Cycle Management of Mars Mission Digital Information
NASA Technical Reports Server (NTRS)
Reiz, Julie M.
2003-01-01
This viewgraph presentation reviews the life cycle management of MER Project information. This process was an essential key to the successful launch of the MER Project rovers. Incorporating digital information archive requirements early in the project life cycle resulted in: the design of an information system that included archive metadata; a reduced risk of information loss through in-process appraisal; easier transfer of project information to the institutional online archive; and appreciation within the project for preserving information for reuse by future projects.
A flexible, open, decentralized system for digital pathology networks.
Schuler, Robert; Smith, David E; Kumaraguruparan, Gowri; Chervenak, Ann; Lewis, Anne D; Hyde, Dallas M; Kesselman, Carl
2012-01-01
High-resolution digital imaging is enabling digital archiving and sharing of digitized microscopy slides and new methods for digital pathology. Collaborative research centers, outsourced medical services, and multi-site organizations stand to benefit from sharing pathology data in a digital pathology network. Yet significant technological challenges remain due to the large size and volume of digitized whole slide images. While information systems do exist for managing local pathology laboratories, they tend to be oriented toward narrow clinical use cases or offer closed ecosystems around proprietary formats. Few solutions exist for networking digital pathology operations. Here we present a system architecture and implementation of a digital pathology network and share results from a production system that federates major research centers.
NASA Astrophysics Data System (ADS)
Swade, Daryl; Bushouse, Howard; Greene, Gretchen; Swam, Michael
2014-07-01
Science data products for James Webb Space Telescope (JWST) observations will be generated by the Data Management Subsystem (DMS) within the JWST Science and Operations Center (S&OC) at the Space Telescope Science Institute (STScI). Data processing pipelines within the DMS will produce uncalibrated and calibrated exposure files, as well as higher level data products that result from combined exposures, such as mosaic images. Information to support the science observations, for example data from engineering telemetry, proposer inputs, and observation planning, will be captured and incorporated into the science data products. All files will be generated in Flexible Image Transport System (FITS) format. The data products will be made available through the Mikulski Archive for Space Telescopes (MAST) and adhere to International Virtual Observatory Alliance (IVOA) standard data protocols.
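Since the products are delivered as FITS files, an archive user could inspect one with astropy; this is a generic sketch with a hypothetical file name, not the actual JWST pipeline product layout:

```python
# Sketch of inspecting a FITS science product; the file name and the presence of
# particular header keywords are assumptions, not the actual JWST product definition.
from astropy.io import fits

with fits.open("jw_example_cal.fits") as hdul:
    hdul.info()                          # list the HDUs (extensions) in the file
    header = hdul[0].header              # primary header with observation metadata
    print(header.get("TELESCOP"), header.get("DATE-OBS"))
```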
Goldstone Tracking the Echo Satelloon.
2016-10-27
This archival image was released as part of a gallery comparing JPL’s past and present, commemorating the 80th anniversary of NASA’s Jet Propulsion Laboratory on Oct. 31, 2016. This photograph shows the first pass of Echo 1, NASA's first communications satellite, over the Goldstone Tracking Station managed by NASA's Jet Propulsion Laboratory, in Pasadena, California, in the early morning of Aug. 12, 1960. The movement of the antenna, star trails (shorter streaks), and Echo 1 (the long streak in the middle) are visible in this image. Project Echo bounced radio signals off a 10-story-high, aluminum-coated balloon orbiting the Earth. This form of "passive" satellite communication -- which mission managers dubbed a "satelloon" -- was an idea conceived by an engineer from NASA's Langley Research Center in Hampton, Virginia, and was a project managed by NASA's Goddard Space Flight Center in Greenbelt, Maryland. JPL's role involved sending and receiving signals through two of its 85-foot-diameter (26-meter-diameter) antennas at the Goldstone Tracking Station in California's Mojave Desert. The Goldstone station later became part of NASA's Deep Space Network. JPL, a division of Caltech in Pasadena, California, manages the Deep Space Network for NASA. http://photojournal.jpl.nasa.gov/catalog/PIA21114
NASA Astrophysics Data System (ADS)
Velasco, Almudena; Gutiérrez, Raúl; Solano, Enrique; García-Torres, Miguel; López, Mauro; Sarro, Luis Manuel
We describe here the main capabilities of the COROT archive. The archive (http://sdc.laeff.inta.es/corotfa/jsp/searchform.jsp), managed at LAEFF in the framework of the Spanish Virtual Observatory (http://svo.laeff.inta.es), has been developed following the standards and requirements defined by IVOA (http://www.ivoa.net). The COROT archive at LAEFF will be publicly available by the end of 2008.
ERIC Educational Resources Information Center
Huvila, Isto
2016-01-01
Introduction: This paper analyses the work practices and perspectives of professionals working with archaeological archives and the social organization of archaeological archiving and information management in Sweden. Method: The paper is based on an interview study of Swedish actors in the field of archaeological archiving (N = 16). Analysis: The…
The challenge of a data storage hierarchy
NASA Technical Reports Server (NTRS)
Ruderman, Michael
1992-01-01
A discussion of Mesa Archival Systems' data archiving system is presented. This data archiving system is strictly a software system that is implemented on a mainframe and manages the movement of data into permanent file storage. Emphasis is placed on the fact that any kind of client system on the network can be connected through the Unix interface of the data archiving system.
Applied imaging at the NASA Lewis Research Center
NASA Astrophysics Data System (ADS)
Slater, Howard A.; Owens, Jay C.
1993-01-01
NASA Lewis Research Center in Cleveland, Ohio has just completed the celebration of its 50th anniversary. 'During the past 50 years, Lewis helped win World War II, made jet aircraft safer and more efficient, helped Americans land on the Moon ... and engaged in the type of fundamental research that benefits all of us in our daily lives.' As part of the center's long history, the Photographic and Printing Branch has continued to develop and meet the center's research imaging requirements. As imaging systems continue to advance and researchers more clearly understand the power of imaging, investigators are relying more and more on imaging systems to meet program objectives. Today, the Photographic and Printing Branch supports a research community of over 5,000 including advocacy for NASA Headquarters and other government agencies. Complete classified and unclassified imaging services include high-speed image acquisition, technical film and video documentaries, still imaging, and conventional and unconventional photofinishing operations. These are the foundation of the branch's modern support function. This paper provides an overview of the varied applied imaging programs managed by the Photographic and Printing Branch. Emphasis is placed on recent imaging projects including icing research, space experiments, and an on-line image archive.
LANDSAT-D data format control book. Volume 6, appendix G: GSFC HDT-AM inventory tape (GHIT-AM)
NASA Technical Reports Server (NTRS)
1981-01-01
The data format specifications of the Goddard HDT inventory tapes (GHITS), which accompany shipments of archival digital multispectral scanner image data (HDT-AM tapes), are defined. The GHIT is a nine-track, 1600-BPI tape which conforms to the ANSI standard and serves as an inventory and description of the image data included in the shipment. The archival MSS tapes (HDT-AMs) contain radiometrically corrected but geometrically uncorrected image data plus certain ancillary data necessary to perform the geometric corrections.
Knowledge-driven information mining in remote-sensing image archives
NASA Astrophysics Data System (ADS)
Datcu, M.; Seidel, K.; D'Elia, S.; Marchetti, P. G.
2002-05-01
Users in all domains require information or information-related services that are focused, concise, reliable, low-cost and timely, and which are provided in forms and formats compatible with the user's own activities. In the current Earth Observation (EO) scenario, the archiving centres generally offer only data, images and other "low level" products. The users' needs are only partially satisfied by a number of usually small value-adding companies applying time-consuming (mostly manual) and expensive processes that rely on the knowledge of experts to extract information from those data or images.
Measurements of 100 'Critical' Minor Planets from NEAT Archive
NASA Astrophysics Data System (ADS)
Deshmukh, Shishir
2017-07-01
Uncertainties associated with the orbits of minor planets can be reduced by analyzing archival imagery, as attempted in the current investigation. Archival images from NEAT and NASA's SkyMorph database were analyzed using standard software to identify the minor planets on the critical list. Findings for each minor planet were submitted to the Minor Planet Center (MPC) to provide better orbital solutions.
Archiving Microgravity Flight Data and Samples
NASA Technical Reports Server (NTRS)
1996-01-01
To obtain help in evaluating its current strategy for archiving data and samples obtained in microgravity research, NASA's Microgravity Science and Applications Division (MSAD) asked the Space Studies Board's Committee on Microgravity Research for guidance on the following questions: What data should be archived and where should it be kept? In what form should the data be maintained (electronic files, photographs, hard copy, samples)? What should the general format of the database be? To what extent should it be universally accessible and through what mechanisms? Should there be a period of time for which principal investigators have proprietary access? If so, how long should proprietary data be stored? What provisions should be made for data obtained from ground-based experiments? What should the deadline be for investigators placing their data in the archive? How long should data be saved? How long should data be easily accessible? As a prelude to making recommendations for optimum selection and storage of microgravity data and samples, the committee in this report briefly describes NASA's past archiving practices and outlines MSAD's current archiving strategy. Although the committee found that only a limited number of experiments have thus far been archived, it concluded that the general archiving strategy, characterized by MSAD as minimalist, appears viable. A central focus of attention is the Experiment Data Management Plan (EDMP), MSAD's recently instituted data management and archiving framework for flight experiments. Many of the report's recommendations are aimed at enhancing the effectiveness of the EDMP approach, which the committee regards as an appropriate data management method for MSAD. Other recommendations provide guidance on broader issues related to the questions listed above. This report does not address statutory or regulatory records retention requirements.
Developing national on-line services to annotate and analyse underwater imagery in a research cloud
NASA Astrophysics Data System (ADS)
Proctor, R.; Langlois, T.; Friedman, A.; Davey, B.
2017-12-01
Fish image annotation data are currently collected by various research, management and academic institutions globally (more than 100,000 hours of deployments), with varying degrees of standardisation and limited formal collaboration or data synthesis. We present a case study of how national on-line services, developed within a domain-oriented research cloud, have been used to annotate habitat images and synthesise fish annotation data sets collected using Autonomous Underwater Vehicles (AUVs) and baited remote underwater stereo-video (stereo-BRUV). Two developing software tools have been brought together in the marine science cloud to provide marine biologists with a powerful service for image annotation. SQUIDLE+ is an online platform designed for exploration, management and annotation of georeferenced images and video data. It provides a flexible annotation framework allowing users to work with their preferred annotation schemes. We have used SQUIDLE+ to sample the habitat composition and complexity of images of the benthos collected using stereo-BRUV. GlobalArchive is designed to be a centralised repository of aquatic ecological survey data, with design principles including ease of use, secure user access, flexible data import, and the collection of any sampling and image analysis information. To easily share and synthesise data we have implemented data sharing protocols, including Open Data and synthesis Collaborations, and a spatial map to explore global datasets and filter them to create a synthesis. Together with a virtual desktop analysis suite offering Python and R environments, these tools in the science cloud offer an unprecedented capability to deliver marine biodiversity information of value to marine managers and scientists alike.
At the Creation: Chaos, Control, and Automation--Commercial Software Development for Archives.
ERIC Educational Resources Information Center
Drr, W. Theodore
1988-01-01
An approach to the design of flexible text-based management systems for archives includes tiers for repository, software, and user management systems. Each tier has four layers--objective, program, result, and interface. Traps awaiting software development companies involve the market, competition, operations, and finance. (10 references) (MES)
High-performance mass storage system for workstations
NASA Technical Reports Server (NTRS)
Chiang, T.; Tang, Y.; Gupta, L.; Cooperman, S.
1993-01-01
Reduced Instruction Set Computer (RISC) workstations and Personal Computers (PCs) are very popular tools for office automation, command and control, scientific analysis, database management, and many other applications. However, when running Input/Output (I/O) intensive applications, RISC workstations and PCs are often overburdened with the tasks of collecting, staging, storing, and distributing data. Also, even with standard high-performance peripherals and storage devices, the I/O function can still be a common bottleneck. Therefore, the high-performance mass storage system, developed by Loral AeroSys' Independent Research and Development (IR&D) engineers, can offload a RISC workstation from I/O-related functions and provide high-performance I/O functions and external interfaces. The high-performance mass storage system has the capability to ingest high-speed real-time data, perform signal or image processing, and stage, archive, and distribute the data. This mass storage system uses a hierarchical storage structure, thus reducing the total data storage cost while maintaining high I/O performance. The high-performance mass storage system is a network of low-cost parallel processors and storage devices. The nodes in the network have special I/O functions such as SCSI controller, Ethernet controller, gateway controller, RS232 controller, IEEE488 controller, and digital/analog converter. The nodes are interconnected through high-speed direct memory access links to form a network. The topology of the network is easily reconfigurable to maximize system throughput for various applications. This high-performance mass storage system takes advantage of a 'busless' architecture for maximum expandability. The mass storage system consists of magnetic disks, a WORM optical disk jukebox, and an 8mm helical scan tape to form a hierarchical storage structure. Commonly used files are kept on magnetic disk for fast retrieval. The optical disks are used as archive media, and the tapes are used as backup media. The storage system is managed by the IEEE Mass Storage Reference Model-based UniTree software package. UniTree keeps track of all files in the system, automatically migrates lesser-used files to archive media, and stages files back when needed by the system. The user can access files without knowledge of their physical location. The high-performance mass storage system developed by Loral AeroSys will significantly boost system I/O performance and reduce the overall data storage cost. This storage system provides a highly flexible and cost-effective architecture for a variety of applications (e.g., real-time data acquisition with signal and image processing requirements, long-term data archiving and distribution, and image analysis and enhancement).
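The hierarchical storage idea described here (keep frequently used files on magnetic disk, migrate colder files to archive media) can be illustrated with a toy migration policy; this sketch is generic, not the UniTree implementation, and its paths and threshold are made up:

```python
# Toy hierarchical-storage migration policy: move files not accessed for N days
# from a "disk" tier to an "archive" tier. Purely illustrative; not UniTree.
import os
import shutil
import time

DISK_TIER = "/data/disk_cache"       # hypothetical fast tier
ARCHIVE_TIER = "/data/archive"       # hypothetical archive tier (stand-in for optical/tape)
MAX_IDLE_DAYS = 30

def migrate_cold_files():
    """Move files whose last access time is older than the cutoff to the archive tier."""
    cutoff = time.time() - MAX_IDLE_DAYS * 86400
    for name in os.listdir(DISK_TIER):
        path = os.path.join(DISK_TIER, name)
        if os.path.isfile(path) and os.stat(path).st_atime < cutoff:
            shutil.move(path, os.path.join(ARCHIVE_TIER, name))
            print("migrated", name)

if __name__ == "__main__":
    migrate_cold_files()
```

A real hierarchical storage manager would also leave a stub or catalog entry behind so the file can be staged back transparently on access, which this toy policy does not attempt.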
Present status and future directions of the Mayo/IBM PACS project
NASA Astrophysics Data System (ADS)
Morin, Richard L.; Forbes, Glenn S.; Gehring, Dale G.; Salutz, James R.; Pavlicek, William
1991-07-01
This joint project began in 1988 and was motivated by the need to develop an alternative to the archival process in place at that time (magnetic tape) for magnetic resonance imaging and neurological computed tomography. In addition, this project was felt to be an important step in gaining the necessary clinical experience for the future implementation of various aspects of electronic imaging. The initial phase of the project was conceived and developed to prove the concept, test the fundamental components, and produce performance measurements for future work. The key functions of this phase centered on attachment of imaging equipment (GE Signa) and archival processes using a non-dedicated (institutionally supplied) local area network (LAN). Attachment of imaging equipment to the LAN was performed using commercially available devices (Ethernet, PS/2, Token Ring). Image data were converted to ACR/NEMA format with retention of the vendor-specific header information. Performance measurements were encouraging and led to the design of the following projects. The second phase has recently been concluded. The major features of this phase have been to greatly expand the network, put the network into clinical use, establish an efficient and useful viewing station, include diagnostic reports in the archive data, provide wide area network (WAN) capability via ISDN, and establish two-way real-time video between remote sites. This phase has heightened both departmental and institutional thought regarding various issues raised by electronic imaging. Much discussion regarding both present and future archival processes has occurred. The use of institutional LAN resources has proven to be adequate for the archival function examined thus far. Experiments to date have shown that use of dedicated resources will be necessary for retrieval activities at even a basic level. This report presents an overview of the background, present status, and future directions of the project.
Landsat View: Istanbul, Turkey
2017-12-08
Istanbul has been a bustling trade city for thousands of years. In this 1975 image, taken by Landsat, the city centers on the Golden Horn, the estuary that flows into the Bosporus Strait at the center of the scene. Shown in false color, vegetation is red, urban areas are gray, and water appears black. A bridge built in 1973 to connect the Asian and European sides of Istanbul is barely visible. By 2011, Istanbul's population had exploded from 2 to 13 million people, and the city has gone through a dramatic expansion. This Landsat 5 image shows densely packed urban areas stretching along the Sea of Marmara and up the Bosporus Strait, where a second bridge built in 1988 now crosses the water. ---- NASA and the U.S. Department of the Interior through the U.S. Geological Survey (USGS) jointly manage Landsat, and the USGS preserves a 40-year archive of Landsat images that is freely available over the Internet. The next Landsat satellite, now known as the Landsat Data Continuity Mission (LDCM) and later to be called Landsat 8, is scheduled for launch in 2013. In honor of Landsat's 40th anniversary in July 2012, the USGS released the LandsatLook viewer - a quick, simple way to go forward and backward in time, pulling images of anywhere in the world out of the Landsat archive.
2017-12-08
Santiago, Chile, ranks among the world's fastest growing cities. Chile is South America's fifth largest economy with strong export and tourism markets. More than a third of Chile's population lives in Santiago as of 2009. Taken on January 9, 1985, and January 30, 2010, this pair of images from the Landsat 5 satellite illustrates the city's steady growth. The images were made with infrared and visible light (Landsat bands 4, 3, and 2) so that plant-covered land is red. Bare or sparsely vegetated land is tan, and the city is dark silver. In the twenty-five years that elapsed between 1985 and 2010, the city expanded away from the Andes Mountains along spoke-like lines, which are major roads. ---- NASA and the U.S. Department of the Interior through the U.S. Geological Survey (USGS) jointly manage Landsat, and the USGS preserves a 40-year archive of Landsat images that is freely available over the Internet. The next Landsat satellite, now known as the Landsat Data Continuity Mission (LDCM) and later to be called Landsat 8, is scheduled for launch in 2013. In honor of Landsat's 40th anniversary in July 2012, the USGS released the LandsatLook viewer - a quick, simple way to go forward and backward in time, pulling images of anywhere in the world out of the Landsat archive.
A Complete Public Archive for the Einstein Imaging Proportional Counter
NASA Technical Reports Server (NTRS)
Helfand, David J.
1996-01-01
Consistent with our proposal to the Astrophysics Data Program in 1992, we have completed the design, construction, documentation, and distribution of a flexible and complete archive of the data collected by the Einstein Imaging Proportional Counter. Along with software and data delivered to the High Energy Astrophysics Science Archive Research Center at Goddard Space Flight Center, we have compiled and, where appropriate, published catalogs of point sources, soft sources, hard sources, extended sources, and transient flares detected in the database along with extensive analyses of the instrument's backgrounds and other anomalies. We include in this document a brief summary of the archive's functionality, a description of the scientific catalogs and other results, a bibliography of publications supported in whole or in part under this contract, and a list of personnel whose pre- and post-doctoral education consisted in part in participation in this project.
Data management at Biosphere 2 center
NASA Technical Reports Server (NTRS)
McCreary, Leone F.
1997-01-01
Throughout the history of Biosphere 2, the collecting and recording of biological data has been sporadic. Currently no active effort to administer and record regular biological surveys is being made. Also, there is no central location, such as an on-site data library, where all records from various studies have been archived. As a research institute, Biosphere 2 depends on good, complete data records, which are at the core of all its scientific endeavors. It is therefore imperative that an effective data management system be implemented within the management and research departments as soon as possible. Establishing this system would require three general phases: (1) design and implement a new archiving/management program (including storage, cataloging and retrieval systems); (2) organize and input baseline and intermediate data from existing archives; and (3) maintain records by inputting new data.
Horton, M C; Lewis, T E; Kinsey, T V
1999-05-01
Prior to June 1997, military picture archiving and communications systems (PACS) were planned, procured, and installed with key decisions on the system, equipment, and even funding sources made through a research and development office called Medical Diagnostic Imaging Systems (MDIS). Beginning in June 1997, the Joint Imaging Technology Project Office (JITPO) initiated a collaborative and consultative process for planning and implementing PACS in military treatment facilities through a new Department of Defense (DoD) contract vehicle called digital imaging networks (DIN)-PACS. The JITPO reengineered this process, incorporating multiple organizations and politics. The reengineered PACS process administered through the JITPO transformed the decision process and accountability from a single office to a consultative method that increased end-user knowledge, responsibility, and ownership of PACS. The JITPO continues to provide information and services that assist multiple groups and users in making PACS planning and implementation decisions. Local site project managers are involved from the outset, and this end-user collaboration has made the sometimes difficult transition to PACS an easier and more acceptable process for all involved. Corporately, this process has saved DoD sites millions by having PACS plans developed within the government first and then proposed to vendors, with vendors responding specifically to those plans. The integrity and efficiency of the process have reduced the opportunity for implementing nonstandard systems while sharing resources and reducing wasted government dollars. This presentation will describe the chronology of changes, obstacles encountered, and lessons learned within the reengineering of the PACS process for DIN-PACS.
Development and implementation of ultrasound picture archiving and communication system
NASA Astrophysics Data System (ADS)
Weinberg, Wolfram S.; Tessler, Franklin N.; Grant, Edward G.; Kangarloo, Hooshang; Huang, H. K.
1990-08-01
The Department of Radiological Sciences at the UCLA School of Medicine is developing a picture archiving and communication system (PACS) for digitized ultrasound images. In its final stage the system will involve the acquisition and archiving of ultrasound studies from four different locations, including the Center for Health Sciences, the Department for Mental Health, and the Outpatient Radiology and Endoscopy Departments, with a total of 200-250 patient studies per week. The concept comprises two stages of image manipulation for each ultrasound work area. The first station is located close to the examination site; it accommodates the acquisition of digital images from up to five ultrasound devices and provides for instantaneous display, primary viewing, and image selection. Completed patient studies are transferred to a main workstation for secondary review, further analysis, and comparison studies. The review station has an on-line storage capacity of 10,000 images with a resolution of 512x512 at 8 bits, allowing immediate retrieval of active patient studies of up to two weeks. The main workstations are connected through the general network and use one central archive for long-term storage and a film printer for hardcopy output. First-phase development efforts concentrate on the implementation and testing of a system at one location, consisting of a number of ultrasound units with video digitizers and network interfaces and a microcomputer workstation as host for the display station with two color monitors, each allowing simultaneous display of four 512x512 images. The discussion emphasizes functionality, performance, and acceptance of the system in the clinical environment.
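A quick check of the on-line capacity quoted above (10,000 images at 512x512 pixels and 8 bits per pixel) shows the scale of storage the review station needed; the arithmetic below follows directly from the figures in the abstract.

    # 10,000 on-line images, 512 x 512 pixels, 8 bits (1 byte) per pixel.
    images = 10_000
    bytes_per_image = 512 * 512 * 1
    total_bytes = images * bytes_per_image
    print(f"{bytes_per_image / 1e6:.2f} MB per image")      # ~0.26 MB
    print(f"{total_bytes / 1e9:.2f} GB for 10,000 images")  # ~2.6 GB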
Resolution analysis of archive films for the purpose of their optimal digitization and distribution
NASA Astrophysics Data System (ADS)
Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek
2017-09-01
With the recent high demand for ultra-high-definition (UHD) content to be screened in high-end digital movie theaters as well as in the home environment, film archives full of movies in high definition and above are within the scope of UHD content providers. Movies captured with traditional film technology represent a virtually unlimited source of UHD content. The goal of maintaining complete image information is also related to the choice of scanning resolution and of the spatial resolution used for further distribution. It might seem that scanning the film material at the highest possible resolution using state-of-the-art film scanners, and distributing it at that resolution, is the right choice. The information content of the digitized images is, however, limited, and various degradations reduce it further. Digital distribution of the content at the highest image resolution might therefore be unnecessary or uneconomical. In other cases, the highest possible resolution is inevitable if we want to preserve fine scene details or film grain structure for archiving purposes. This paper deals with the analysis of image detail content in archive film records. The resolution limit in the captured scene image and the factors which lower the final resolution are discussed. Methods are proposed to determine the spatial detail of the film picture based on the analysis of its digitized image data. These procedures make it possible to derive recommendations for the optimal distribution of digitized video content intended for various display devices with lower resolutions. The obtained results are illustrated on a spatial downsampling use-case scenario, and a performance evaluation of the proposed techniques is presented.
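One simple way to gauge how much spatial detail a digitized frame actually carries is to measure how much of its spectral energy lies at high spatial frequencies; if very little does, a lower distribution resolution loses almost nothing. The sketch below illustrates that idea only and is not the method proposed in the paper; the frame here is a random stand-in.

    import numpy as np

    def high_frequency_energy_ratio(frame, cutoff=0.5):
        # Fraction of spectral energy above a normalized radial frequency cutoff.
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(frame))) ** 2
        h, w = frame.shape
        yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
        # Normalized radial frequency: 1.0 at the Nyquist limit of each axis.
        radius = np.sqrt((yy / (h / 2)) ** 2 + (xx / (w / 2)) ** 2)
        return spectrum[radius > cutoff].sum() / spectrum.sum()

    frame = np.random.rand(2160, 4096)  # stand-in for a digitized film frame
    print(f"energy above cutoff: {high_frequency_energy_ratio(frame):.3f}")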
Using modern imaging techniques to old HST data: a summary of the ALICE program.
NASA Astrophysics Data System (ADS)
Choquet, Elodie; Soummer, Remi; Perrin, Marshall; Pueyo, Laurent; Hagan, James Brendan; Zimmerman, Neil; Debes, John Henry; Schneider, Glenn; Ren, Bin; Milli, Julien; Wolff, Schuyler; Stark, Chris; Mawet, Dimitri; Golimowski, David A.; Hines, Dean C.; Roberge, Aki; Serabyn, Eugene
2018-01-01
Direct imaging of extrasolar systems is a powerful technique to study the physical properties of exoplanetary systems and understand their formation and evolution mechanisms. The detection and characterization of these objects are challenged by their high contrast with their host star. Several observing strategies and post-processing algorithms have been developed for ground-based high-contrast imaging instruments, enabling the discovery of directly imaged and spectrally characterized exoplanets. The Hubble Space Telescope (HST), a pioneer in directly imaging extrasolar systems, has often been limited to the detection of bright debris disk systems, with sensitivity limited by the difficulty of implementing an optimal PSF subtraction strategy, which is readily offered on ground-based telescopes in pupil-tracking mode. The Archival Legacy Investigations of Circumstellar Environments (ALICE) program is a consistent re-analysis of the 10-year-old coronagraphic archive of HST's NICMOS infrared imager. Using post-processing methods developed for ground-based observations, we used the whole archive to calibrate PSF temporal variations and improve NICMOS's detection limits. We have now delivered ALICE-reprocessed science products for the whole NICMOS archival dataset back to the community. These science products, as well as the ALICE pipeline, were used to prototype the JWST coronagraphic data and reduction pipeline. The ALICE program has enabled the detection of 10 faint debris disk systems never imaged before in the near-infrared and several substellar companion candidates, all of which we are in the process of characterizing through follow-up observations with both ground-based facilities and HST-STIS coronagraphy. In this publication, we provide a summary of the results of the ALICE program, advertise its science products, and discuss the prospects of the program.
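The PSF-subtraction techniques referred to above typically project the science frame onto principal components of a reference PSF library (the idea behind algorithms such as KLIP). The sketch below shows only that core projection step with random stand-in data; it is not the ALICE pipeline itself.

    import numpy as np

    def psf_subtract(science, references, n_modes=5):
        # science: (H, W) target frame; references: (N, H, W) reference PSF library.
        shape = science.shape
        ref = references.reshape(len(references), -1)
        ref = ref - ref.mean(axis=1, keepdims=True)
        # Principal components of the reference library via SVD.
        _, _, modes = np.linalg.svd(ref, full_matrices=False)
        modes = modes[:n_modes]
        target = science.ravel() - science.mean()
        model = modes.T @ (modes @ target)   # projection onto the PSF modes
        return (target - model).reshape(shape)

    residual = psf_subtract(np.random.rand(64, 64), np.random.rand(20, 64, 64))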
NASA Astrophysics Data System (ADS)
Williams, D. A.; Nelson, D. M.
2017-12-01
A portion of the earth analog image archive at the Ronald Greeley Center for Planetary Studies (RGCPS)-the NASA Regional Planetary Information Facility at Arizona State University-is being digitized and will be added to the Planetary Data System (PDS) for public use. This will be the first addition of terrestrial data to the PDS specifically for comparative planetology studies. Digitization is separated into four tasks. First is the scanning of aerial photographs of volcanic and aeolian structures and flows. The second task is to scan field-site images taken from the ground and from low-altitude aircraft of volcanic structures, lava flows, lava tubes, dunes, and wind streaks. The third image set to be scanned includes photographs of lab experiments from the NASA Planetary Aeolian Laboratory wind tunnels and vortex generator, and of wax models. Finally, rare NASA documents are being scanned and formatted as PDF files. Thousands of images are to be scanned for this project. Archiving of the data will follow the PDS4 standard, where the entire project is classified as a single bundle, with individual subjects (e.g., the Amboy Crater volcanic structure in the Mojave Desert of California) as collections. Within the collections, each image is considered a product, with a unique ID and associated XML document. Documents describing the image data, including the subject and context, will be included with each collection. Once complete, the data will be hosted by a PDS data node and available for public search and download. As one of the first earth analog datasets to be archived by the PDS, this project could prompt other facilities to digitize historic datasets and make them available to the scientific community.
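Under PDS4, as described above, each scanned image becomes a product with a logical identifier and an XML label inside a collection and bundle. The sketch below writes a highly simplified XML sidecar following that naming pattern; it is illustrative only and not a schema-complete PDS4 label, and the bundle, collection, and file names are made up.

    from xml.etree import ElementTree as ET

    def write_label(bundle, collection, product_id, image_file, out_path):
        # Minimal product label: a logical identifier plus the image file name.
        root = ET.Element("Product_Observational")
        ident = ET.SubElement(root, "Identification_Area")
        ET.SubElement(ident, "logical_identifier").text = (
            f"urn:nasa:pds:{bundle}:{collection}:{product_id}")
        ET.SubElement(ident, "title").text = product_id
        obs = ET.SubElement(root, "File_Area_Observational")
        ET.SubElement(ET.SubElement(obs, "File"), "file_name").text = image_file
        ET.ElementTree(root).write(out_path, encoding="utf-8", xml_declaration=True)

    write_label("rgcps_analogs", "amboy_crater", "amboy_aerial_0001",
                "amboy_aerial_0001.tif", "amboy_aerial_0001.xml")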
NASA Technical Reports Server (NTRS)
White, Nicholas (Technical Monitor); Murray, Stephen S.
2003-01-01
(1) Chandra Archive: SAO has maintained the interfaces through which HEASARC gains access to the Chandra Data Archive. At HEASARC's request, we have implemented an anonymous ftp copy of a major part of the public archive and we keep that archive up-to-date. SAO has participated in the ADEC interoperability working group, establishing guidelines for interoperability standards and prototyping such interfaces. We have provided an NVO-based prototype interface, intended to serve the HEASARC-led NVO demo project. HEASARC's Astrobrowse interface was maintained and updated. In addition, we have participated in design discussions surrounding HEASARC's Caldb project. We have attended the HEASARC Users Group meeting and presented CDA status and developments. (2) Chandra CALDB: SAO has maintained and expanded the Chandra CALDB by including four new data file types and defining the corresponding CALDB keyword/identification structures. We have provided CALDB upgrades for the public (CIAO) and for Standard Data Processing. Approximately 40 new files have been added to the CALDB in these version releases. There have been ten of these CALDB upgrades in the past year, each with unique index configurations. In addition, with input from software, archive, and calibration scientists, as well as CIAO/SDP software developers, we have defined a generalized expansion of the existing CALDB interface and indexing structure. The purpose of this is to make the CALDB more generally applicable and useful in new and future missions that will be supported archivally by HEASARC. The generalized interface will identify additional configurational keywords and permit more extensive calibration parameter and boundary condition specifications for unique file selection. HEASARC scientists and developers from SAO and GSFC have become involved in this work, which is expected to produce a new interface for general use within the current year. (3) DS9: One of the decisions that came from last year's HEADCC meeting was to make the ds9 image display program the primary vehicle for displaying line graphics (as well as images). The first step required to make this possible was to enhance the line graphics capabilities of ds9. SAO therefore spent considerable effort upgrading ds9 to use Tcl 8.4 so that the BLT line graphics package could be built and imported into ds9 from source code, rather than from a pre-built (and generally outdated) shared library. This task, which is nearly complete, allows us to extend BLT as needed for the HEAD community. Following HEADCC discussion concerning archiving and the display of archived data, we extended ds9 to support full access to many astronomical Web-based archive sites, including HEASARC, MAST, CHANDRA, SKYVIEW, ADS, NED, SIMBAD, IRAS, NVRO, SAO TDC, and FIRST. Using ds9's new internal Web access capabilities, these archives can be accessed via their Web pages. FITS images, plots, spectra, and journal abstracts can be referenced, downloaded, and displayed directly and easily in ds9. For more information, see: http://hea-www.harvard.edu/saord/ds9. Also following the HEADCC discussion concerning region filtering, we extended the Funtools sample implementation of region filtering as described in: http://hea-www.harvard.edu/saord/funtools/regions.html. In particular, we added several new composite regions for event and image filtering, including elliptical and box annuli. We also extended the panda (Pie AND Annulus) region support to include box pandas and elliptical pandas.
These new composite regions are especially useful in programs that need to count photons in each separate region using only a single pass through the data. Support for these new regions was added to ds9. In the same vein, we developed new region support for filtering images using simple FITS image masks, i.e., 8-bit or 16-bit FITS images where the value of a pixel is the region id number for that pixel. Other important enhancements to DS9 this year include support for multiple world coordinate systems, three-dimensional event file binning, image smoothing, region groups and tags, the ability to save images in a number of image formats (such as JPEG, TIFF, PNG, FITS), improvements in support for integrating external analysis tools, and support for the virtual observatory. In particular, a full-featured web browser has been implemented within DS9. This provides support for full access to HEASARC archive sites such as SKYVIEW and W3BROWSE, in addition to other astronomical archive sites such as MAST, CHANDRA, ADS, NED, SIMBAD, IRAS, NVRO, SAO TDC, and FIRST. From within DS9, the archives can be searched, and FITS images, plots, spectra, and journal abstracts can be referenced, downloaded, and displayed. The web browser provides the basis for the built-in help facility. All DS9 documentation, including the reference manual, FAQ, Known Features, and contact information, is now available to the user without the need for external display applications. New versions of DS9 may be downloaded and installed using this facility. Two important features used in the analysis of high-energy astronomical data have been implemented in the past year. The first is support for binning photon event data in three dimensions. By binning the third dimension in time or energy, users can easily detect variable X-ray sources and identify other physical properties of their data. Second, a number of fast smoothing algorithms have been implemented in DS9, which allow users to smooth their data in real time. Algorithms for boxcar, tophat, and Gaussian smoothing are supported.
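The mask-based region filtering described above treats an 8- or 16-bit image whose pixel values are region ids as the filter, so photons can be tallied for every region in a single pass. The sketch below illustrates that idea with synthetic data; it is not the Funtools or DS9 implementation.

    import numpy as np

    counts_image = np.random.poisson(2.0, size=(256, 256))    # stand-in X-ray image
    region_mask = np.zeros((256, 256), dtype=np.uint8)        # 0 = outside any region
    region_mask[50:100, 50:100] = 1                           # region 1: a box
    yy, xx = np.mgrid[:256, :256]
    region_mask[(yy - 180) ** 2 + (xx - 180) ** 2 < 30 ** 2] = 2   # region 2: a circle

    # One pass over the data: sum the counts for every region id at once.
    per_region = np.bincount(region_mask.ravel(), weights=counts_image.ravel())
    for region_id, total in enumerate(per_region):
        print(f"region {region_id}: {total:.0f} counts")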
Upper Klamath Basin Landsat Image for June 24, 2006: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-7 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-7 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-7 on April 15, 1999 marks the addition of the latest satellite to the Landsat series. The Landsat-7 satellite carries the Enhanced Thematic Mapper Plus (ETM+) sensor. A mechanical failure of the ETM+ Scan Line Corrector (SLC) occurred on May 31, 2003, with the result that all Landsat 7 scenes acquired from July 14, 2003 to present have been collected in 'SLC-off' mode. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
Upper Klamath Basin Landsat Image for July 10, 2006: Path 44 Row 31
Snyder, Daniel T.
2012-01-01
This subset of a Landsat-7 image shows part of the upper Klamath Basin. The original images were obtained from the U.S. Geological Survey Earth Resources Observation and Science Center (EROS). EROS is responsible for archive management and distribution of Landsat data products. The Landsat-7 satellite is part of an ongoing mission to provide quality remote sensing data in support of research and applications activities. The launch of Landsat-7 on April 15, 1999 marks the addition of the latest satellite to the Landsat series. The Landsat-7 satellite carries the Enhanced Thematic Mapper Plus (ETM+) sensor. A mechanical failure of the ETM+ Scan Line Corrector (SLC) occurred on May 31, 2003, with the result that all Landsat 7 scenes acquired from July 14, 2003 to present have been collected in 'SLC-off' mode. More information on the Landsat program can be found online at http://landsat.usgs.gov/.
LBT Distributed Archive: Status and Features
NASA Astrophysics Data System (ADS)
Knapic, C.; Smareglia, R.; Thompson, D.; Grede, G.
2011-07-01
After the first release of the LBT Distributed Archive, this successful collaboration is continuing within the LBT corporation. The IA2 (Italian Center for Astronomical Archive) team has updated the LBT DA with new features in order to facilitate user data retrieval while abiding by VO standards. To facilitate the integration of data from any new instruments, we have migrated to a new database, developed new data distribution software, and enhanced features in the LBT User Interface. The DBMS engine has been changed to MySQL. Consequently, the data handling software now uses Java thread technology to update and synchronize the main storage archives on Mt. Graham and in Tucson, as well as archives in Trieste and Heidelberg, with all metadata and proprietary data. The LBT UI has been updated with additional features allowing users to search by instrument and by some of the more important characteristics of the images. Finally, instead of a simple cone search service over all LBT image data, new instrument-specific SIAP and cone search services have been developed. They will be published in the IVOA framework later this fall.
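A cone search service of the kind mentioned above ultimately reduces to an angular-separation cut around a search position. The sketch below shows that selection with the haversine formula; the coordinate values are made up and do not reflect the LBT archive schema.

    import numpy as np

    def cone_search(ra, dec, ra0, dec0, radius_deg):
        # ra, dec: arrays of image-center coordinates in degrees; returns a boolean mask.
        ra, dec, ra0, dec0 = map(np.radians, (ra, dec, ra0, dec0))
        # Haversine formula for angular separation on the sphere.
        sep = 2 * np.arcsin(np.sqrt(
            np.sin((dec - dec0) / 2) ** 2
            + np.cos(dec) * np.cos(dec0) * np.sin((ra - ra0) / 2) ** 2))
        return np.degrees(sep) <= radius_deg

    ra = np.array([150.1, 150.4, 210.0]); dec = np.array([2.2, 2.3, -11.0])
    print(cone_search(ra, dec, ra0=150.2, dec0=2.25, radius_deg=0.5))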
Planetary image conversion task
NASA Technical Reports Server (NTRS)
Martin, M. D.; Stanley, C. L.; Laughlin, G.
1985-01-01
The Planetary Image Conversion Task group processed 12,500 magnetic tapes containing raw imaging data from JPL planetary missions and produced an image data base in consistent format on 1200 fully packed 6250-bpi tapes. The output tapes will remain at JPL. A copy of the entire tape set was delivered to the US Geological Survey, Flagstaff, Ariz. A secondary task converted computer datalogs, which had been stored in project-specific MARK IV File Management System data types and structures, to flat-file, text format that is processable on any modern computer system. The conversion processing took place at JPL's Image Processing Laboratory on an IBM 370-158 with existing software modified slightly to meet the needs of the conversion task. More than 99% of the original digital image data was successfully recovered by the conversion task. However, processing data tapes recorded before 1975 was destructive. This discovery is of critical importance to facilities responsible for maintaining digital archives since normal periodic random sampling techniques would be unlikely to detect this phenomenon, and entire data sets could be wiped out in the act of generating seemingly positive sampling results. Recommended follow-on activities are also included.
NASA Astrophysics Data System (ADS)
Erberich, Stephan G.; Hoppe, Martin; Jansen, Christian; Schmidt, Thomas; Thron, Armin; Oberschelp, Walter
2001-08-01
In the last few years more and more university hospitals as well as private hospitals have changed to digital information systems for patient records, diagnostic files and digital images. Not only does patient management become easier; it is also remarkable how clinical research can profit from Picture Archiving and Communication Systems (PACS) and diagnostic databases, especially from image databases. Although images are available at one's fingertips, difficulties arise when image data need to be processed, e.g. segmented, classified or co-registered, which usually demands a lot of computational power. Today's clinical environment supports PACS very well, but real image processing is still under-developed. The purpose of this paper is to introduce a parallel cluster of standard distributed systems and its software components, and to show how such a system can be integrated into a hospital environment. To demonstrate the cluster technique we present our clinical experience with the crucial but cost-intensive motion correction of clinical routine and research functional MRI (fMRI) data, as it is processed in our lab on a daily basis.
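The cluster approach described above amounts to farming out independent, compute-heavy jobs such as per-run motion correction to many standard machines. In the sketch below, Python multiprocessing on one host stands in for the distributed workers, and motion_correct is a placeholder, not the authors' algorithm; the run file names are hypothetical.

    from multiprocessing import Pool

    def motion_correct(run_path):
        # Placeholder: a real implementation would realign each volume of the
        # fMRI run to a reference volume and write out the corrected series.
        return f"{run_path} -> corrected"

    runs = [f"subject01_run{i:02d}.nii" for i in range(1, 9)]   # hypothetical inputs
    if __name__ == "__main__":
        with Pool(processes=4) as pool:
            for result in pool.map(motion_correct, runs):
                print(result)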
Breast Histopathological Image Retrieval Based on Latent Dirichlet Allocation.
Ma, Yibing; Jiang, Zhiguo; Zhang, Haopeng; Xie, Fengying; Zheng, Yushan; Shi, Huaqiang; Zhao, Yu
2017-07-01
In the field of pathology, the whole slide image (WSI) has become the major carrier of visual and diagnostic information. Content-based image retrieval among WSIs can aid the diagnosis of an unknown pathological image by finding its similar regions in WSIs with diagnostic information. However, the huge size and complex content of WSIs pose several challenges for retrieval. In this paper, we propose an unsupervised, accurate, and fast retrieval method for breast histopathological images. Specifically, the method presents a local statistical feature that captures the morphology and distribution of nuclei, and employs the Gabor feature to describe texture information. The latent Dirichlet allocation model is utilized for high-level semantic mining. Locality-sensitive hashing is used to speed up the search. Experiments on a WSI database with more than 8000 images from 15 types of breast histopathology demonstrate that our method achieves about 0.9 retrieval precision as well as promising efficiency. Based on the proposed framework, we are developing a search engine for an online digital slide browsing and retrieval platform, which can be applied in computer-aided diagnosis, pathology education, and WSI archiving and management.
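Locality-sensitive hashing, used above to speed up the search, can be illustrated with random hyperplane projections: nearby feature vectors tend to receive the same bit pattern, so only images in the matching hash bucket need a full comparison. The feature vectors below are random stand-ins, not the Gabor or topic features of the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n_images, dim, n_bits = 8000, 128, 16
    features = rng.normal(size=(n_images, dim))
    hyperplanes = rng.normal(size=(n_bits, dim))

    def lsh_code(x):
        # Each bit records which side of a random hyperplane the vector falls on.
        bits = (hyperplanes @ x.T > 0).astype(np.uint8)
        return np.packbits(bits, axis=0)

    codes = lsh_code(features)            # hash codes for the whole database
    query = rng.normal(size=dim)
    q_code = lsh_code(query[None, :])
    # Candidate set: images whose codes match the query's code exactly.
    candidates = np.where((codes == q_code).all(axis=0))[0]
    print(f"{len(candidates)} candidate images for refinement")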
A Framework for Integration of Heterogeneous Medical Imaging Networks
Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos
2014-01-01
Medical imaging is of increasing importance in medical diagnosis and treatment support. Much of this is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve problems of interoperability between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and is widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile especially conceived for medical imaging exchange: Cross-Enterprise Document Sharing for Imaging (XDS-i). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I) but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS. PMID: 25279021
JPL, NASA and the Historical Record: Key Events/Documents in Lunar and Mars Exploration
NASA Technical Reports Server (NTRS)
Hooks, Michael Q.
1999-01-01
This document represents a presentation about the Jet Propulsion Laboratory (JPL) historical archives in the area of Lunar and Martian Exploration. The JPL archives document the history of JPL's flight projects, research and development activities, and administrative operations. The archives are in a variety of formats. The presentation reviews the information available through the JPL archives web site, the information available through the Regional Planetary Image Facility web site, and the information on past missions available through those web sites. The presentation also reviews the NASA historical resources at the NASA History Office and the National Archives and Records Administration.
An "Academic" Dilemma: The Tale of Archives and Records Management
ERIC Educational Resources Information Center
Shepherd, Elizabeth
2012-01-01
This article discusses the development of academic research in the archives and records management field. It is argued that the field has faced a dilemma between educating graduates for work in a professional domain and developing robust research methods and frameworks for the emerging academic discipline. The article reports on some projects…
Study on Integrated Pest Management for Libraries and Archives.
ERIC Educational Resources Information Center
Parker, Thomas A.
This study addresses the problems caused by the major insect and rodent pests and molds and mildews in libraries and archives; the damage they do to collections; and techniques for their prevention and control. Guidelines are also provided for the development and initiation of an Integrated Pest Management program for facilities housing library…
International Reader in the Management of Library, Information and Archive Services.
ERIC Educational Resources Information Center
Vaughan, Anthony, Comp.
Compiled for the benefit of library, archive, and information science schools for use in their information services and systems management courses, this reader is not meant to replace the already existing standard textbooks on this subject, but to provide a more international perspective than textbooks written with the information services of one…
BATSE imaging survey of the Galactic plane
NASA Technical Reports Server (NTRS)
Grindlay, J. E.; Barret, D.; Bloser, P. F.; Zhang, S. N.; Robinson, C.; Harmon, B. A.
1997-01-01
The burst and transient source experiment (BATSE) onboard the Compton Gamma Ray Observatory (CGRO) provides an all-sky monitoring capability, occultation analysis, and occultation imaging, which enables new and fainter sources to be searched for in relatively crowded fields. The occultation imaging technique is used in combination with an automated BATSE image scanner, allowing an analysis of large data sets of occultation images for detections of candidate sources and for the construction of source catalogs and databases. This automated image scanner system is being tested on archival data in order to optimize the search and detection thresholds. The image search system, its calibration results, and preliminary survey results on archival data are reported. The aim of the survey is to identify a complete sample of black hole candidates in the galaxy and constrain the number of black hole systems and neutron star systems.
Building a Digital Library for Multibeam Data, Images and Documents
NASA Astrophysics Data System (ADS)
Miller, S. P.; Staudigel, H.; Koppers, A.; Johnson, C.; Cande, S.; Sandwell, D.; Peckman, U.; Becker, J. J.; Helly, J.; Zaslavsky, I.; Schottlaender, B. E.; Starr, S.; Montoya, G.
2001-12-01
The Scripps Institution of Oceanography, the UCSD Libraries and the San Diego Supercomputing Center have joined forces to establish a digital library for accessing a wide range of multibeam and marine geophysical data, to a community that ranges from the MGG researcher to K-12 outreach clients. This digital library collection will include 233 multibeam cruises with grids, plots, photographs, station data, technical reports, planning documents and publications, drawn from the holdings of the Geological Data Center and the SIO Archives. Inquiries will be made through an Ocean Exploration Console, reminiscent of a cockpit display where a multitude of data may be displayed individually or in two or three-dimensional projections. These displays will provide access to cruise data as well as global databases such as Global Topography, crustal age, and sediment thickness, thus meeting the day-to-day needs of researchers as well as educators, students, and the public. The prototype contains a few selected expeditions, and a review of the initial approach will be solicited from the user community during the poster session. The search process can be focused by a variety of constraints: geospatial (lat-lon box), temporal (e.g., since 1996), keyword (e.g., cruise, place name, PI, etc.), or expert-level (e.g., K-6 or researcher). The Storage Resource Broker (SRB) software from the SDSC manages the evolving collection as a series of distributed but related archives in various media, from shipboard data through processing and final archiving. The latest version of MB-System provides for the systematic creation of standard metadata, and for the harvesting of metadata from multibeam files. Automated scripts will be used to load the metadata catalog to enable queries with an Oracle database management system. These new efforts to bridge the gap between libraries and data archives are supported by the NSF Information Technology and National Science Digital Library (NSDL) programs, augmented by UC funds, and closely coordinated with Digital Library for Earth System Education (DLESE) activities.
The Archival Appraisal of Photographs: A RAMP Study with Guidelines.
ERIC Educational Resources Information Center
Leary, William H.
Prepared for Unesco's Records and Archives Management Programme (RAMP), this study is designed to provide archivists, manuscript and museum curators, and other interested information professionals in both industrialized and developing countries with an understanding of the archival character of photographs, and a set of guidelines for the…
The American Archival Profession and Information Technology Standards.
ERIC Educational Resources Information Center
Cox, Richard J.
1992-01-01
Discussion of the use of standards by archivists highlights the U.S. MARC AMC (Archives-Manuscript Control) format for reporting archival records and manuscripts; their interest in specific standards being developed for the OSI (Open Systems Interconnection) reference model; and the management of records in electronic formats. (16 references) (LAE)
NASA Astrophysics Data System (ADS)
Nass, A.; D'Amore, M.; Helbert, J.
2018-04-01
Based on recent discussions within information science and management, an archiving structure and a reference level for derived, already published data significantly support the scientific community through a steady growth of knowledge and understanding.
Another New Frontier: Archives and Manuscripts in the National Park Service.
ERIC Educational Resources Information Center
Bowling, Mary B.
1985-01-01
Archival collections of Edison, Olmsted, Morristown, and Longfellow National Historic Sites offer examples of how documentary collections have been handled in the past, and of ways in which National Park Service is beginning to address cultural resource management issues (arrangement, preservation, cataloging, research use) of archives and…
NASA Astrophysics Data System (ADS)
Shih, D.-T.; Lin, C. L.; Tseng, C.-Y.
2015-08-01
This paper presents an interdisciplinary approach to developing a content-aware application that combines gaming with learning for specific categories of digital archives. The employment of a content-oriented game enhances the gamification and efficacy of learning in cultural education on the architecture and history of Hsinchu County, Taiwan. The gamified form of the application is used as a backbone to support and provide a strong stimulus for engaging users in learning art and culture; this research is therefore carried out under the goal of "The Digital ARt/ARchitecture Project". The purpose of the abovementioned project is to develop interactive serious-game approaches and applications for Hsinchu County historical archives and architecture. We therefore present two applications, "3D AR for Hukou Old Street" and "Hsinchu County History Museum AR Tour", which take the form of augmented reality (AR). By using AR imaging techniques to blend real objects and virtual content, users can immerse themselves in virtual exhibitions of Hukou Old Street and the Hsinchu County History Museum, and learn in a ubiquitous computing environment. This paper proposes a content system that includes tools and materials used to create representations of digitized cultural archives, including historical artifacts, documents, customs, religion, and architecture. The Digital ARt/ARchitecture Project is based on the concept of serious games and consists of three aspects: content creation, target management, and AR presentation. The project focuses on developing a proper approach to serve as an interactive game and to offer a learning opportunity for appreciating historic architecture by playing AR cards. Furthermore, the card game aims to provide multi-faceted understanding and learning experiences that help users learn through 3D objects, hyperlinked web data, and the manipulation of learning modes, thereby effectively developing their learning levels on the cultural and historical archives of Hsinchu County.
The Next Generation of HLA Image Products
NASA Astrophysics Data System (ADS)
Gaffney, N. I.; Casertano, S.; Ferguson, B.
2012-09-01
The Hubble Legacy Archive (HLA) is a project to add value to the Hubble Space Telescope data archive by producing and delivering science-ready drizzled data products and source lists derived from these products. We present the re-engineered pipeline, based on existing and improved algorithms, with the aim of improving processing quality, cross-instrument portability, data flow management, and software maintenance. Initially, ACS, NICMOS, and WFPC2 data were combined using instrument-specific pipelines based on scripts developed to process the ACS GOODS data, with a separate set of scripts to generate Source Extractor and DAOPhot source lists. The new pipeline, initially designed for WFC3 data, isolates instrument-specific processing and is easily extendable to other instruments and to generating wide-area mosaics. Significant improvements have been made in image combination using improved alignment, source detection, and background equalization routines. It integrates improved alignment procedures, a better noise model, and source list generation within a single code base. Wherever practical, PyRAF-based routines have been replaced with non-IRAF Python libraries (e.g., NumPy and PyFITS). The data formats have been modified to handle better and more consistent propagation of information from individual exposures to the combined products. A new exposure layer stores the effective exposure time for each pixel on the sky, which is key to properly interpreting combined images built from diverse data that were not initially planned to be mosaicked. We worked to improve the validity of the metadata within our FITS headers for these products relative to standard IRAF/PyRAF processing. Any keywords that pertain to individual exposures have been removed from the primary and extension headers and placed in a table extension for more direct and efficient perusal. This mechanism also allows more detailed information on the processing of individual images to be stored and propagated, providing a more hierarchical metadata storage system than key-value-pair FITS headers provide. In this poster we discuss the changes to the pipeline processing and source list generation, the lessons learned that may be applicable to other archive projects, and our new metadata curation and preservation process.
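The metadata change described above (moving keywords that pertain to individual exposures out of the combined product's headers and into a table extension) can be sketched with astropy as below; the keyword choices and file names are illustrative and this is not the HLA pipeline code.

    from astropy.io import fits

    exposure_files = ["exp1_flt.fits", "exp2_flt.fits", "exp3_flt.fits"]  # hypothetical inputs
    rows = []
    for path in exposure_files:
        hdr = fits.getheader(path)
        rows.append((path, hdr.get("EXPTIME", 0.0), hdr.get("DATE-OBS", "")))

    cols = fits.ColDefs([
        fits.Column(name="FILENAME", format="64A", array=[r[0] for r in rows]),
        fits.Column(name="EXPTIME",  format="D",   array=[r[1] for r in rows]),
        fits.Column(name="DATE_OBS", format="23A", array=[r[2] for r in rows]),
    ])
    # Per-exposure keywords live in their own table extension of the combined product.
    table_hdu = fits.BinTableHDU.from_columns(cols, name="EXPINFO")
    fits.append("combined_drz.fits", table_hdu.data, table_hdu.header)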
The Kanzelhöhe Online Data Archive
NASA Astrophysics Data System (ADS)
Pötzi, W.; Hirtenfellner-Polanec, W.; Temmer, M.
The Kanzelhöhe Observatory provides high-cadence full-disk observations of solar activity phenomena like sunspots, flares and prominence eruptions on a regular basis. The data are available for download from the KODA (Kanzelhöhe Observatory Data Archive) which is freely accessible. The archive offers sunspot drawings back to 1950 and high cadence H-α data back to 1973. Images from other instruments, like white-light and CaIIK, are available since 2007 and 2010, respectively. In the following we describe how to access the archive and the format of the data.
NASA Astrophysics Data System (ADS)
McInnes, B.; Brown, A.; Liffers, M.
2015-12-01
Publicly funded laboratories have a responsibility to generate, archive and disseminate analytical data to the research community. Laboratory managers know, however, that a long tail of analytical effort never escapes researchers' thumb drives once they leave the lab. This work reports on a research data management project (Digital Mineralogy Library) where integrated hardware and software systems automatically archive and deliver analytical data and metadata to institutional and community data portals. The scientific objective of the DML project was to quantify the modal abundance of heavy minerals extracted from key lithological units in Western Australia. The selected analytical platform was a TESCAN Integrated Mineral Analyser (TIMA) that uses EDS-based mineral classification software to image and quantify mineral abundance and grain size at micron-scale resolution. The analytical workflow used a bespoke laboratory information management system (LIMS) to orchestrate: (1) the preparation of grain mounts with embedded QR codes that serve as enduring links between physical samples and analytical data, (2) the assignment of an International Geo Sample Number (IGSN) and Digital Object Identifier (DOI) to each grain mount via the System for Earth Sample Registry (SESAR), (3) the assignment of a DOI to instrument metadata via Research Data Australia, (4) the delivery of TIMA analytical outputs, including spatially registered mineralogy images and mineral abundance data, to an institutionally based data management server, and (5) the downstream delivery of a final data product via a Google Maps interface such as the AuScope Discovery Portal. The modular design of the system permits the networking of multiple instruments within a single site or across multiple collaborating research institutions. Although sharing analytical data does provide new opportunities for the geochemistry community, the creation of an open data network requires: (1) adopting open data reporting standards and conventions, (2) requiring instrument manufacturers and software developers to deliver and process data in formats compatible with open standards, and (3) public funding agencies to incentivise researchers, laboratories and institutions to make their data open and accessible to consumers.
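The QR codes described above act as enduring links between a physical grain mount and its records. A minimal sketch of producing such a label is shown below; the IGSN value and resolver URL are made-up placeholders, and the third-party 'qrcode' package (with Pillow) is assumed to be installed.

    import qrcode

    igsn = "IEXXX0001"                               # hypothetical sample number
    url = f"https://igsn.org/{igsn}"                 # assumed resolver URL for the sample
    img = qrcode.make(url)                           # encode the link as a QR image
    img.save(f"grain_mount_{igsn}.png")
    print(f"label written for {igsn} -> {url}")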
NASA Astrophysics Data System (ADS)
Matgen, Patrick; Giustarini, Laura; Hostache, Renaud
2012-10-01
This paper introduces an automatic flood mapping application that is hosted in the Grid Processing on Demand (G-POD) Fast Access to Imagery (Faire) environment of the European Space Agency. The main objective of the online application is to deliver flooded areas operationally using both recent and historical acquisitions of SAR data. Having as a short-term target the flood-related exploitation of data generated by the upcoming ESA SENTINEL-1 SAR mission, the flood mapping application consists of two building blocks: i) a set of query tools for selecting the "crisis image" and the optimal corresponding "reference image" from the G-POD archive and ii) an algorithm for extracting flooded areas via change detection using the previously selected "crisis image" and "reference image". Stakeholders in flood management and service providers are able to log onto the flood mapping application to get support for the retrieval, from the rolling archive, of the most appropriate reference image. Potential users will also be able to apply the implemented flood delineation algorithm. The latter combines histogram thresholding, region growing and change detection as an approach enabling the automatic, objective and reliable extraction of flood extent from SAR images. Both algorithms are computationally efficient and operate with minimum data requirements. The case study of the high-magnitude flooding event that occurred in July 2007 on the Severn River, UK, which was observed with a moderate-resolution SAR sensor as well as airborne photography, highlights the performance of the proposed online application. The flood mapping application on G-POD can be used sporadically, i.e. whenever a major flood event occurs and there is a demand for SAR-based flood extent maps. In the long term, a potential extension of the application could consist of systematically extracting flooded areas from all SAR images acquired on a daily, weekly or monthly basis.
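The flood delineation step described above rests on the fact that smooth open water returns very little backscatter, so flooded pixels appear dark in the crisis image and much darker than in the reference image. The sketch below shows a bare-bones change-detection threshold on synthetic data; the threshold values are illustrative only, whereas the G-POD application derives them from the image histograms and refines the result with region growing.

    import numpy as np

    reference_db = np.random.normal(-8, 2, size=(512, 512))   # stand-in SAR scenes (dB)
    crisis_db = reference_db.copy()
    crisis_db[200:350, 100:300] = np.random.normal(-17, 1, size=(150, 200))  # flooded patch

    change = crisis_db - reference_db
    # Dark in the crisis image AND much darker than in the reference image.
    flood_mask = (crisis_db < -14) & (change < -5)
    print(f"flooded fraction: {flood_mask.mean():.2%}")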
Landsat: A global land-observing program
2005-01-01
Landsat represents the world's longest continuously acquired collection of space-based land remote sensing data. The Landsat Project is a joint initiative of the U.S. Geological Survey (USGS) and the National Aeronautics and Space Administration (NASA) designed to gather Earth resource data from space. NASA developed and launched the spacecraft, while the USGS handles the operations, maintenance, and management of all ground data reception, processing, archiving, product generation, and distribution. Landsat satellites have been collecting images of the Earth's surface for more than thirty years. Landsat's Global Survey Mission is to repeatedly capture images of the Earth's land mass, coastal boundaries, and coral reefs, and to ensure that sufficient data are acquired to support the observation of changes on the Earth's land surface and surrounding environment. NASA launched the first Landsat satellite in 1972, and the most recent one, Landsat 7, in 1999. Landsats 5 and 7 continue to capture hundreds of additional images of the Earth's surface each day. These images provide a valuable resource for people who work
Design and Applications of a Multimodality Image Data Warehouse Framework
Wong, Stephen T.C.; Hoo, Kent Soo; Knowlton, Robert C.; Laxer, Kenneth D.; Cao, Xinhau; Hawkins, Randall A.; Dillon, William P.; Arenson, Ronald L.
2002-01-01
A comprehensive data warehouse framework is needed that encompasses imaging and non-imaging information in support of disease management and research. The authors propose such a framework, describe general design principles and system architecture, and illustrate a multimodality neuroimaging data warehouse system implemented for clinical epilepsy research. The data warehouse system is built on top of a picture archiving and communication system (PACS) environment and applies an iterative object-oriented analysis and design (OOAD) approach and recognized data interface and design standards. The implementation is based on a Java CORBA (Common Object Request Broker Architecture) and Web-based architecture that separates the graphical user interface presentation, data warehouse business services, data staging area, and backend source systems into distinct software layers. To illustrate the practicality of the data warehouse system, the authors describe two distinct biomedical applications—namely, clinical diagnostic workup of multimodality neuroimaging cases and research data analysis and decision thresholds on seizure foci lateralization. The image data warehouse framework can be modified and generalized for new application domains. PMID: 11971885
You Can See Film through Digital: A Report from Where the Archiving of Motion Picture Film Stands
NASA Astrophysics Data System (ADS)
Tochigi, Akira
In recent years, digital technology has brought drastic change to the archiving of motion picture film. By collecting digital media as well as film, many conventional film archives have transformed themselves into moving image archives or audiovisual archives. Digital technology has also expanded the possibilities for restoring motion picture film compared with conventional photochemical (analog) restoration. This paper first redefines some fundamental terms regarding the archiving of motion picture film and discusses the conditions that need consideration for film archiving in Japan. With a few examples of recent restoration projects conducted by the National Film Center of the National Museum of Modern Art, Tokyo, this paper then clarifies new challenges inherent in digital restoration and stresses the importance of a better appreciation of motion picture film.
Rahman, Mahabubur; Watabe, Hiroshi
2018-05-01
Molecular imaging serves as an important tool for researchers and clinicians to visualize and investigate complex biochemical phenomena using specialized instruments; these instruments are used either individually or in combination with targeted imaging agents to obtain images related to specific diseases with high sensitivity, specificity, and signal-to-noise ratios. However, molecular imaging, which is a multidisciplinary research field, faces several challenges, including the integration of imaging informatics with bioinformatics and medical informatics, the requirement for reliable and robust image analysis algorithms, effective quality control of imaging facilities, and challenges related to individualized disease mapping, data sharing, software architecture, and knowledge management. As a cost-effective and open-source approach to address these challenges related to molecular imaging, we developed a flexible, transparent, and secure infrastructure, named MIRA, which stands for Molecular Imaging Repository and Analysis, primarily using the Python programming language and a MySQL relational database system deployed on a Linux server. MIRA is designed with a centralized image archiving infrastructure and information database so that a multicenter collaborative informatics platform can be built. The capability of dealing with metadata, normalizing image file formats, and storing and viewing different types of documents and multimedia files makes MIRA considerably flexible. With features like logging, auditing, commenting, sharing, and searching, MIRA is useful as an electronic laboratory notebook for effective knowledge management. In addition, the centralized approach allows on-the-fly remote access to all of MIRA's features through any web browser. Furthermore, the open-source approach provides the opportunity for sustained, continued development. MIRA offers an infrastructure that can be used as a cross-boundary collaborative molecular imaging research platform for rapid achievements in cancer diagnosis and therapeutics.
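A centralized repository of the kind described above is ultimately anchored by a small relational schema linking studies, image files, and user annotations. The sketch below is a guess at such a schema; SQLite is used here so the example runs self-contained, whereas MIRA itself is built on MySQL, and the table and column names are not taken from it.

    import sqlite3

    db = sqlite3.connect("mi_repository.db")
    db.executescript("""
    CREATE TABLE IF NOT EXISTS study (
        study_id   INTEGER PRIMARY KEY,
        subject    TEXT NOT NULL,
        modality   TEXT NOT NULL,          -- e.g. PET, SPECT, optical
        acquired   TEXT NOT NULL
    );
    CREATE TABLE IF NOT EXISTS image_file (
        file_id    INTEGER PRIMARY KEY,
        study_id   INTEGER REFERENCES study(study_id),
        path       TEXT NOT NULL,
        format     TEXT NOT NULL           -- normalized file format
    );
    CREATE TABLE IF NOT EXISTS comment (
        comment_id INTEGER PRIMARY KEY,
        study_id   INTEGER REFERENCES study(study_id),
        author     TEXT NOT NULL,
        body       TEXT NOT NULL
    );
    """)
    db.execute("INSERT INTO study (subject, modality, acquired) VALUES (?, ?, ?)",
               ("mouse_042", "PET", "2018-05-01"))
    db.commit()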
Service-Based Extensions to an OAIS Archive for Science Data Management
NASA Astrophysics Data System (ADS)
Flathers, E.; Seamon, E.; Gessler, P. E.
2014-12-01
With new data management mandates from major funding sources such as the National Institutes for Health and the National Science Foundation, architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS), in 2002, released their first version of a Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is the interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.
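One way to picture a service-based extension point of the sort proposed above is a thin ingest interface that network or cloud services could implement, keeping OAIS terminology (SIP for Submission Information Package, AIP for Archival Information Package). The class and method names below are illustrative and not taken from the paper.

    from dataclasses import dataclass, field
    from typing import Protocol

    @dataclass
    class SubmissionInformationPackage:
        producer: str
        files: list[str]
        metadata: dict = field(default_factory=dict)

    class IngestService(Protocol):
        # Any local or cloud-hosted service exposing these operations can plug in.
        def validate(self, sip: SubmissionInformationPackage) -> bool: ...
        def ingest(self, sip: SubmissionInformationPackage) -> str: ...

    class LocalIngestService:
        def validate(self, sip):
            return bool(sip.files)
        def ingest(self, sip):
            if not self.validate(sip):
                raise ValueError("empty submission")
            # Return the identifier of the resulting Archival Information Package.
            return f"aip:{sip.producer}:{len(sip.files)}-files"

    print(LocalIngestService().ingest(
        SubmissionInformationPackage("hydro_lab", ["site01.csv"])))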
Building a COTS archive for satellite data
NASA Technical Reports Server (NTRS)
Singer, Ken; Terril, Dave; Kelly, Jack; Nichols, Cathy
1994-01-01
The goal of the NOAA/NESDIS Active Archive was to provide a method of access to an online archive of satellite data. The archive had to manage and store the data, let users interrogate the archive, and allow users to retrieve data from the archive. Practical issues of the system design such as implementation time, cost and operational support were examined in addition to the technical issues. There was a fixed window of opportunity to create an operational system, along with budget and staffing constraints. Therefore, the technical solution had to be designed and implemented subject to constraint imposed by the practical issues. The NOAA/NESDIS Active Archive came online in July of 1994, meeting all of its original objectives.
The Cancer Digital Slide Archive - TCGA
Dr. David Gutman and Dr. Lee Cooper developed The Cancer Digital Slide Archive (CDSA), a web platform for accessing pathology slide images of TCGA samples. Find out how they did it and how to use the CDSA website in this Case Study.
Current status of the joint Mayo Clinic-IBM PACS project
NASA Astrophysics Data System (ADS)
Hangiandreou, Nicholas J.; Williamson, Byrn, Jr.; Gehring, Dale G.; Persons, Kenneth R.; Reardon, Frank J.; Salutz, James R.; Felmlee, Joel P.; Loewen, M. D.; Forbes, Glenn S.
1994-05-01
A multi-phase collaboration between Mayo Clinic and IBM-Rochester was undertaken, with the goal of developing a picture archiving and communication system for routine clinical use in the Radiology Department. The initial phase of this project (phase 0) was started in 1988. The current system has been fully integrated into the clinical practice and, to date, over 6.5 million images from 16 imaging modalities have been archived. Phase 3 of this project has recently concluded.
Cost-effective data storage/archival subsystem for functional PACS
NASA Astrophysics Data System (ADS)
Chen, Y. P.; Kim, Yongmin
1993-09-01
Not the least of the requirements of a workable PACS is the ability to store and archive vast amounts of information. A medium-size hospital will generate between 1 and 2 TBytes of data annually on a fully functional PACS. A high-speed image transmission network coupled with a comparably high-speed central data storage unit can make local memory and magnetic disks in the PACS workstations less critical and, in an extreme case, unnecessary. Under these circumstances, the capacity and performance of the central data storage subsystem and database are critical in determining the response time at the workstations, thus significantly affecting clinical acceptability. The central data storage subsystem not only needs to provide sufficient capacity to store about ten days' worth of images (five days' worth of new studies and, on average, about one comparison study for each new study), but must also supply images to the requesting workstation in a timely fashion. The database must provide fast retrieval in response to users' requests for images. This paper analyzes the advantages and disadvantages of multiple parallel transfer disks versus RAID disks for the short-term central data storage subsystem, as well as an optical disk jukebox versus a digital recorder tape subsystem for the long-term archive. Furthermore, an example of a high-performance, cost-effective storage subsystem that integrates RAID disks and a high-speed digital tape subsystem into a single PACS data storage/archival unit is presented.
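The capacity figures above translate into a rough size for the short-term store: ten days of data at the quoted 1-2 TB per year. The arithmetic below simply restates those numbers.

    # Ten days of short-term storage at an annual volume of 1-2 TB.
    for annual_tb in (1, 2):
        daily_gb = annual_tb * 1000 / 365          # average daily volume in GB
        short_term_gb = daily_gb * 10              # ten days kept on fast storage
        print(f"{annual_tb} TB/year -> ~{daily_gb:.1f} GB/day, "
              f"~{short_term_gb:.0f} GB short-term")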
The Small Bodies Imager Browser --- finding asteroid and comet images without pain
NASA Astrophysics Data System (ADS)
Palmer, E.; Sykes, M.; Davis, D.; Neese, C.
2014-07-01
To facilitate accessing and downloading spatially resolved imagery of asteroids and comets in the NASA Planetary Data System (PDS), we have created the Small Bodies Image Browser. It is an HTML5 webpage that runs inside a standard web browser and needs no installation (http://sbn.psi.edu/sbib/). The volume of data returned by spacecraft missions has grown substantially over the last decade. While this wealth of data provides scientists with ample support for research, it has greatly increased the difficulty of managing, accessing and processing these data. Further, the complexity necessary for a long-term archive results in an architecture that is efficient for computers, but not user friendly. The Small Bodies Image Browser (SBIB) is tied into the PDS archive of the Small Bodies Asteroid Subnode hosted at the Planetary Science Institute [1]. Currently, the tool contains the entire repository of the Dawn mission's encounter with Vesta [2], and we will be adding other datasets in the future. For Vesta, this includes both the level 1A and 1B images for the Framing Camera (FC) and the level 1B spectral cubes from the Visual and Infrared (VIR) spectrometer, providing over 30,000 individual images. A key strength of the tool is providing quick and easy access to these data. The tool allows for searches based on clicking on a map or typing in coordinates. The SBIB can show an entire mission phase (such as cycle 7 of the Low Altitude Mapping Orbit) and the associated footprints, as well as search by image name. It can focus the search by mission phase, resolution or instrument. Imagery archived in the PDS is generally provided by missions in a single format or a narrow range of formats. To enhance the value and usability of these data to researchers, SBIB makes them available in the original formats as well as in PNG, JPEG and ArcGIS-compatible ISIS cubes [3]. Additionally, we provide header files for the VIR cubes so they can be read into ENVI without additional processing. Finally, we also provide both camera-based and map-projected products with geometric data embedded for use within ArcGIS and ISIS. We use the Gaskell shape model for terrain projections [4]. There are several other outstanding data analysis tools that have access to asteroid and comet data: JAsteroid (a derivative of JMARS [5]) and the Applied Physics Laboratory's Small Body Mapping Tool [6]. The SBIB has specifically focused on providing data in the easiest manner possible rather than trying to be an analytical tool.
Minati, L; Ghielmetti, F; Ciobanu, V; D'Incerti, L; Maccagnano, C; Bizzi, A; Bruzzone, M G
2007-03-01
Advanced neuroimaging techniques, such as functional magnetic resonance imaging (fMRI), chemical shift spectroscopy imaging (CSI), diffusion tensor imaging (DTI), and perfusion-weighted imaging (PWI), create novel challenges in terms of data storage and management: huge amounts of raw data are generated, the results of analysis may depend on the software and settings that have been used, and intermediate files are most often inherently not compliant with the current DICOM (digital imaging and communication in medicine) standard, as they contain multidimensional complex and tensor arrays and various other types of data structures. A software architecture, referred to as the Bio-Image Warehouse System (BIWS), which can be used alongside a radiology information system/picture archiving and communication system (RIS/PACS) to store neuroimaging data for research purposes, is presented. The system architecture is designed to enable querying by diagnosis according to a predefined two-layer classification taxonomy. The operational impact of the system and the time needed to get acquainted with the web-based interface and with the taxonomy are found to be limited. The development of modules enabling automated creation of statistical templates is proposed.
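A two-layer classification taxonomy of the kind BIWS queries against can be illustrated with a small sketch; the categories, diagnoses and study records below are invented for illustration and are not the BIWS schema.

```python
# Sketch of a two-layer diagnosis taxonomy and a query-by-diagnosis lookup.

taxonomy = {
    "neoplastic": ["low-grade glioma", "high-grade glioma", "metastasis"],
    "vascular":   ["ischemic stroke", "arteriovenous malformation"],
}

studies = [
    {"id": "S001", "category": "neoplastic", "diagnosis": "high-grade glioma"},
    {"id": "S002", "category": "vascular",   "diagnosis": "ischemic stroke"},
    {"id": "S003", "category": "neoplastic", "diagnosis": "metastasis"},
]

def query_by_diagnosis(category, diagnosis=None):
    """Return studies matching a first-layer category and, optionally,
    a second-layer diagnosis."""
    if diagnosis is not None and diagnosis not in taxonomy.get(category, []):
        raise ValueError(f"{diagnosis!r} is not part of category {category!r}")
    return [s for s in studies
            if s["category"] == category
            and (diagnosis is None or s["diagnosis"] == diagnosis)]

print(query_by_diagnosis("neoplastic"))                # both neoplastic studies
print(query_by_diagnosis("neoplastic", "metastasis"))  # S003 only
```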
Fingerprint verification on medical image reporting system.
Chen, Yen-Cheng; Chen, Liang-Kuang; Tsai, Ming-Dar; Chiu, Hou-Chang; Chiu, Jainn-Shiun; Chong, Chee-Fah
2008-03-01
The healthcare industry is currently going through extensive changes through the adoption of robust, interoperable healthcare information technology in the form of electronic medical records (EMR). However, a major concern with EMR is adequate confidentiality of the individual records being managed electronically. Multiple access points over an open network like the Internet increase the possibility of patient data being intercepted. The obligation is on healthcare providers to procure information security solutions that do not hamper patient care while still protecting the confidentiality of patient information. Medical images are also part of the EMR and need to be protected from unauthorized users. This study integrates the techniques of fingerprint verification, DICOM objects, digital signatures and digital envelopes in order to ensure that access to the hospital Picture Archiving and Communication System (PACS) or radiology information system (RIS) is granted only to certified parties.
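The digital-signature and digital-envelope steps combined in the study can be sketched as follows, using the Python `cryptography` package as an assumed stand-in. This is a generic illustration of the technique, not the hospital system's actual implementation, and the payload is a placeholder.

```python
# Minimal digital-envelope/signature sketch: the image payload is encrypted
# with a symmetric key, the key is wrapped with the recipient's RSA public
# key, and the sender signs the payload so origin and integrity can be checked.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Key pairs (in practice these come from a certificate infrastructure).
sender_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

payload = b"<DICOM object bytes>"          # placeholder for the image object

# --- Digital envelope: symmetric encryption + RSA-wrapped session key ---
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(payload)
wrapped_key = recipient_key.public_key().encrypt(
    session_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)

# --- Digital signature over the payload ---
signature = sender_key.sign(
    payload,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# --- Recipient side: unwrap the key, decrypt, verify ---
recovered_key = recipient_key.decrypt(
    wrapped_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
plaintext = Fernet(recovered_key).decrypt(ciphertext)
sender_key.public_key().verify(          # raises InvalidSignature on tampering
    signature, plaintext,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("payload verified and decrypted:", plaintext == payload)
```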
Understanding MRI: basic MR physics for physicians.
Currie, Stuart; Hoggard, Nigel; Craven, Ian J; Hadjivassiliou, Marios; Wilkinson, Iain D
2013-04-01
Hospital clinicians are increasingly reviewing images from MR studies of their patients before seeking a formal radiological opinion. This practice is driven by a multitude of factors, including the increased demand placed on hospital services, the wide availability of picture archiving and communication systems, time pressures for patient treatment (eg, in the management of acute stroke) and an inherent desire of clinicians to learn. Knowledge of the basic physical principles behind MRI is essential for correct image interpretation. This article, written for the general hospital physician, describes the basic physics of MRI, covering the machinery, contrast weighting, spin- and gradient-echo techniques and pertinent safety issues. The examples provided are drawn primarily from neuroradiology, reflecting the subspecialty in which MR currently has its greatest clinical application.
Archive and records management-Fiscal year 2010 offline archive media trade study
Bodoh, Tom; Boettcher, Ken; Gacke, Ken; Greenhagen, Cheryl; Engelbrecht, Al
2010-01-01
This document is a trade study comparing offline digital archive storage technologies. The document compares and assesses several technologies and recommends which technologies could be deployed as the next generation standard for the U.S. Geological Survey (USGS). Archives must regularly migrate to the next generation of digital archive technology, and the technology selected must maintain data integrity until the next migration. This document is the fiscal year 2010 (FY10) revision of a study completed in FY01 and revised in FY03, FY04, FY06, and FY08.
Steckel, R J; Batra, P; Johnson, S; Sayre, J; Brown, K; Haker, K; Young, D; Zucker, M
1995-04-01
The purpose of this study was to determine whether different digital display formats for portable chest radiographs of coronary care unit patients would provide comparable information for clinical care. In particular, we tried to ascertain whether 1024 x 1024 pixel (1K) images on a picture archiving and communication system (PACS) workstation would be comparable to 1760 x 2140 pixel (2K) images on workstations or to digital films. We hypothesized that, if comparability could be demonstrated, 1K workstations could considerably lower equipment and film costs and facilitate image transmission from point to point. Four chest radiologists read a panel of chest studies assembled from 98 coronary care unit patients, comparing 1K and 2K soft-copy images with digital hard copies. For all three image types for the 98 patients, the readers evaluated nine image parameters that the cardiologists deemed essential for clinical decision making. Two other chest radiologists reviewed each patient's three image types, historical chest images, current and prior radiologic reports, and medical record to determine the consensus, or "truth findings." With one exception (small pleural effusions), the receiver operating characteristic analysis showed no significant differences in the clinical information derived from the three image types. For clinical management in a coronary care unit, comparable information can be obtained from digital radiologic chest studies using a 1K x 1K soft-copy format, a 2K x 2K soft-copy format, or a hard copy (film). Substantial savings in cost and time are therefore possible by using soft-copy images and lower resolution (1K x 1K) workstations and, when necessary, by transmitting images over regular telephone lines.
Selected Guidelines for the Management of Records and Archives: A RAMP Reader.
ERIC Educational Resources Information Center
Walne, Peter, Comp.
The guidelines contained in this book are taken from studies published by UNESCO's Records and Archives Management Program (RAMP) between 1981 and 1987. Each set of guidelines is accompanied by an introduction to provide chronological or methodological context. The guidelines are titled as follows: (1) "The Use of Sampling Techniques in the…
Comprehensive planning of data archive in Japanese planetary missions
NASA Astrophysics Data System (ADS)
Yamamoto, Yukio; Shinohara, Iku; Hoshino, Hirokazu; Tateno, Naoki; Hareyama, Makoto; Okada, Naoki; Ebisawa, Ken
Japan Aerospace Exploration Agency (JAXA) provides HAYABUSA and KAGUYA data as planetary data archives. These data archives, however, were prepared independently; as a result, data formats are inconsistent and the knowledge gained from each archiving activity has not been passed on. Recently, discussion of comprehensive planning of data archives has started in preparation for upcoming planetary missions, and such a comprehensive plan must be developed in several steps. The framework of the comprehensive plan is divided into four items: Preparation, Evaluation, Preservation, and Service. 1. PREPARATION FRAMEWORK: Data are classified into several types: raw data; level-0, 1, and 2 processed data; ancillary data; and so on. Instrument teams are responsible for preparing the mission data, but preparation of material beyond the mission data and support for data management are essential to establish unified conventions and formats across instruments within a mission, and across missions. 2. EVALUATION FRAMEWORK: Evaluation has two aspects: format and quality. Format evaluation is largely addressed within the preparation framework. Data quality evaluation, often called quality assurance (QA) or quality control (QC), must be performed by a third party apart from the preparation teams: the instrument team takes the initiative for the preparation itself, and a third-party group is organized to evaluate the instrument team's activity. 3. PRESERVATION FRAMEWORK: The main topics of this framework are document management, archiving structure, and a simple access method. A mission produces many documents during development, and instrument development is no exception. Over the long-term development of a mission, many documents are made obsolete and updated repeatedly, and a smart system will help instrument teams reduce the burden of document management and archiving. JAXA intends to follow PDS practices for this management, since PDS has a highly sophisticated archiving structure. In addition, the access method to archived data must remain simple and standard for well over a decade. 4. SERVICE FRAMEWORK: The service framework, including the planetary data access protocol (PDAP), has been developed to share stored data effectively. A well-designed service framework will serve not only publication-level data but also low-level data. JAXA's data query services are under development based on PDAP, which means that low-level data can be published in the same manner as level-2 data. In this presentation, we report the detailed structure of these four frameworks as applied to the upcoming Planet-C (Venus Climate Orbiter) mission.
NASA Astrophysics Data System (ADS)
Grussenmeyer, P.; Khalil, O. Al
2017-08-01
The paper presents photogrammetric archives from Aleppo (Syria), collected between 1999 and 2002 by the Committee for maintenance and restoration of the Great Mosque in partnership with the Engineering Unit of the University of Aleppo. During that period, terrestrial photogrammetric data and geodetic surveys of the Great Omayyad mosque were recorded for documentation purposes and geotechnical studies. During the recent war in Syria, the Mosque has unfortunately been seriously damaged and its minaret has been completely destroyed. The paper presents a summary of the documentation available from the past projects as well as solutions of 3D reconstruction based on the processing of the photogrammetric archives with the latest 3D image-based techniques.
The Post-Soviet Archives: Organization, Access, and Declassification
1993-01-01
attempting to place these files under Roskomarkhiv but has had limited success. The successors to the 503-the Ministry of Security and the Foreign...their transfer to Roskomarkhiv. Pikhoia was able to take over these archives with some success; yet, complete control over the RO= archives has eluded...key Mironenko interview, May 27, 1992. players are involved in the management of the Russian Presidential Archive. First, the director of the
The UK National Arts Education Archive: Ideas and Imaginings
ERIC Educational Resources Information Center
Adams, Jeff; Bailey, Rowan; Walton, Neil
2017-01-01
The National Arts Education Archive (NAEA) is housed and maintained by the Yorkshire Sculpture Park (YSP), and managed by YSP coordinators and educators with a well-established volunteer programme. This year, 2017, as part of the celebrations of the YSP's 40th anniversary, the Archive will hold its own exhibition entitled "Treasures…
ERIC Educational Resources Information Center
Wimalaratne, K. D. G.
This long-term Records and Archives Administration Programme (RAMP) study is designed to assist archivists, records managers, and information specialists in identifying for current use and possible archival selection those transactional or case files that contain scientific and technical information (STI), particularly in those instances where…
NASA Astrophysics Data System (ADS)
Greene, G.; Kyprianou, M.; Levay, K.; Sienkewicz, M.; Donaldson, T.; Dower, T.; Swam, M.; Bushouse, H.; Greenfield, P.; Kidwell, R.; Wolfe, D.; Gardner, L.; Nieto-Santisteban, M.; Swade, D.; McLean, B.; Abney, F.; Alexov, A.; Binegar, S.; Aloisi, A.; Slowinski, S.; Gousoulin, J.
2015-09-01
The next generation for the Space Telescope Science Institute data management system is gearing up to provide a suite of archive system services supporting the operation of the James Webb Space Telescope. We are now completing the initial stage of integration and testing for the preliminary ground system builds of the JWST Science Operations Center which includes multiple components of the Data Management Subsystem (DMS). The vision for astronomical science and research with the JWST archive introduces both solutions to formal mission requirements and innovation derived from our existing mission systems along with the collective shared experience of our global user community. We are building upon the success of the Hubble Space Telescope archive systems, standards developed by the International Virtual Observatory Alliance, and collaborations with our archive data center partners. In proceeding forward, the “one archive” architectural model presented here is designed to balance the objectives for this new and exciting mission. The STScI JWST archive will deliver high quality calibrated science data products, support multi-mission data discovery and analysis, and provide an infrastructure which supports bridges to highly valued community tools and services.
Going fully digital: Perspective of a Dutch academic pathology lab
Stathonikos, Nikolas; Veta, Mitko; Huisman, André; van Diest, Paul J.
2013-01-01
During the last years, whole slide imaging has become more affordable and widely accepted in pathology labs. Digital slides are increasingly being used for digital archiving of routinely produced clinical slides, remote consultation and tumor boards, and quantitative image analysis for research purposes and in education. However, the implementation of a fully digital Pathology Department requires an in depth look into the suitability of digital slides for routine clinical use (the image quality of the produced digital slides and the factors that affect it) and the required infrastructure to support such use (the storage requirements and integration with lab management and hospital information systems). Optimization of digital pathology workflow requires communication between several systems, which can be facilitated by the use of open standards for digital slide storage and scanner management. Consideration of these aspects along with appropriate validation of the use of digital slides for routine pathology can pave the way for pathology departments to go “fully digital.” In this paper, we summarize our experiences so far in the process of implementing a fully digital workflow at our Pathology Department and the steps that are needed to complete this process. PMID:23858390
HST archive primer, version 4.1
NASA Technical Reports Server (NTRS)
Fruchter, A. (Editor); Baum, S. (Editor)
1994-01-01
This version of the HST Archive Primer provides the basic information a user needs to know to access the HST archive via StarView, the new user interface to the archive. Using StarView, users can search for observations of interest, find calibration reference files, and retrieve data from the archive. Both the terminal version of StarView and the X-windows version feature a name resolver which simplifies searches of the HST archive based on target name. In addition, the X-windows version of StarView allows preview of all public HST data; compressed versions of public images are displayed via SAOIMAGE, while spectra are plotted using the public plotting package, XMGR. Finally, the version of StarView described here features screens designed for observers preparing Cycle 5 HST proposals.
The EXOSAT database and archive
NASA Technical Reports Server (NTRS)
Reynolds, A. P.; Parmar, A. N.
1992-01-01
The EXOSAT database provides on-line access to the results and data products (spectra, images, and lightcurves) from the EXOSAT mission as well as access to data and logs from a number of other missions (such as EINSTEIN, COS-B, ROSAT, and IRAS). In addition, a number of familiar optical, infrared, and X-ray catalogs, including the Hubble Space Telescope (HST) guide star catalog, are available. The complete database is located at the EXOSAT observatory at ESTEC in the Netherlands and is accessible remotely via a captive account. The database management system was specifically developed to efficiently access the database and to allow the user to perform statistical studies on large samples of astronomical objects as well as to retrieve scientific and bibliographic information on single sources. The system was designed to be mission independent and includes timing, image processing, and spectral analysis packages as well as software to allow the easy transfer of analysis results and products to the user's own institute. The archive at ESTEC comprises a subset of the EXOSAT observations, stored on magnetic tape. Observations of particular interest were copied in compressed format to an optical jukebox, allowing users to retrieve and analyze selected raw data entirely from their terminals. Such analysis may be necessary if the user's needs are not accommodated by the products contained in the database (in terms of time resolution, spectral range, and the finesse of the background subtraction, for instance). Long-term archiving of the full final observation data is taking place at ESRIN in Italy as part of the ESIS program, again using optical media, and ESRIN has now assumed responsibility for distributing the data to the community. Tests showed that raw observational data (typically several tens of megabytes for a single target) can be transferred via the existing networks in reasonable time.
Ultrasound Picture Archiving And Communication Systems
NASA Astrophysics Data System (ADS)
Koestner, Ken; Hottinger, C. F.
1982-01-01
The ideal ultrasonic image communication and storage system must be flexible in order to optimize speed and minimize storage requirements. The various ultrasonic imaging modalities differ considerably in data volume and speed requirements. Static imaging, for example B-scanning, involves acquisition of a large amount of data that is averaged or accumulated in a desired manner. The image is then frozen in image memory before transfer and storage. Images are commonly a 512 x 512 point array, each point 6 bits deep. Transfer of such an image over a serial line at 9600 baud would require about three minutes. Faster transfer times are possible; for example, we have developed a parallel image transfer system using direct memory access (DMA) that reduces the time to 16 seconds. Data in this format requires 256K bytes for storage, and data compression can be utilized to reduce these requirements. Real-time imaging has much more stringent requirements for speed and storage. The amount of actual data per frame in real-time imaging is reduced due to physical limitations on ultrasound. For example, 100 scan lines (480 points long, 6 bits deep) can be acquired during a frame at a 30 per second rate. Transmitting and saving this data at a real-time rate requires a transfer rate of 8.6 megabaud. A real-time archiving system would be complicated by the need for specialized hardware to interpolate between scan lines and perform the desired greyscale manipulation on recall. Image archiving for cardiology and radiology would require data transfer at this high rate to preserve temporal (cardiology) and spatial (radiology) information.
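The transfer-time and bandwidth figures quoted above can be reproduced with a few lines of arithmetic. The image and frame dimensions are taken directly from the abstract; the note about serial framing overhead is an added assumption.

```python
# Reproducing the figures quoted in the abstract above.

# Static B-scan image: 512 x 512 pixels, 6 bits deep.
static_bits = 512 * 512 * 6
print(f"Raw serial transfer at 9600 baud: {static_bits / 9600 / 60:.1f} min")
# ~2.7 min at the raw line rate; with start/stop-bit framing this approaches
# the "about three minutes" quoted in the abstract.

print(f"Storage at one byte per pixel: {512 * 512 // 1024} KB")     # 256 KB

# Real-time imaging: 100 scan lines x 480 points x 6 bits, 30 frames/s.
realtime_bps = 100 * 480 * 6 * 30
print(f"Required real-time rate: {realtime_bps / 1e6:.2f} Mbaud")   # ~8.6 Mbaud
```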
One-Dimensional Signal Extraction Of Paper-Written ECG Image And Its Archiving
NASA Astrophysics Data System (ADS)
Zhang, Zhi-ni; Zhang, Hong; Zhuang, Tian-ge
1987-10-01
A method for converting paper-written electrocardiograms to one-dimensional (1-D) signals for archival storage on floppy disk is presented here. Appropriate image processing techniques were employed to remove the background noise inherent to ECG recorder charts and to reconstruct the ECG waveform. The entire process consists of (1) digitization of paper-written ECGs with an image processing system via a TV camera; (2) image preprocessing, including histogram filtering and binary image generation; (3) ECG feature extraction and ECG wave tracing; and (4) transmission of the processed ECG data to IBM-PC compatible floppy disks for storage and retrieval. The algorithms employed here may also be used in the recognition of paper-written EEG or EMG and may be useful in robotic vision.
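The binarization and wave-tracing steps can be illustrated with a short NumPy sketch on a synthetic chart image. This is a generic column-tracing approach, not the authors' exact algorithm.

```python
# Threshold the scanned chart image, then walk across columns taking the mean
# row position of the dark trace pixels as the 1-D ECG sample for that column.
import numpy as np

# Synthetic "scanned chart": white background (255) with a dark sine trace.
h, w = 200, 600
img = np.full((h, w), 255, dtype=np.uint8)
rows = (100 + 60 * np.sin(np.linspace(0, 6 * np.pi, w))).astype(int)
img[rows, np.arange(w)] = 20

# (1) Binarize: pixels darker than a threshold belong to the trace.
binary = img < 128

# (2) Trace extraction, column by column.
signal = []
for x in range(w):
    trace_rows = np.flatnonzero(binary[:, x])
    # mean row of dark pixels; flip so "up" on the chart is positive
    signal.append(h - trace_rows.mean() if trace_rows.size else np.nan)
signal = np.array(signal)

print(signal[:5])          # first few reconstructed samples
```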
Improving Image Drizzling in the HST Archive: Advanced Camera for Surveys
NASA Astrophysics Data System (ADS)
Hoffmann, Samantha L.; Avila, Roberto J.
2017-06-01
The Mikulski Archive for Space Telescopes (MAST) pipeline performs geometric distortion corrections, associated image combinations, and cosmic ray rejections with AstroDrizzle on Hubble Space Telescope (HST) data. The MDRIZTAB reference table contains a list of relevant parameters that controls this program. This document details our photometric analysis of Advanced Camera for Surveys Wide Field Channel (ACS/WFC) data processed by AstroDrizzle. Based on this analysis, we update the MDRIZTAB table to improve the quality of the drizzled products delivered by MAST.
Development of Time-Series Human Settlement Mapping System Using Historical Landsat Archive
NASA Astrophysics Data System (ADS)
Miyazaki, H.; Nagai, M.; Shibasaki, R.
2016-06-01
A methodology for automated human settlement mapping is much needed so that historical satellite data archives can be used to address urgent issues of urban growth at the global scale, such as disaster risk management, public health, food security, and urban management. Now that several initiatives using ASTER, Landsat, and TerraSAR-X have produced global data with spatial resolutions of 10-100 m, the next goal is the development of time-series data that can support studies of urban development against the background of socioeconomy, disaster risk management, public health, transport and other development issues. We developed an automated algorithm to detect human settlement by classifying built-up and non-built-up areas in time-series Landsat images. A machine learning algorithm, Learning with Local and Global Consistency (LLGC), was applied with improvements for remote sensing data. The algorithm enables the use of MCD12Q1, a MODIS-based global land cover map with 500-m resolution, as training data, so that no manual process is required to prepare training data. In addition, we designed the method to composite multiple LLGC results into a single output to reduce uncertainty. The LLGC results have a confidence value ranging from 0.0 to 1.0 representing the probability of built-up or non-built-up. The median of the confidence values for a certain period around a target time is expected to be a robust measure for identifying built-up or non-built-up areas in the face of uncertainties in satellite data quality, such as cloud and haze contamination. Four scenes of Landsat data for each target year, 1990, 2000, 2005, and 2010, were chosen from the Landsat archive data with cloud contamination of less than 20%. We developed a system with these algorithms on the Data Integration and Analysis System (DIAS) at the University of Tokyo and processed 5200 scenes of Landsat data for cities with more than one million people worldwide.
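The compositing step described above amounts to a pixel-wise median over per-scene confidence maps followed by a threshold. A minimal NumPy sketch follows, with random arrays standing in for the LLGC outputs and a 0.5 threshold used as an assumption.

```python
# Composite several per-scene confidence maps (values 0.0-1.0) around a target
# year by taking the pixel-wise median, then threshold into a built-up mask.
import numpy as np

rng = np.random.default_rng(0)
confidence_maps = rng.random((4, 256, 256))      # 4 scenes around one target year

composite = np.median(confidence_maps, axis=0)   # robust to a single odd scene
built_up = composite >= 0.5                      # probability of built-up

print("built-up fraction:", built_up.mean())
```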
Stewardship of very large digital data archives
NASA Technical Reports Server (NTRS)
Savage, Patric
1991-01-01
An archive is a permanent store. There are relatively few very large digital data archives in existence. Most business records are expired within five or ten years. Many kinds of business records that do have long lives are embedded in data bases that are continually updated and re-issued cyclically. Also, a great many permanent business records are actually archived as microfilm, fiche, or optical disk images - their digital version being an operational convenience rather than an archive. The problems foreseen in stewarding the very large digital data archives that will accumulate during the mission of the Earth Observing System (EOS) are addressed. The focus is on the function of shepherding archived digital data into an endless future. Stewardship entails storing and protecting the archive and providing meaningful service to the community of users. The steward will (1) provide against loss due to physical phenomena; (2) assure that data is not lost due to storage technology obsolescence; and (3) maintain data in a current formatting methodology.
Enhancement of real-time EPICS IOC PV management for the data archiving system
NASA Astrophysics Data System (ADS)
Kim, Jae-Ha
2015-10-01
For the operation of a 100-MeV linear proton accelerator, the major driving values and experimental data need to be archived. Different data are required depending on the experimental conditions, so functions that can add new data and delete data in real time need to be implemented. In an experimental physics and industrial control system (EPICS) input output controller (IOC), the values of process variables (PVs) are matched with the driving values and data. The PV values are archived in text file format by using the channel archiver; there is no need to create a database (DB) server, just a large hard disk. Through the web, the archived data can be loaded, and new PV values can be archived without stopping the archive engine. The details of the implementation of a data archiving system with the channel archiver are presented, and some preliminary results are reported.
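A minimal sketch of the PV-to-text-file archiving idea follows, assuming the pyepics client library and placeholder PV names; it is not the accelerator's real record set or the channel archiver's internals.

```python
# Append PV values to a plain text file as they change; new PVs can be added
# while the process keeps running, mirroring the "no restart" requirement.
import time
from epics import PV

ARCHIVE_FILE = "pv_archive.txt"
PV_NAMES = ["LINAC:BEAM:Current", "LINAC:RF:Power"]    # hypothetical PVs

def on_change(pvname=None, value=None, timestamp=None, **kw):
    # One line per update: epoch timestamp, PV name, value.
    with open(ARCHIVE_FILE, "a") as f:
        f.write(f"{timestamp}\t{pvname}\t{value}\n")

pvs = [PV(name, callback=on_change) for name in PV_NAMES]

# A new PV can be added without stopping the "archive engine":
pvs.append(PV("LINAC:VAC:Pressure", callback=on_change))

while True:            # keep the process alive so callbacks keep firing
    time.sleep(1.0)
```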
A review of images of nurses and smoking on the World Wide Web.
Sarna, Linda; Bialous, Stella Aguinaga
2012-01-01
With the advent of the World Wide Web, historic images previously having limited distributions are now widely available. As tobacco use has evolved, so have images of nurses related to smoking. Using a systematic search, the purpose of this article is to describe types of images of nurses and smoking available on the World Wide Web. Approximately 10,000 images of nurses and smoking published over the past century were identified through search engines and digital archives. Seven major themes were identified: nurses smoking, cigarette advertisements, helping patients smoke, "naughty" nurse, teaching women to smoke, smoking in and outside of health care facilities, and antitobacco images. The use of nursing images to market cigarettes was known but the extent of the use of these images has not been reported previously. Digital archives can be used to explore the past, provide a perspective for understanding the present, and suggest directions for the future in confronting negative images of nursing. Copyright © 2012 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Cunha, George M.
This Records and Archives Management Programme (RAMP) study is intended to assist in the development of basic training programs and courses in document preservation and restoration, and to promote harmonization of such training both within the archival profession and within the broader information field. Based on the assumption that conservation…
Digital Archive Issues from the Perspective of an Earth Science Data Producer
NASA Technical Reports Server (NTRS)
Barkstrom, Bruce R.
2004-01-01
Contents include the following: Introduction. A Producer Perspective on Earth Science Data. Data Producers as Members of a Scientific Community. Some Unique Characteristics of Scientific Data. Spatial and Temporal Sampling for Earth (or Space) Science Data. The Influence of the Data Production System Architecture. The Spatial and Temporal Structures Underlying Earth Science Data. Earth Science Data File (or Relation) Schemas. Data Producer Configuration Management Complexities. The Topology of Earth Science Data Inventories. Some Thoughts on the User Perspective. Science Data User Communities. Spatial and Temporal Structure Needs of Different Users. User Spatial Objects. Data Search Services. Inventory Search. Parameter (Keyword) Search. Metadata Searches. Documentation Search. Secondary Index Search. Print Technology and Hypertext. Inter-Data Collection Configuration Management Issues. An Archive View. Producer Data Ingest and Production. User Data Searching and Distribution. Subsetting and Supersetting. Semantic Requirements for Data Interchange. Tentative Conclusions. An Object Oriented View of Archive Information Evolution. Scientific Data Archival Issues. A Perspective on the Future of Digital Archives for Scientific Data. References Index for this paper.
Digital Image Support in the ROADNet Real-time Monitoring Platform
NASA Astrophysics Data System (ADS)
Lindquist, K. G.; Hansen, T. S.; Newman, R. L.; Vernon, F. L.; Nayak, A.; Foley, S.; Fricke, T.; Orcutt, J.; Rajasekar, A.
2004-12-01
The ROADNet real-time monitoring infrastructure has allowed researchers to integrate geophysical monitoring data from a wide variety of signal domains. Antelope-based data transport, relational-database buffering and archiving, backup/replication/archiving through the Storage Resource Broker, and a variety of web-based distribution tools create a powerful monitoring platform. In this work we discuss our use of the ROADNet system for the collection and processing of digital image data. Remote cameras have been deployed at approximately 32 locations as of September 2004, including the SDSU Santa Margarita Ecological Reserve, the Imperial Beach pier, and the Pinon Flats geophysical observatory. Fire monitoring imagery has been obtained through a connection to the HPWREN project. Near-real-time images obtained from the R/V Roger Revelle include records of seafloor operations by the JASON submersible, as part of a maintenance mission for the H2O underwater seismic observatory. We discuss acquisition mechanisms and the packet architecture for image transport via Antelope orbservers, including multi-packet support for arbitrarily large images. Relational database storage supports archiving of timestamped images, image-processing operations, grouping of related images and cameras, support for motion-detect triggers, thumbnail images, pre-computed video frames, support for time-lapse movie generation and storage of time-lapse movies. Available ROADNet monitoring tools include both orbserver-based display of incoming real-time images and web-accessible searching and distribution of images and movies driven by the relational database (http://mercali.ucsd.edu/rtapps/rtimbank.php). An extension to the Kepler Scientific Workflow System also allows real-time image display via the Ptolemy project. Custom time-lapse movies may be made from the ROADNet web pages.
Between Oais and Agile a Dynamic Data Management Approach
NASA Astrophysics Data System (ADS)
Bennett, V. L.; Conway, E. A.; Waterfall, A. M.; Pepler, S.
2015-12-01
In this paper we describe an approach to the integration of existing archival activities which lies between compliance with the more rigid OAIS/TRAC standards and a more flexible "Agile" approach to the curation and preservation of Earth Observation data. We provide a high-level overview of existing practice and discuss how these procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. While processes are considered, they are not statically defined but rather driven by human interactions, in the form of risk management and review procedures that produce actionable plans which are responsive to change. We then describe the feasibility testing of extended risk management and planning procedures that integrate current practices. This was done through the CEDA Archival Format Audit, which inspected British Atmospheric Data Centre and NERC Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 Petabytes of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the format-based risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We continue by presenting a dynamic data management information model and discuss the following core model entities and their relationships: Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives; Risk entities, which act as drivers for change within the data lifecycle and include Acquisitional, Technical, Strategic and External Risks; Plan entities, which detail the actions to bring about change within an archive and include Acquisition Plans, Preservation Plans and Monitoring Plans that support responsive interactions with the community; and Result entities, which describe the outcomes of the plans, including Acquisitions, Mitigations and Accepted Risks, with risk acceptance permitting imperfect but functional solutions that can realistically be supported within an archive's resource levels.
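The entity relationships listed above can be sketched as plain data classes; the field names and example values below are illustrative, not the authors' formal model.

```python
# Minimal sketch of the information model: preservation objectives attach to
# data entities, risks drive plans, and plans produce results.
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class RiskKind(Enum):
    ACQUISITIONAL = "acquisitional"
    TECHNICAL = "technical"
    STRATEGIC = "strategic"
    EXTERNAL = "external"

@dataclass
class PreservationObjective:
    description: str                 # e.g. "keep usable in a supported format"

@dataclass
class DataEntity:                    # aspirational entity
    name: str
    objectives: List[PreservationObjective] = field(default_factory=list)

@dataclass
class Risk:                          # driver for change within the lifecycle
    kind: RiskKind
    description: str
    entity: DataEntity

@dataclass
class Plan:                          # actions to bring about change
    risk: Risk
    actions: List[str]

@dataclass
class Result:                        # outcome: acquisition, mitigation, or accepted risk
    plan: Plan
    outcome: str

dataset = DataEntity("EO archive holdings (legacy format)",
                     [PreservationObjective("remain readable by current tools")])
risk = Risk(RiskKind.TECHNICAL, "format obsolescence", dataset)
plan = Plan(risk, ["migrate to a supported format", "monitor tool support annually"])
print(Result(plan, "mitigated"))
```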
Architecture for a PACS primary diagnosis workstation
NASA Astrophysics Data System (ADS)
Shastri, Kaushal; Moran, Byron
1990-08-01
A major factor in determining the overall utility of a medical Picture Archiving and Communications System (PACS) is the functionality of the diagnostic workstation. Meyer-Ebrecht and Wendler [1] have proposed a modular picture computer architecture with high throughput, and Perry et al. [2] have defined performance requirements for radiology workstations. In order to be clinically useful, a primary diagnosis workstation must not only provide the functions of current viewing systems (e.g. mechanical alternators [3,4]), such as acceptable image quality, simultaneous viewing of multiple images, and rapid switching of image banks, but must also provide a diagnostic advantage over the current systems. This includes window-level functions on any image, simultaneous display of multi-modality images, rapid image manipulation, image processing, dynamic image display (cine), electronic image archival, hardcopy generation, image acquisition, network support, and an easy user interface. Implementation of such a workstation requires an underlying hardware architecture which provides high-speed image transfer channels, local storage facilities, and image processing functions. This paper describes the hardware architecture of the Siemens Diagnostic Reporting Console (DRC), which meets these requirements.
The public cancer radiology imaging collections of The Cancer Imaging Archive.
Prior, Fred; Smith, Kirk; Sharma, Ashish; Kirby, Justin; Tarbox, Lawrence; Clark, Ken; Bennett, William; Nolan, Tracy; Freymann, John
2017-09-19
The Cancer Imaging Archive (TCIA) is the U.S. National Cancer Institute's repository for cancer imaging and related information. TCIA contains 30.9 million radiology images representing data collected from approximately 37,568 subjects. This data is organized into collections by tumor-type with many collections also including analytic results or clinical data. TCIA staff carefully de-identify and curate all incoming collections prior to making the information available via web browser or programmatic interfaces. Each published collection within TCIA is assigned a Digital Object Identifier that references the collection. Additionally, researchers who use TCIA data may publish the subset of information used in their analysis by requesting a TCIA generated Digital Object Identifier. This data descriptor is a review of a selected subset of existing publicly available TCIA collections. It outlines the curation and publication methods employed by TCIA and makes available 15 collections of cancer imaging data.
EIR: enterprise imaging repository, an alternative imaging archiving and communication system.
Bian, Jiang; Topaloglu, Umit; Lane, Cheryl
2009-01-01
The enormous number of studies performed at the Nuclear Medicine Department of the University of Arkansas for Medical Sciences (UAMS) generates a huge volume of PET/CT images daily. A DICOM workstation had been used as a "mini-PACS" to route all studies, an arrangement that has historically proven slow for various reasons. However, replacing the workstation with a commercial PACS server is not only cost-inefficient; more often than not, PACS vendors are also reluctant to take responsibility for the final integration of these components. Therefore, in this paper, we propose an alternative image archiving and communication system called the Enterprise Imaging Repository (EIR). EIR consists of two distinct components: an image processing daemon and a user-friendly web interface. EIR not only reduces the overall waiting time for transferring a study from the modalities to radiologists' workstations, but also provides a more suitable presentation.
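The image-processing-daemon half of such a system can be sketched as a DICOM Storage SCP using the pynetdicom and pydicom packages as an assumption. The port, AE title and target directory are placeholders, and this is not the EIR implementation.

```python
# A minimal DICOM Storage SCP: accept studies from the modalities and write
# each received instance to disk for later web access.
import os
from pynetdicom import AE, evt, AllStoragePresentationContexts

ARCHIVE_DIR = "/data/eir_incoming"         # placeholder location

def handle_store(event):
    """Write each received instance under its SOP Instance UID."""
    ds = event.dataset
    ds.file_meta = event.file_meta
    os.makedirs(ARCHIVE_DIR, exist_ok=True)
    ds.save_as(os.path.join(ARCHIVE_DIR, f"{ds.SOPInstanceUID}.dcm"),
               write_like_original=False)
    return 0x0000                           # Success status

ae = AE(ae_title="EIR_SKETCH")
ae.supported_contexts = AllStoragePresentationContexts

# Blocks and serves C-STORE requests on port 11112.
ae.start_server(("0.0.0.0", 11112), evt_handlers=[(evt.EVT_C_STORE, handle_store)])
```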
TCIApathfinder: an R client for The Cancer Imaging Archive REST API.
Russell, Pamela; Fountain, Kelly; Wolverton, Dulcy; Ghosh, Debashis
2018-06-05
The Cancer Imaging Archive (TCIA) hosts publicly available de-identified medical images of cancer from over 25 body sites and over 30,000 patients. Over 400 published studies have utilized freely available TCIA images. Images and metadata are available for download through a web interface or a REST API. Here we present TCIApathfinder, an R client for the TCIA REST API. TCIApathfinder wraps API access in user-friendly R functions that can be called interactively within an R session or easily incorporated into scripts. Functions are provided to explore the contents of the large database and to download image files. TCIApathfinder provides easy access to TCIA resources in the highly popular R programming environment. TCIApathfinder is freely available under the MIT license as a package on CRAN (https://cran.r-project.org/web/packages/TCIApathfinder/index.html) and at https://github.com/pamelarussell/TCIApathfinder. Copyright ©2018, American Association for Cancer Research.
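For readers working outside R, the same REST API can be reached with a generic HTTP client. The sketch below assumes the v4 "query" endpoint names that have been publicly documented for TCIA; the exact base URL and paths are an assumption here and may have changed.

```python
# Minimal TCIA REST access with the requests package: list collections,
# list series in a collection, and download one series as a zip archive.
import requests

BASE = "https://services.cancerimagingarchive.net/services/v4/TCIA/query"

def get_collections():
    r = requests.get(f"{BASE}/getCollectionValues", params={"format": "json"})
    r.raise_for_status()
    return [c["Collection"] for c in r.json()]

def get_series(collection):
    r = requests.get(f"{BASE}/getSeries",
                     params={"Collection": collection, "format": "json"})
    r.raise_for_status()
    return r.json()

def download_series(series_uid, out_path="series.zip"):
    # getImage returns a zip archive of the DICOM files in the series.
    r = requests.get(f"{BASE}/getImage",
                     params={"SeriesInstanceUID": series_uid}, stream=True)
    r.raise_for_status()
    with open(out_path, "wb") as f:
        for chunk in r.iter_content(chunk_size=1 << 20):
            f.write(chunk)

print(get_collections()[:5])      # a few public collection names
```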
Scientific and technical photography at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Davidhazy, Andrew
1994-01-01
As part of my assignment connected with the Scientific and Technical Photography Lab (STPL) at the NASA Langley Research Center, I conducted a series of interviews and observed the day-to-day operations of the STPL, with the ultimate objective of gaining first-hand exposure to a scientific and technical photo/imaging department of the kind my school prepares its graduates for. I was also asked to share my observations with the staff so that these comments and observations might help the STPL better serve its customers. Meetings with several individuals responsible for various wind tunnels and with a group that provides photo-optical instrumentation services at the Center gave me an overview of the services provided by the Lab and possible areas for development. In summary form, these are some of the observations that resulted from the interviews and daily contact with the STPL facility. (1) The STPL is perceived as a valuable and almost indispensable service group within the organization. This comment was invariably made by everyone, and everyone also seemed to support the idea that the STPL continue to provide its current level of service and quality. (2) The STPL generally is not perceived to be a highly technically oriented group, but rather as a provider of high-quality photographic illustration and documentation services. In spite of the importance and high marks assigned to the STPL, there are several observations that merit consideration and evaluation for possible inclusion in the STPL's scope of expertise and future operating practices. (1) While the care and concern for artistic rendition of subjects is seen as laudable and sometimes valuable, the time that this often requires is seen as interfering with keeping the tunnels operating at maximum productivity. Tunnel managers would like to shorten down-time due to photography and have services available during evening hours and on short notice. It may be of interest to the STPL that tunnel managers are incorporating ever greater imaging capabilities in their facilities; to some extent this could mean a reduced demand for traditional photographic services. (2) The photographic archive is seen as a Center resource. Archiving of images, as well as data, is a matter of concern to the investigators. The early holdings of the Photographic Archives are quickly deteriorating, and the relative inaccessibility of the material held in the archives is problematic. (3) In certain cases, delivery or preparation of digital image files instead of, or along with, hardcopy is already perceived by the STPL's customers as desirable. The STPL should make this option available, and make the fact that it has, or will have, this capability widely known. (4) The STPL needs to continue to provide expert advice and technical imaging support, in terms of application information, to users of traditional photographic and new electronic imaging systems. Cooperative demo projects might be undertaken to maintain or improve the capabilities of the Lab. (5) STPL personnel do not yet have significant electronic imaging or electronic communication skills, and improvements in this area could potentially have a positive impact on the Center. (6) High-speed photographic or imaging services are often mentioned by the STPL as being of primary importance to their mission, but the lab supports very few projects calling for high-speed imaging services, and much of the high-speed equipment is in a poor state of repair.
It is interesting to note that when the operation of lasers, digital imaging or quantitative techniques is requested, these requests are directed to another NASA department. Could joint activities be initiated to solve problems? (7) The STPL could acquire more technical assignments if examples of the areas where it possesses expertise were circulated around the Center. The fact that the STPL owns high-speed video capability could be 'advertised' among its customer base if there truly is an interest in building up a customer base in this area. The STPL could participate in events like TOPS as an exhibitor, as well as a documenter, of the event.
NASA Astrophysics Data System (ADS)
Chao, Woodrew; Ho, Bruce K. T.; Chao, John T.; Sadri, Reza M.; Huang, Lu J.; Taira, Ricky K.
1995-05-01
Our tele-medicine/PACS archive system is based on a three-tier distributed hierarchical architecture, including magnetic disk farms, optical jukebox, and tape jukebox sub-systems. The hierarchical storage management (HSM) architecture, built around a low-cost, high-performance platform [personal computers (PC) and Microsoft Windows NT], presents a very scalable and distributed solution ideal for meeting the needs of client/server environments such as tele-medicine, tele-radiology, and PACS. These image-based systems typically require storage capacities mirroring those of film-based technology (multi-terabyte with 10+ years of storage) and patient data retrieval times at near on-line performance as demanded by radiologists. With the scalable architecture, storage requirements can be easily configured to meet the needs of the small clinic (multi-gigabyte) up to those of a major hospital (multi-terabyte). The patient data retrieval performance requirement was achieved by employing system intelligence to manage migration and caching of archived data. Relevant information from HIS/RIS triggers prefetching of data whenever possible based on simple rules. System intelligence embedded in the migration manager allows the clustering of patient data onto a single tape during data migration from the optical to the tape medium. Clustering of patient data on the same tape eliminates multiple tape loadings and the associated seek time during patient data retrieval. Optimal tape performance can then be achieved by utilizing the tape drive's high-performance data-streaming capabilities, thereby reducing the data retrieval delays typically associated with streaming tape devices.
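The tape-clustering policy described above can be sketched as a simple grouping-and-packing routine. The study records and tape capacity below are invented for illustration and do not reflect the system's actual migration manager.

```python
# Group studies by patient before packing them onto tape volumes, so that a
# later retrieval of one patient's history needs only a single tape load.
from collections import defaultdict

TAPE_CAPACITY_MB = 50_000        # hypothetical capacity per tape volume

studies = [                      # (patient_id, study_id, size_mb)
    ("P001", "CT-1998-01", 400), ("P001", "MR-1999-02", 650),
    ("P002", "CT-1997-07", 380), ("P001", "CT-2000-03", 420),
    ("P003", "MR-1998-11", 700), ("P002", "CT-1999-05", 390),
]

# Cluster by patient first, then pack whole patients onto tapes in order.
by_patient = defaultdict(list)
for patient, study, size in studies:
    by_patient[patient].append((study, size))

tapes, current, used = [], [], 0
for patient, patient_studies in sorted(by_patient.items()):
    patient_size = sum(size for _, size in patient_studies)
    if used + patient_size > TAPE_CAPACITY_MB and current:
        tapes.append(current)        # start a new tape volume
        current, used = [], 0
    current.append((patient, patient_studies))
    used += patient_size
if current:
    tapes.append(current)

for i, tape in enumerate(tapes, 1):
    print(f"tape {i}: {[p for p, _ in tape]}")
```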
NASA Technical Reports Server (NTRS)
Simpson, James J.; Harkins, Daniel N.
1993-01-01
Historically, locating and browsing satellite data has been a cumbersome and expensive process. This has impeded the efficient and effective use of satellite data in the geosciences. SSABLE is a new interactive tool for the archive, browse, order, and distribution of satellite data based upon X Window, high-bandwidth networks, and digital image rendering techniques. SSABLE provides for automatically constructing relational database queries to archived image datasets based on time, date, geographical location, and other selection criteria. SSABLE also provides a visual representation of the selected archived data for viewing on the user's X terminal. SSABLE is a near real-time system; for example, data are added to SSABLE's database within 10 min after capture. SSABLE is network and machine independent; it will run identically on any machine which satisfies the following three requirements: 1) has a bitmapped display (monochrome or greater); 2) is running the X Window system; and 3) is on a network directly reachable by the SSABLE system. SSABLE has been evaluated at over 100 international sites. Network response time in the United States and Canada varies between 4 and 7 s for browse image updates; reported transmission times to Europe and Australia typically are 20-25 s.
Integration Of An MR Image Network Into A Clinical PACS
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Mankovich, Nicholas J.; Taira, Ricky K.; Cho, Paul S.; Huang, H. K.
1988-06-01
A direct link between a clinical pediatric PACS module and a FONAR MRI image network was implemented. The original MR network combines the MR scanner, a remote viewing station and a central archiving station. The pediatric PACS directly connects to the archiving unit through an Ethernet TCP-IP network adhering to FONAR's protocol. The PACS communication software developed supports the transfer of patient studies and patient information directly from the MR archive database to the pediatric PACS. In the first phase of our project we developed a package to transfer data between a VAX-11/750 and the IBM PC/AT-based MR archive database through the Ethernet network. This system served as a model for PACS-to-modality network communication. Once testing was complete on this research network, the software and network hardware were moved to the clinical pediatric VAX for full PACS integration. In parallel with the direct transmission of digital images to the pediatric PACS, a broadband communication system in video format was developed for real-time broadcasting of images originating from the MR console to 8 remote viewing stations distributed in the radiology department. These analog viewing stations allow the radiologists to directly monitor patient positioning and to select the scan levels during a patient examination from remote locations in the radiology department. This paper reports (1) the technical details of this implementation, (2) the merits of this network development scheme, and (3) the performance statistics of the network-to-PACS interface.
Synergy with HST and JWST Data Management Systems
NASA Astrophysics Data System (ADS)
Greene, Gretchen; Space Telescope Data Management Team
2014-01-01
The data processing and archive systems for the JWST will contain a petabyte of science data and the best news is that users will have fast access to the latest calibrations through a variety of new services. With a synergistic approach currently underway with the STScI science operations between the Hubble Space Telescope and James Webb Space Telescope data management subsystems (DMS), operational verification is right around the corner. Next year the HST archive will provide scientists on-demand fully calibrated data products via the Mikulski Archive for Space Telescopes (MAST), which takes advantage of an upgraded DMS. This enhanced system, developed jointly with the JWST DMS is based on a new CONDOR distributed processing system capable of reprocessing data using a prioritization queue which runs in the background. A Calibration Reference Data System manages the latest optimal configuration for each scientific instrument pipeline. Science users will be able to search and discover the growing MAST archive calibrated datasets from these missions along with the other multiple mission holdings both local to MAST and available through the Virtual Observatory. JWST data systems will build upon the successes and lessons learned from the HST legacy and move us forward into the next generation of multi-wavelength archive research.
Document image archive transfer from DOS to UNIX
NASA Technical Reports Server (NTRS)
Hauser, Susan E.; Gill, Michael J.; Thoma, George R.
1994-01-01
An R&D division of the National Library of Medicine has developed a prototype system for automated document image delivery as an adjunct to the labor-intensive manual interlibrary loan service of the library. The document image archive is implemented by a PC controlled bank of optical disk drives which use 12 inch WORM platters containing bitmapped images of over 200,000 pages of medical journals. Following three years of routine operation which resulted in serving patrons with articles both by mail and fax, an effort is underway to relocate the storage environment from the DOS-based system to a UNIX-based jukebox whose magneto-optical erasable 5 1/4 inch platters hold the images. This paper describes the deficiencies of the current storage system, the design issues of modifying several modules in the system, the alternatives proposed and the tradeoffs involved.
Fiber Optic Communication System For Medical Images
NASA Astrophysics Data System (ADS)
Arenson, Ronald L.; Morton, Dan E.; London, Jack W.
1982-01-01
This paper discusses a fiber optic communication system linking ultrasound devices, computerized tomography scanners, a nuclear medicine computer system, and a digital fluorographic system to a central radiology research computer. These centrally archived images are available for near-instantaneous recall at various display consoles. When a suitable laser optical disk is available for mass storage, more extensive image archiving will be added to the network, including digitized images of standard radiographs for comparison purposes and for remote display in such areas as the intensive care units, the operating room, and selected outpatient departments. This fiber optic system allows for the transfer of high-resolution images in less than a second over distances exceeding 2,000 feet. The advantages of using fiber optic cables instead of typical parallel or serial communication techniques will be described. The switching methodology and communication protocols will also be discussed.
Use of multidimensional, multimodal imaging and PACS to support neurological diagnoses
NASA Astrophysics Data System (ADS)
Wong, Stephen T. C.; Knowlton, Robert C.; Hoo, Kent S.; Huang, H. K.
1995-05-01
Technological advances in brain imaging have revolutionized diagnosis in neurology and neurological surgery. Major imaging techniques include magnetic resonance imaging (MRI) to visualize structural anatomy, positron emission tomography (PET) to image metabolic function and cerebral blood flow, magnetoencephalography (MEG) to visualize the location of physiologic current sources, and magnetic resonance spectroscopy (MRS) to measure specific biochemicals. Each of these techniques studies different biomedical aspects of the brain, but an effective means to quantify and correlate the disparate imaging datasets, and thereby improve clinical decision-making processes, has been lacking. This paper describes several techniques developed in a UNIX-based neurodiagnostic workstation to aid the noninvasive presurgical evaluation of epilepsy patients. These techniques include online access to the picture archiving and communication system (PACS) multimedia archive, coregistration of multimodality image datasets, and correlation and quantitation of the structural and functional information contained in the registered images. For illustration, we describe the use of these techniques in a patient case of nonlesional neocortical epilepsy. We also present our future work based on preliminary studies.
NASA Astrophysics Data System (ADS)
Bardon, Tiphaine; May, Robert K.; Jackson, J. Bianca; Beentjes, Gabriëlle; de Bruin, Gerrit; Taday, Philip F.; Strlič, Matija
2017-04-01
This study aims to objectively inform curators when terahertz time-domain (TD) imaging set in reflection mode is likely to give well-contrasted images of inscriptions in a complex archival document and is a useful non-invasive alternative to current digitisation processes. To this end, the dispersive refractive indices and absorption coefficients from various archival materials are assessed and their influence on contrast in terahertz images from historical documents is explored. Sepia ink and inks produced with bistre or verdigris mixed with a solution of Arabic gum or rabbit skin glue are unlikely to lead to well-contrasted images. However, dispersions of bone black, ivory black, iron gall ink, malachite, lapis lazuli, minium and vermilion are likely to lead to well-contrasted images. Inscriptions written with lamp black, carbon black and graphite give the best imaging results. The characteristic spectral signatures from iron gall ink, minium and vermilion pellets between 5 and 100 cm-1 relate to a ringing effect at late collection times in TD waveforms transmitted through these pellets. The same ringing effect can be probed in waveforms reflected from iron gall, minium and vermilion ink deposits at the surface of a document. Since TD waveforms collected for each scanning pixel can be Fourier-transformed into spectral information, terahertz TD imaging in reflection mode can serve as a hyperspectral imaging tool. However, chemical recognition and mapping of the ink is currently limited by the fact that the morphology of the document influences more the terahertz spectral response of the document than the resonant behaviour of the ink.
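The Fourier step that turns a time-domain waveform into spectral information is straightforward; a minimal NumPy sketch follows, with a synthetic damped oscillation standing in for the measured "ringing" echo rather than real terahertz data.

```python
# Transform a terahertz time-domain waveform from one scanning pixel into an
# amplitude spectrum, which is what makes the TD scan usable as hyperspectral data.
import numpy as np

dt_ps = 0.05                                    # sampling step, picoseconds
t = np.arange(0, 40, dt_ps)                     # 40 ps trace
waveform = np.exp(-t / 5.0) * np.cos(2 * np.pi * 1.5 * t)   # ringing at ~1.5 THz

spectrum = np.abs(np.fft.rfft(waveform))
freq_thz = np.fft.rfftfreq(t.size, d=dt_ps)     # frequencies in THz (time in ps)

peak = freq_thz[np.argmax(spectrum[1:]) + 1]    # ignore the DC bin
print(f"dominant spectral component: {peak:.2f} THz")
```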
NASA Technical Reports Server (NTRS)
Hughes, J.
1998-01-01
The Planetary Data System (PDS) is an active science data archive managed by scientists for NASA's planetary science community. With the advent of the World Wide Web, the majority of the archive has been placed online as a science digital library for access by scientists, the educational community, and the general public.
ERIC Educational Resources Information Center
McCleary, John M.
This Records and Archives Management Programme (RAMP) study covers the conservation of archival documents and the application of freeze-drying to the salvage of documents damaged by flood. Following an introductory discussion of the hazards of water, the study presents a broad summary of data on freeze-drying, including the behavior of…
ERIC Educational Resources Information Center
Miller, Warren; Tanter, Raymond
The International Relations Archive undertakes as its primary goals the acquisition, management and dissemination of international affairs data. The first document enclosed is a copy of the final machine readable codebook prepared for the data from the Political Events Project, 1948-1965. Also included is a copy of the final machine-readable…
Toward a National Computerized Database for Moving Image Materials.
ERIC Educational Resources Information Center
Gartenberg, Jon
This report summarizes a project conducted by a group of catalogers from film archives devoted to nitrate preservation, which explored ways of developing a database to provide a complete film and television information service that would be available nationwide and could contain filmographic data, information on holdings in archives and…
76 FR 43960 - NARA Records Reproduction Fees
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
... transferred to NARA and maintain its fee schedule on NARA's Web site http://www.archives.gov . The proposed... document is faint or too dark, it requires additional time to obtain a readable image. In TABLE 1 below... our Web site ( http://www.archives.gov ) annually when announcing that records reproduction fees will...
Spatial Metadata for Global Change Investigations Using Remote Sensing
NASA Technical Reports Server (NTRS)
Emerson, Charles W.; Quattrochi, Dale A.; Lam, Nina Siu-Ngan; Arnold, James E. (Technical Monitor)
2002-01-01
Satellite and aircraft-borne remote sensors have gathered petabytes of data over the past 30+ years. These images are an important resource for establishing cause and effect relationships between human-induced land cover changes and alterations in climate and other biophysical patterns at local to global scales. However, the spatial, temporal, and spectral characteristics of these datasets vary, thus complicating long-term studies involving several types of imagery. As the geographical and temporal coverage, the spectral and spatial resolution, and the number of individual sensors increase, the sheer volume and complexity of available data sets will complicate management and use of the rapidly growing archive of earth imagery. Mining this vast data resource for images that provide the necessary information for climate change studies becomes more difficult as more sensors are launched and more imagery is obtained.
How to make deposition of images a reality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guss, J. Mitchell, E-mail: mitchell.guss@sydney.edu.au; McMahon, Brian; School of Molecular Bioscience, The University of Sydney, Sydney, NSW 2006
2014-10-01
An analysis is performed of the technical and financial challenges to be overcome if deposition of primary experimental data is to become routine. The IUCr Diffraction Data Deposition Working Group is investigating the rationale and policies for routine deposition of diffraction images (and other primary experimental data sets). An information-management framework is described that should inform policy directions, and some of the technical and other issues that need to be addressed in an effort to achieve such a goal are analysed. In the near future, routine data deposition could be encouraged at one of the growing number of institutional repositories that accept data sets or at a generic data-publishing web repository service. To realise all of the potential benefits of depositing diffraction data, specialized archives would be preferable. Funding such an initiative will be challenging.
21 CFR 892.2050 - Picture archiving and communications system.
Code of Federal Regulations, 2014 CFR
2014-04-01
... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...
21 CFR 892.2050 - Picture archiving and communications system.
Code of Federal Regulations, 2011 CFR
2011-04-01
... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...
21 CFR 892.2050 - Picture archiving and communications system.
Code of Federal Regulations, 2012 CFR
2012-04-01
... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...
21 CFR 892.2050 - Picture archiving and communications system.
Code of Federal Regulations, 2013 CFR
2013-04-01
... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...
Web tools for large-scale 3D biological images and atlases
2012-01-01
Background Large-scale volumetric biomedical image data of three or more dimensions are a significant challenge for distributed browsing and visualisation. Many images now exceed 10 GB, which for most users is too large to handle in terms of computer RAM and network bandwidth. This is aggravated when users need to access tens or hundreds of such images from an archive. Here we solve the problem for 2D section views through archive data by delivering compressed tiled images, enabling users to browse through very large volume data in the context of a standard web browser. The system provides an interactive visualisation for grey-level and colour 3D images including multiple image layers and spatial-data overlay. Results The standard Internet Imaging Protocol (IIP) has been extended to enable arbitrary 2D sectioning of 3D data as well as multi-layered images and indexed overlays. The extended protocol is termed IIP3D and we have implemented a matching server to deliver the protocol and a series of Ajax/Javascript clients that run in an Internet browser. We have tested the server software on a low-cost Linux-based server for image volumes up to 135 GB and 64 simultaneous users. The section views are delivered with response times independent of scale and orientation. The exemplar client provided multi-layer image views with user-controlled colour-filtering and overlays. Conclusions Interactive browsing of arbitrary sections through large biomedical-image volumes is made possible by use of an extended internet protocol and efficient server-based image tiling. The tools open the possibility of enabling fast access to large image archives without the requirement of whole-image download and client computers with very large memory configurations. The system was demonstrated using a range of medical and biomedical image data extending up to 135 GB for a single image volume. PMID:22676296
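As a rough illustration of how a client requests views from such a tiling server, the sketch below fetches a resized JPEG over HTTP using the standard IIP query keys FIF, WID and CVT; the server URL and image path are placeholders, and the additional section-plane parameters introduced by the IIP3D extension are not reproduced here since their exact names are not given in the abstract.

```python
import requests

# Hypothetical IIP/IIP3D endpoint and image path on the server.
BASE = "http://example.org/fcgi-bin/iipsrv.fcgi"

params = {
    "FIF": "/data/embryo_volume.tif",  # image selected on the server side
    "WID": 512,                        # width in pixels of the returned view
    "CVT": "jpeg",                     # ask for a JPEG rendering
}

resp = requests.get(BASE, params=params, timeout=30)
resp.raise_for_status()
with open("section_view.jpg", "wb") as fh:
    fh.write(resp.content)  # the Ajax/Javascript client does the same in-browser
```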
NASA Astrophysics Data System (ADS)
Smith, Edward M.; Wandtke, John; Robinson, Arvin E.
1999-07-01
The selection criteria for the archive were based on the objectives of the Medical Information, Communication and Archive System (MICAS), a multi-vendor incremental approach to PACS. These objectives include interoperability between all components, seamless integration of the Radiology Information System (RIS) with MICAS and eventually other hospital databases, demonstration of DICOM compliance by all components prior to acceptance, and automated workflow that can be programmed to meet changes in the healthcare environment. The long-term multi-modality archive is being implemented in three or more phases, with the first phase designed to provide a 12- to 18-month storage solution. This decision was made because the cost per GB of storage is rapidly decreasing and the speed at which data can be retrieved is increasing with time. The open solution selected allows incorporation of leading-edge, 'best of breed' hardware and software and provides maximum flexibility of workflow both within and outside of radiology. The selected solution is media independent, supports multiple jukeboxes, provides expandable storage capacity and will provide redundancy and fault tolerance at minimal cost. Some of the required attributes of the archive include a scalable archive strategy, a virtual image database with global query and an object-oriented database. The selection process took approximately 10 months, with Cemax-Icon being the vendor selected. Prior to signing a purchase order, Cemax-Icon performed a site survey, agreed upon the acceptance test protocol and provided a written guarantee of connectivity between their archive and the imaging modalities and other MICAS components.
Multiphoton fluorescence lifetime imaging of chemotherapy distribution in solid tumors
NASA Astrophysics Data System (ADS)
Carlson, Marjorie; Watson, Adrienne L.; Anderson, Leah; Largaespada, David A.; Provenzano, Paolo P.
2017-11-01
Doxorubicin is a commonly used chemotherapeutic employed to treat multiple human cancers, including numerous sarcomas and carcinomas. Furthermore, doxorubicin possesses strong fluorescent properties that make it an ideal reagent for modeling drug delivery by examining its distribution in cells and tissues. However, while doxorubicin fluorescence and lifetime have been imaged in live tissue, its behavior in archival samples that frequently result from drug and treatment studies in human and animal patients, and murine models of human cancer, has to date been largely unexplored. Here, we demonstrate imaging of doxorubicin intensity and lifetimes in archival formalin-fixed paraffin-embedded sections from mouse models of human cancer with multiphoton excitation and multiphoton fluorescence lifetime imaging microscopy (FLIM). Multiphoton excitation imaging reveals robust doxorubicin emission in tissue sections and captures spatial heterogeneity in cells and tissues. However, quantifying the amount of doxorubicin signal in distinct cell compartments, particularly the nucleus, often remains challenging due to strong signals in multiple compartments. The addition of FLIM analysis to display the spatial distribution of excited state lifetimes clearly distinguishes between signals in distinct compartments such as the cell nuclei versus cytoplasm and allows for quantification of doxorubicin signal in each compartment. Furthermore, we observed a shift in lifetime values in the nuclei of transformed cells versus nontransformed cells, suggesting a possible diagnostic role for doxorubicin lifetime imaging to distinguish normal versus transformed cells. Thus, data here demonstrate that multiphoton FLIM is a highly sensitive platform for imaging doxorubicin distribution in normal and diseased archival tissues.
Archiving of interferometric radio and mm/submm data at the National Radio Astronomy Observatory
NASA Astrophysics Data System (ADS)
Lacy, Mark
2018-06-01
Modern radio interferometers such as ALMA and the VLA are capable of producing ~1 TB/day of data for processing into image products of comparable size. Besides the sheer volume of data, the products themselves can be complicated and are sometimes hard to map into standard astronomical archive metadata. We also face similar issues to those faced by archives at other wavelengths, namely the role of archives as the basis of reprocessing platforms and facilities, and the validation and ingestion of user-derived products. In this talk I shall discuss the plans of NRAO in these areas over the next decade.
USGS Releases Landsat Orthorectified State Mosaics
,
2005-01-01
The U.S. Geological Survey (USGS) National Remote Sensing Data Archive, located at the USGS Center for Earth Resources Observation and Science (EROS) in Sioux Falls, South Dakota, maintains the Landsat orthorectified data archive. Within the archive are Landsat Enhanced Thematic Mapper Plus (ETM+) data that have been pansharpened and orthorectified by the Earth Satellite Corporation. This imagery has acquisition dates ranging from 1999 to 2001 and was created to provide users with access to quality-screened, high-resolution satellite images with global coverage over the Earth's landmasses.
Clegg, G; Roebuck, S; Steedman, D
2001-01-01
Objectives—To develop a computer based storage system for clinical images—radiographs, photographs, ECGs, text—for use in teaching, training, reference and research within an accident and emergency (A&E) department. Exploration of methods to access and utilise the data stored in the archive. Methods—Implementation of a digital image archive using flatbed scanner and digital camera as capture devices. A sophisticated coding system based on ICD 10. Storage via an "intelligent" custom interface. Results—A practical solution to the problems of clinical image storage for teaching purposes. Conclusions—We have successfully developed a digital image capture and storage system, which provides an excellent teaching facility for a busy A&E department. We have revolutionised the practice of the "hand-over meeting". PMID:11435357
What Is A Picture Archiving And Communication System (PACS)?
NASA Astrophysics Data System (ADS)
Marceau, Carla
1982-01-01
A PACS is a digital system for acquiring, storing, moving and displaying picture or image information. It is an alternative to film jackets that has been made possible by recent breakthroughs in computer technology: telecommunications, local area nets and optical disks. The fundamental concept of the digital representation of image information is introduced. It is shown that freeing images from a material representation on film or paper leads to a dramatic increase in flexibility in our use of the images. The ultimate goal of a medical PACS system is a radiology department without film jackets. The inherent nature of digital images and the power of the computer allow instant free "copies" of images to be made and thrown away. These copies can be transmitted to distant sites in seconds, without the "original" ever leaving the archives of the radiology department. The result is a radiology department with much freer access to patient images and greater protection against lost or misplaced image information. Finally, images in digital form can be treated as data for the computer in image processing, which includes enhancement, reconstruction and even computer-aided analysis.
Extending the XNAT archive tool for image and analysis management in ophthalmology research
NASA Astrophysics Data System (ADS)
Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.
2013-03-01
In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases like macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests on the visual field. DICOM is not yet widely used in this domain, however, and images are frequently encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable but posed new challenges due to data types thus far not considered and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With appropriate project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
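The upload-script idea can be sketched against XNAT's generic REST interface, as below. This is only a hedged outline: the server URL, credentials, project identifiers, xsiType and file name are placeholders, and the paper's own abstract-level Python/C++ API is not reproduced here.

```python
import requests

# Hypothetical XNAT deployment and identifiers.
XNAT = "https://xnat.example.edu"
AUTH = ("username", "password")
project, subject, session = "OCTSTUDY", "SUBJ001", "SUBJ001_OCT1"

with requests.Session() as s:
    s.auth = AUTH
    # Create (or touch) the subject and session records via REST.
    s.put(f"{XNAT}/data/projects/{project}/subjects/{subject}")
    s.put(f"{XNAT}/data/projects/{project}/subjects/{subject}"
          f"/experiments/{session}?xsiType=xnat:otherDicomSessionData")
    # Attach a converted DICOM file as a resource of that session.
    with open("oct_scan.dcm", "rb") as fh:
        s.put(f"{XNAT}/data/projects/{project}/subjects/{subject}"
              f"/experiments/{session}/resources/DICOM/files/oct_scan.dcm"
              "?inbody=true",
              data=fh)
```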
The Hubble Spectroscopic Legacy Archive
NASA Astrophysics Data System (ADS)
Peeples, M.; Tumlinson, J.; Fox, A.; Aloisi, A.; Fleming, S.; Jedrzejewski, R.; Oliveira, C.; Ayres, T.; Danforth, C.; Keeney, B.; Jenkins, E.
2017-04-01
With no future space ultraviolet instruments currently planned, the data from the UV spectrographs aboard the Hubble Space Telescope have a legacy value beyond their initial science goals. The goal of the Hubble Spectroscopic Legacy Archive (HSLA) is to provide to the community new science-grade combined spectra for all publicly available data obtained by the Cosmic Origins Spectrograph (COS) and the Space Telescope Imaging Spectrograph (STIS). These data are packaged into "smart archives" according to target type and scientific themes to facilitate the construction of archival samples for common science uses. A new "quick look" capability makes the data easy for users to quickly access, assess the quality of, and download for archival science. The first generation of these products for the far-ultraviolet (FUV) modes of COS was made available online via the Mikulski Archive for Space Telescopes (MAST) in early 2016 and updated in early 2017; future releases will include COS/NUV and STIS/UV data.
Assessing the impact of a radiology information management system in the emergency department
NASA Astrophysics Data System (ADS)
Redfern, Regina O.; Langlotz, Curtis P.; Lowe, Robert A.; Horii, Steven C.; Abbuhl, Stephanie B.; Kundel, Harold L.
1998-07-01
Purpose: To evaluate a conventional radiology image management system by investigating information accuracy and information delivery, and to discuss the customization of a picture archival and communication system (PACS), integrated radiology information system (RIS) and hospital information system (HIS) for a high-volume emergency department (ED). Materials and Methods: Two data collection periods were completed. After the first data collection period, a change in work rules was implemented to improve the quality of data in the image headers. Data from the RIS, the ED information system, and the HIS, as well as observed time-motion data, were collected for patients admitted to the ED. Data accuracy, patient waiting times, and radiology exam information delivery were compared. Results: The percentage of examinations scheduled in the RIS by the technologists increased from 0% (0 of 213) during the first period to 14% (44 of 317) during the second (p < 0.001). The percentage of images missing identification numbers decreased from 36% (98 of 272) during the first data collection period to 10% (56 of 562) during the second period (p < 0.001). Conclusions: Radiologic services in a high-volume ED, requiring rapid service, present important challenges to a PACS. Strategies can be implemented to improve the accuracy and completeness of the data in PACS image headers in such an environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christopher Slominski
2009-10-01
Archiving a large fraction of the EPICS signals within the Jefferson Lab (JLAB) Accelerator control system is vital for postmortem and real-time analysis of the accelerator performance. This analysis is performed on a daily basis by scientists, operators, engineers, technicians, and software developers. Archiving poses unique challenges due to the magnitude of the control system. A MySQL Archiving system (Mya) was developed to scale to the needs of the control system; currently archiving 58,000 EPICS variables, updating at a rate of 11,000 events per second. In addition to the large collection rate, retrieval of the archived data must also be fast and robust. Archived data retrieval clients obtain data at a rate over 100,000 data points per second. Managing the data in a relational database provides a number of benefits. This paper describes an archiving solution that uses an open source database and standard off the shelf hardware to reach high performance archiving needs. Mya has been in production at Jefferson Lab since February of 2007.
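To make the relational layout concrete, the sketch below queries a hypothetical per-event table for one channel over a time window. Mya's actual schema, host name and credentials are not described in the abstract, so the table and column names here are assumptions about what such a retrieval client might look like.

```python
import mysql.connector  # pip install mysql-connector-python

# Hypothetical schema: one row per archived event (channel_id, event_time, value).
conn = mysql.connector.connect(host="archive-db.example.org",
                               user="reader", password="secret",
                               database="mya")
cur = conn.cursor()
cur.execute(
    """
    SELECT event_time, value
      FROM events
     WHERE channel_id = %s
       AND event_time BETWEEN %s AND %s
     ORDER BY event_time
    """,
    (12345, "2009-01-01 00:00:00", "2009-01-02 00:00:00"))
for event_time, value in cur:
    print(event_time, value)
cur.close()
conn.close()
```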
Potential time savings to radiology department personnel in a PACS-based environment
NASA Astrophysics Data System (ADS)
Saarinen, Allan O.; Wilson, M. C.; Iverson, Scott C.; Loop, John W.
1990-08-01
A purported benefit of digital imaging and archiving of radiographic procedures is the presumption of time savings to radiologists, radiology technologists, and radiology department personnel involved with processing films and managing the film file room. As part of the University of Washington's evaluation of Picture Archiving and Communication Systems (PACS) for the U.S. Army Medical Research and Development Command, a study was performed which evaluated the current operational practices of the film-based radiology department at the University of Washington Medical Center (UWMC). Industrial engineering time and motion studies were conducted to document the length of time required for film processing in various modalities, the proportion of the total exam time used for film processing, the amount of time radiologists spent searching for and looking at images, and the amount of time file room personnel spent collating reports, making loans, updating film jacket information, and purging files. This evaluation showed that better than one-half of the tasks in the file room may be eliminated with PACS and radiologists may save easily 10 percent of the time they spend reading films by no longer having to search for films. Radiology technologists may also save as much as 10 percent of their time with PACS, although this estimate is subject to significant patient mix aberrations and measurement error. Given that the UWMC radiology department operates efficiently, similar improvements are forecast for other radiology departments and larger improvements are forecast for less efficient departments.
The Cancer Imaging Archive (TCIA) | Informatics Technology for Cancer Research (ITCR)
TCIA is NCI’s repository for publicly shared cancer imaging data. TCIA collections include radiology and pathology images, clinical and clinical trial data, image derived annotations and quantitative features and a growing collection of related ‘omics data both from clinical and pre-clinical studies.
Photo CD and Other Digital Imaging Technologies: What's out There and What's It For?
ERIC Educational Resources Information Center
Chen, Ching-Chih
1993-01-01
Describes Kodak's Photo CD technology and its impact on digital imaging. Color desktop publishing, image processing and preservation, image archival storage, and interactive multimedia development, as well as the equipment, software, and services that make these applications possible, are described. Contact information for developers and…
Archived data management system in Kentucky.
DOT National Transportation Integrated Search
2007-05-01
Archived Data User Service (ADUS) was added to the national ITS architecture in 1999 to enable multiple uses for ITS-generated data. In Kentucky, ARTIMIS and TRIMARC are collecting volume, speed, occupancy, length-based classification, and incident d...
36 CFR 1233.20 - How are disposal clearances managed for records in NARA Federal Records Centers?
Code of Federal Regulations, 2010 CFR
2010-07-01
... NARA Federal Records Centers Program Web site (http://www.archives.gov/frc/toolkit.html#disposition...) or individual NARA Federal Records Centers (http://www.archives.gov/frc/locations.html), individual...
ESA Planetary Science Archive Architecture and Data Management
NASA Astrophysics Data System (ADS)
Arviset, C.; Barbarisi, I.; Besse, S.; Barthelemy, M.; de Marchi, G.; Docasal, R.; Fraga, D.; Grotheer, E.; Heather, D.; Laantee, C.; Lim, T.; Macfarlane, A.; Martinez, S.; Montero, A.; Osinde, J.; Rios, C.; Saiz, J.; Vallat, C.
2018-04-01
The Planetary Science Archive is the European Space Agency repository of science data from all planetary science and exploration missions. This paper presents PSA's content, architecture, user interfaces, and the relation between the PSA and IPDA.
Data Management in the Euclid Science Archive System
NASA Astrophysics Data System (ADS)
de Teodoro, P.; Nieto, S.; Altieri, B.
2017-06-01
Euclid is the ESA M2 mission and a milestone in the understanding of the geometry of the Universe. In total, Euclid will produce up to 26 PB of observations per year. The Science Archive System (SAS) belongs to the Euclid Archive System (EAS), which sits at the core of the Euclid Science Ground Segment (SGS). The SAS is being built at the ESAC Science Data Centre (ESDC), which is responsible for the development and operations of the scientific archives for the Astronomy, Planetary and Heliophysics missions of ESA. The SAS is focused on the needs of the scientific community and is intended to provide access to the most valuable scientific metadata from the Euclid mission. In this paper we describe the architectural design of the system, implementation progress and the main challenges from the data management point of view in building the SAS.
Harrison, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Morton, Robert A.; Blum, Mike D.; Wiese, Dana S.; Subiño, Janice A.
2007-01-01
In June of 1996, the U.S. Geological Survey conducted geophysical surveys from Nueces to Copano Bays, Texas. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, cruise log, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles and high resolution scanned TIFF images of the original paper printouts are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
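For readers unfamiliar with the SEG-Y layout mentioned above, the short sketch below reads the 3200-byte textual header (40 card images of 80 characters, traditionally EBCDIC-encoded) before handing the file to tools such as Seismic Unix; the file name is a placeholder and the ASCII fallback is a simple heuristic, not part of the standard.

```python
def read_segy_textual_header(path):
    """Return the 40 x 80-character card images of a SEG-Y textual header."""
    with open(path, "rb") as fh:
        raw = fh.read(3200)                 # textual header precedes the
                                            # 400-byte binary header and traces
    text = raw.decode("cp037")              # EBCDIC, per the classic standard
    if not text.startswith("C"):            # some newer files are plain ASCII
        text = raw.decode("ascii", errors="replace")
    return [text[i:i + 80] for i in range(0, 3200, 80)]

for card in read_segy_textual_header("boomer_line01.sgy"):  # hypothetical file
    print(card.rstrip())
```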
Modifying the Heliophysics Data Policy to Better Enable Heliophysics Research
NASA Technical Reports Server (NTRS)
Hayes, Jeffrey; Roberts, D. Aaron; Bredekamp, Joseph
2008-01-01
The Heliophysics (HP) Science Data Management Policy, adopted by HP in June 2007, has helped to provide a structure for the HP data lifecycle. It provides guidelines for Project Data Management Plans and related documents, initiates Resident Archives to maintain data services after a mission ends, and outlines a route to the unification of data finding, access, and distribution through Virtual Observatories. Recently we have filled in missing pieces that assure more coherence and a home for the VxOs (through the 'Heliophysics Data and Model Consortium'), and provide greater clarity with respect to long-term archiving. In particular, the new policy, which has been vetted with many community members, details the 'Final Archives' that are to provide long-term data access. These are distinguished from RAs in that they provide little additional service beyond serving data, but critical to their success is that the final archival materials include calibrated data in useful formats such as one finds in CDAWeb and various ASCII or FITS archives. Having a clear goal for legacy products, to be detailed as part of the Mission Archive Plans presented at Senior Reviews, will help to avoid the situation so common in the past of having archival products that preserve bits well but not readily usable information. We hope to avoid the need for the large numbers of 'data upgrade' projects that have been necessary in recent years.
The Hubble Spectroscopic Legacy Archive
NASA Astrophysics Data System (ADS)
Peeples, Molly S.; Tumlinson, Jason; Fox, Andrew; Aloisi, Alessandra; Ayres, Thomas R.; Danforth, Charles; Fleming, Scott W.; Jenkins, Edward B.; Jedrzejewski, Robert I.; Keeney, Brian A.; Oliveira, Cristina M.
2016-01-01
With no future space ultraviolet instruments currently planned, the data from the UV spectrographs aboard the Hubble Space Telescope have a legacy value beyond their initial science goals. The Hubble Spectroscopic Legacy Archive will provide to the community new science-grade combined spectra for all publicly available data obtained by the Cosmic Origins Spectrograph (COS) and the Space Telescope Imaging Spectrograph (STIS). These data will be packaged into "smart archives" according to target type and scientific themes to facilitate the construction of archival samples for common science uses. A new "quick look" capability will make the data easy for users to quickly access, assess the quality of, and download for archival science starting in Cycle 24, with the first generation of these products for the FUV modes of COS available online via MAST in early 2016.
VizieR Online Data Catalog: Proper motions and photometry of stars in NGC 3201 (Sariya+, 2017)
NASA Astrophysics Data System (ADS)
Sariya, D. P.; Jiang, I.-G.; Yadav, R. K. S.
2017-07-01
To determine the PMs of the stars in this work, we used archive images (http://archive.eso.org/eso/esoarchivemain.html) from observations made with the 2.2m ESO/MPI telescope at La Silla, Chile. This telescope hosts a mosaic camera called the Wide-Field Imager (WFI), consisting of a 4*2 array of eight CCD chips. Since each CCD has an array of 2048*4096 pixels, WFI ultimately produces images with a 34*33 arcmin2 field of view. The observational run of the first epoch contains two images in the B, V and I bands, each with 240 s exposure time, observed on 1999 December 05. In the second epoch, we have 35 images with 40 s exposure time each in the V filter, observed during the period 2014 April 02-05. Thus the epoch gap between the data is ~14.3 years. (2 data files).
The development of a digitising service centre for natural history collections
Tegelberg, Riitta; Haapala, Jaana; Mononen, Tero; Pajari, Mika; Saarenmaa, Hannu
2012-01-01
Abstract Digitarium is a joint initiative of the Finnish Museum of Natural History and the University of Eastern Finland. It was established in 2010 as a dedicated shop for the large-scale digitisation of natural history collections. Digitarium offers service packages based on the digitisation process, including tagging, imaging, data entry, georeferencing, filtering, and validation. During the process, all specimens are imaged, and distance workers take care of the data entry from the images. The customer receives the data in Darwin Core Archive format, as well as images of the specimens and their labels. Digitarium also offers the option of publishing images through Morphbank, sharing data through GBIF, and archiving data for long-term storage. Service packages can also be designed on demand to respond to the specific needs of the customer. The paper also discusses logistics, costs, and intellectual property rights (IPR) issues related to the work that Digitarium undertakes. PMID:22859879
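Since deliveries arrive in Darwin Core Archive format, the sketch below shows one way a customer might open such a package: a Darwin Core Archive is a zip file whose meta.xml descriptor points at the delimited core data file. The archive name is a placeholder and the per-column field mapping declared in meta.xml is omitted.

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# Placeholder archive name; a real delivery also contains label images, media
# links, etc.
with zipfile.ZipFile("digitarium_export.zip") as dwca:
    meta = ET.fromstring(dwca.read("meta.xml"))
    ns = {"dwc": "http://rs.tdwg.org/dwc/text/"}
    core = meta.find("dwc:core", ns)                       # core record table
    core_file = core.find("dwc:files/dwc:location", ns).text
    with dwca.open(core_file) as fh:
        for row in io.TextIOWrapper(fh, encoding="utf-8"):
            print(row.rstrip("\n"))    # one specimen record per line
            break                      # just peek at the first line here
```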
NASA Astrophysics Data System (ADS)
Tokareva, Victoria
2018-04-01
New-generation medicine demands better quality of analysis, with an increasing amount of data collected during checkups and, simultaneously, a decrease in the invasiveness of procedures. Thus it becomes urgent not only to develop advanced modern hardware, but also to implement the special software infrastructure needed to use it in everyday clinical practice, the so-called Picture Archiving and Communication Systems (PACS). Developing a distributed PACS is a challenging task in present-day medical informatics. The paper discusses the architecture of a distributed PACS server for processing large high-quality medical images, with respect to the technical specifications of modern medical imaging hardware as well as international standards in medical imaging software. The MapReduce paradigm is proposed for image reconstruction by the server, and the details of utilizing the Hadoop framework for this task are discussed in order to make the design of the distributed PACS as ergonomic and adapted to the needs of end users as possible.
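A toy, framework-free illustration of the MapReduce idea applied to reconstruction is given below: mappers turn chunks of raw detector data into partial slices keyed by slice index, and the reducer sums the partials per slice, much as Hadoop would across nodes. The data and the stand-in 'filtering' step are placeholders, not the paper's Hadoop implementation.

```python
from collections import defaultdict
import numpy as np

def map_phase(raw_chunks):
    """Each mapper turns one chunk of raw data into a partial slice."""
    for slice_id, chunk in raw_chunks:
        partial = np.asarray(chunk, dtype=float)  # stand-in for real filtering
        yield slice_id, partial

def reduce_phase(mapped):
    """The reducer accumulates all partial contributions per slice key."""
    acc = defaultdict(lambda: 0.0)
    for slice_id, partial in mapped:
        acc[slice_id] = acc[slice_id] + partial
    return dict(acc)

# Two mappers' worth of toy data contributing to the same slice 0.
chunks = [(0, [1.0, 2.0]), (1, [0.5, 0.5]), (0, [3.0, 4.0])]
slices = reduce_phase(map_phase(chunks))
print(slices[0])   # -> [4. 6.]
```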
Open Technologies at Athabasca University's Geospace Observatories
NASA Astrophysics Data System (ADS)
Connors, M. G.; Schofield, I. S.
2012-12-01
Athabasca University Geophysical Observatories feature two auroral observation sites situated in the subauroral zone of western Canada, separated by approximately 25 km. These sites are both on high-speed internet and ideal for observing phenomena detectable from this latitude, which include noctilucent clouds, meteors, and magnetic and optical aspects of the aurora. General aspects of use of Linux in observatory management are described, with emphasis on recent imaging projects involving control of high resolution digital SLR cameras at low cadence, and inexpensive white light analog video cameras at 30 Hz. Linux shell scripts are extensively used, with image capture controlled by gphoto2, the ivtv-utils package, x264 video coding library, and ffmpeg. ImageMagick allows images to be processed in an automated fashion. Image archives and movies are created and can be correlated with magnetic data. Much of the magnetic data stream also uses GMT (Generic Mapping Tools) within shell scripts for display. Additionally, SPASE metadata are generated for most of the magnetic data, thus allowing users of our AUTUMN magnetic data repository to perform SPASE queries on the dataset. Visualization products from our twin observatories will be presented.
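The capture-and-encode loop described above can equally be driven from Python via subprocess, as in the hedged sketch below; the gphoto2 and ffmpeg command lines are standard, but the output directory, cadence and encoding settings are illustrative assumptions rather than the observatory's actual scripts.

```python
import subprocess
import time
from datetime import datetime, timezone

def capture_frame(outdir="frames"):
    """Trigger the tethered DSLR and download the frame via gphoto2."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")
    subprocess.run(["gphoto2", "--capture-image-and-download",
                    "--filename", f"{outdir}/aurora_{stamp}.jpg"],
                   check=True)

def encode_night(outdir="frames", movie="night.mp4"):
    """Assemble the night's JPEGs into an H.264 movie with ffmpeg/libx264."""
    subprocess.run(["ffmpeg", "-y", "-framerate", "25",
                    "-pattern_type", "glob", "-i", f"{outdir}/aurora_*.jpg",
                    "-c:v", "libx264", "-pix_fmt", "yuv420p", movie],
                   check=True)

if __name__ == "__main__":
    for _ in range(10):          # ten frames at 60 s cadence, for illustration
        capture_frame()
        time.sleep(60)
    encode_night()
```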
Home Economics Archive: Research, Tradition and History (HEARTH)
HEARTH (Home Economics Archive: Research, Tradition and History) is a core electronic collection of books and journals in home economics and related disciplines, with additional information, images and readings on the history of the field. Ithaca, NY: Albert R. Mann Library, Cornell University.
The archiving and dissemination of biological structure data.
Berman, Helen M; Burley, Stephen K; Kleywegt, Gerard J; Markley, John L; Nakamura, Haruki; Velankar, Sameer
2016-10-01
The global Protein Data Bank (PDB) was the first open-access digital archive in biology. The history and evolution of the PDB are described, together with the ways in which molecular structural biology data and information are collected, curated, validated, archived, and disseminated by the members of the Worldwide Protein Data Bank organization (wwPDB; http://wwpdb.org). Particular emphasis is placed on the role of community in establishing the standards and policies by which the PDB archive is managed day-to-day.
U.S. Geological Survey, remote sensing, and geoscience data: Using standards to serve us all
Benson, Michael G.; Faundeen, John L.
2000-01-01
The U.S. Geological Survey (USGS) advocates the use of standards with geosciences and remotely sensed data and metadata for its own purposes and those of its customers. In activities that range from archiving data to making a product, the incorporation of standards makes these functions repeatable and understandable. More important, when accepted standards are followed, data discovery and sharing can be more efficient and the overall value to society can be expanded. The USGS archives many terabytes of digital geoscience and remotely sensed data. Several million photographs are also available to the research community. To manage these vast holdings and ensure that strict preservation and high usability criteria are observed, the USGS uses standards within the archival, data management, public access and ordering, and data distribution areas. The USGS uses Federal and international standards in performing its role as the U.S. National Satellite Land Remote Sensing Data Archive and in its mission as the long-term archive and production center for aerial photographs and cartographic data covering the United States.
Planetary Data Archiving Plan at JAXA
NASA Astrophysics Data System (ADS)
Shinohara, Iku; Kasaba, Yasumasa; Yamamoto, Yukio; Abe, Masanao; Okada, Tatsuaki; Imamura, Takeshi; Sobue, Shinichi; Takashima, Takeshi; Terazono, Jun-Ya
After the successful rendezvous of Hayabusa with the asteroid Itokawa and the successful launch of Kaguya to the Moon, the Japanese planetary community has obtained its own full-scale data. However, at this moment, these datasets are only available from the data sites managed by each mission team. The databases are individually constructed in different formats, and the user interfaces of these data sites are not compatible with foreign databases. To improve the usability of the planetary archives at JAXA and to enable smooth international data exchange, we are investigating a new planetary database. Within the coming decade, Japan will have fruitful datasets in the planetary science field: Venus (Planet-C), Mercury (BepiColombo), and several missions in the planning phase (small bodies). In order to strongly assist international scientific collaboration using these mission archive data, the planned planetary data archive at JAXA should be managed in a unified manner and the database should be constructed in the international planetary database standard style. In this presentation, we will show the current status and future plans of planetary data archiving at JAXA.
Clinical aspects of the Mayo/IBM PACS project
NASA Astrophysics Data System (ADS)
Forbes, Glenn S.; Morin, Richard L.; Pavlicek, William
1991-07-01
A joint project between Mayo Clinic and IBM to develop a picture archival and communications system has been under development for three years. This project began as a potential solution to a pressing archival problem in magnetic resonance imaging. The project has grown to encompass a much larger sphere of activity including workstations, image retrieval, and report archival. This report focuses on the clinical aspects involved in the design, development, and implementation of such a system. In particular, emphasis is placed on the clinical impact of the system both inside and outside of the radiology department. The primary concerns have centered on fidelity of archival data, ease of use, and diagnostic efficacy. The project to date has been limited to neuroradiology practice. This group consists of nine staff radiologists and fellows. Administrative policy decisions regarding the accessibility and availability of digital data in the clinical environment have been much more difficult and complex than originally conceived. Based on the observations thus far, the authors believe the system will become a useful and valuable adjunct to the clinical practice of radiology.
NASA Astrophysics Data System (ADS)
Cogliati, M.; Tonelli, E.; Battaglia, D.; Scaioni, M.
2017-12-01
Archival aerial photos represent a valuable heritage, providing information about past land cover and topography. Today, the availability of low-cost and open-source solutions for photogrammetric processing of close-range and drone images offers the chance to produce outputs such as DEMs and orthoimages in an easy way. This paper is aimed at demonstrating how, and to what level of accuracy, digitized archive aerial photos may be used within such low-cost software (Agisoft Photoscan Professional®) to generate photogrammetric outputs. Different steps of the photogrammetric processing workflow are presented and discussed. The main conclusion is that this procedure may provide some final products, which however do not feature the high accuracy and resolution that may be obtained using high-end photogrammetric software packages specifically designed for aerial survey projects. In the last part a case study is presented on the use of a four-epoch archive of aerial images to analyze an area where a tunnel is to be excavated.
NASA Technical Reports Server (NTRS)
Kempler, Steve; Alcott, Gary; Lynnes, Chris; Leptoukh, Greg; Vollmer, Bruce; Berrick, Steve
2008-01-01
NASA Earth Sciences Division (ESD) has made great investments in the development and maintenance of data management systems and information technologies, to maximize the use of NASA generated Earth science data. With information management system infrastructure in place, mature and operational, very small delta costs are required to fully support data archival, processing, and data support services required by the recommended Decadal Study missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC) and the reusability for these future missions. The GES DISC has developed a series of modular, reusable data management components currently in use. They include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive aka S4PA), data processing (S4 Processor for Measurements aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. Information management system components are based on atmospheric scientist inputs. Large development and maintenance cost savings can be realized through their reuse in future missions.
Automatic management system for dose parameters in interventional radiology and cardiology.
Ten, J I; Fernandez, J M; Vaño, E
2011-09-01
The purpose of this work was to develop an automatic management system to archive and analyse the major study parameters and patient doses for fluoroscopy-guided procedures performed on cardiology and interventional radiology systems. The X-ray systems used for this trial can export, at the end of the procedure and via e-mail, the technical parameters of the study and the patient dose values. An application was developed to query and retrieve from a mail server all study reports sent by the imaging modality and store them in a Microsoft SQL Server database. The results from 3538 interventional study reports generated by 7 interventional systems were processed. For some technical parameters and patient doses, alarms were added to receive malfunction alerts so as to immediately take appropriate corrective actions.
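A minimal sketch of such a mail-polling ingest loop is given below, using IMAP and SQLite as stand-ins; the mail host, mailbox, credentials, table layout and report handling are assumptions, since the paper's application targets Microsoft SQL Server and a vendor-specific report format.

```python
import email
import imaplib
import sqlite3   # stand-in for the Microsoft SQL Server used in the paper

def ingest_reports():
    db = sqlite3.connect("dose_reports.db")
    db.execute("""CREATE TABLE IF NOT EXISTS reports
                  (received TEXT, subject TEXT, body TEXT)""")

    imap = imaplib.IMAP4_SSL("mail.example.org")   # placeholder mail server
    imap.login("dose-archive", "secret")
    imap.select("INBOX")
    _, data = imap.search(None, "UNSEEN")          # only new modality reports
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(msg_data[0][1])
        body = ("" if msg.is_multipart()
                else msg.get_payload(decode=True).decode(errors="replace"))
        db.execute("INSERT INTO reports VALUES (?, ?, ?)",
                   (msg["Date"], msg["Subject"], body))
    db.commit()
    imap.logout()

ingest_reports()
```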
Wireless-PDA-controlled image workflow from PACS: the next trend in the health care enterprise?
NASA Astrophysics Data System (ADS)
Erberich, Stephan G.; Documet, Jorge; Zhou, Michael Z.; Cao, Fei; Liu, Brent J.; Mogel, Greg T.; Huang, H. K.
2003-05-01
Image workflow in today's Picture Archiving and Communication Systems (PACS) is controlled from fixed Display Workstations (DW) using proprietary control interfaces. Remote access to the Hospital Information System (HIS) and Radiology Information System (RIS) for urgent patient information retrieval either does not exist or is only gradually becoming available. The lack of remote access and workflow control is especially acute when it comes to the medical images of a PACS at the department or hospital level. As images become more complex and data sizes expand rapidly with new imaging techniques such as functional MRI, mammography or routine spiral CT, to name a few, access and manageability become an important issue. Long image downloads or incomplete work lists cannot be tolerated in a busy health care environment. In addition, the domain of the PACS is no longer limited to the imaging department; PACS is also being used in the ER and emergency care units. Thus prompt and secure access and manageability, not only for the radiologist but also for the referring physician, become crucial to optimally utilize the PACS in the health care enterprise of the new millennium. The purpose of this paper is to introduce a concept, and its implementation, for remote access and workflow control of the PACS combining wireless, Internet and Internet2 technologies. A wireless device, the Personal Digital Assistant (PDA), is used to communicate with a PACS web server that acts as a gateway controlling the commands for which the user has access to the PACS server. The commands implemented for this test-bed are query/retrieve of the patient list and study list, including modality, examination, series and image selection, and pushing any list item to a selected DW on the PACS network.
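The gateway pattern can be sketched as a small web service that the PDA's browser talks to while only the gateway talks to the PACS. In the hedged example below, pacs_query() and pacs_push() are hypothetical placeholders for the DICOM query/retrieve and push logic, and the routes and port are assumptions, not the authors' implementation.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical helpers standing in for the gateway's DICOM logic.
def pacs_query(level, **filters):
    return []          # would issue a query (e.g. C-FIND) against the PACS

def pacs_push(study_uid, destination):
    pass               # would push the study to the chosen display workstation

@app.route("/patients")
def patient_list():
    # The PDA browser asks the gateway; it never reaches the PACS directly.
    return jsonify(pacs_query("PATIENT", name=request.args.get("name", "*")))

@app.route("/push/<study_uid>", methods=["POST"])
def push_study(study_uid):
    pacs_push(study_uid, destination=request.form["workstation"])
    return "queued", 202

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```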
Continuing quality improvement procedures for a clinical PACS.
Andriole, K P; Gould, R G; Avrin, D E; Bazzill, T M; Yin, L; Arenson, R L
1998-08-01
The University of California at San Francisco (UCSF) Department of Radiology currently has a clinically operational picture archiving and communication system (PACS) that is thirty-five percent filmless, with the goal of becoming seventy-five percent filmless within the year. The design and implementation of the clinical PACS has been a collaborative effort between an academic research laboratory and a commercial vendor partner. Images are digitally acquired from three computed radiography (CR) scanners, five computed tomography (CT) scanners, five magnetic resonance (MR) imagers, three digital fluoroscopic rooms, an ultrasound mini-PACS and a nuclear medicine mini-PACS. The DICOM (Digital Imaging and Communications in Medicine) standard communications protocol and image format is adhered to throughout the PACS. Images are archived in hierarchical staged fashion, on a RAID (redundant array of inexpensive disks) and on magneto-optical disk jukeboxes. The clinical PACS uses an object-oriented Oracle SQL (Structured Query Language) database, and interfaces to the Radiology Information System using the HL7 (Health Level Seven) standard. Components are networked using a combination of switched and fast Ethernet, and ATM (asynchronous transfer mode), all over fiber optics. The wide area network links six UCSF sites in San Francisco. A combination of high- and medium-resolution dual-monitor display stations has been placed throughout the Department of Radiology, the Emergency Department (ED) and Intensive Care Units (ICU). A continuing quality improvement (CQI) committee has been formed to facilitate the PACS installation and training, workflow modifications, quality assurance and clinical acceptance. This committee includes radiologists at all levels (resident, fellow, attending), radiology technologists, film library personnel, ED and ICU clinician end-users, and PACS team members. The CQI committee has proved vital in the creation of new management procedures, providing a means for user feedback and education, and contributing to the overall acceptance of, and user satisfaction with, the system. Well-developed CQI procedures have been essential to the successful clinical operation of the PACS as UCSF Radiology moves toward a filmless department.
Update Of The ACR-NEMA Standard Committee
NASA Astrophysics Data System (ADS)
Wang, Yen; Best, D. E.; Morse, R. R.; Horii, S. C.; Lehr, J. L.; Lodwick, G. S.; Fuscoe, C.; Nelson, O. L.; Perry, J. R.; Thompson, B. G.; Wessell, W. R.
1988-06-01
In January 1984, the American College of Radiology (ACR), representing the users of imaging equipment, and the National Electrical Manufacturers Association (NEMA), representing the manufacturers of imaging equipment, joined forces to create a committee that could solve the compatibility issues surrounding the exchange of digital medical images. This committee, the ACR-NEMA Digital Imaging and Communication Standards Committee, was composed of radiologists and experts from industry who addressed the problems involved in interfacing different digital imaging modalities. In just two years, the committee and three of its working groups created an industry standard interface, the ACR-NEMA Digital Imaging and Communications Standard, Publication No. 300-1985. The ACR-NEMA interface allows digital medical images and related information to be communicated between different imaging devices, regardless of manufacturer or use of differing image formats. The interface is modeled on the International Standards Organization's Open Systems Interconnection seven-layer reference model. It is believed that the development of the Interface was the first step in the development of standards for Medical Picture Archiving and Communications Systems (PACS). Developing the interface Standard has required intensive technical analysis and examination of future trends in digital imaging in order to design a model which would not be quickly outmoded. To continue the enhancement and future development of image management systems, various working groups have been created under the direction of the ACR-NEMA Committee.
Demonstrations of Clarus system data : Clarus BAA projects.
DOT National Transportation Integrated Search
2005-08-01
In 2002, FHWA awarded a field operational test to the Virginia Department of Transportation (VDOT) entitled Traffic Management Center (TMC) Applications of Archived Data Operational Test. The intent of the operational test was to use archived data to...
More flexibility in representing geometric distortion in astronomical images
NASA Astrophysics Data System (ADS)
Shupe, David L.; Laher, Russ R.; Storrie-Lombardi, Lisa; Surace, Jason; Grillmair, Carl; Levitan, David; Sesar, Branimir
2012-09-01
A number of popular software tools in the public domain are used by astronomers, professional and amateur alike, but some of the tools that have similar purposes cannot be easily interchanged, owing to the lack of a common standard. For the case of image distortion, SCAMP and SExtractor, available from Astromatic.net, perform astrometric calibration and source-object extraction on image data, and image-data geometric distortion is computed in celestial coordinates with polynomial coefficients stored in the FITS header with the PVi_j keywords. Another widely-used astrometric-calibration service, Astrometry.net, solves for distortion in pixel coordinates using the SIP convention that was introduced by the Spitzer Science Center. Up until now, due to the complexity of these distortion representations, it was very difficult to use the output of one of these packages as input to the other. New Python software, along with faster-computing C-language translations, have been developed at the Infrared Processing and Analysis Center (IPAC) to convert FITS-image headers from PV to SIP and vice versa. It is now possible to straightforwardly use Astrometry.net for astrometric calibration and then SExtractor for source-object extraction. The new software also enables astrometric calibration by SCAMP followed by image visualization with tools that support SIP distortion, but not PV. The software has been incorporated into the image-processing pipelines of the Palomar Transient Factory (PTF), which generate FITS images with headers containing both distortion representations. The software permits the conversion of archived images, such as from the Spitzer Heritage Archive and NASA/IPAC Infrared Science Archive, from SIP to PV or vice versa. This new capability renders unnecessary any new representation, such as the proposed TPV distortion convention.
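As a small companion to the conversion tools described above, the sketch below merely inspects a FITS header with astropy and reports which distortion convention it carries by looking for PVi_j versus SIP keywords; the file name is a placeholder, and the PV-to-SIP conversion itself is left to the IPAC software.

```python
from astropy.io import fits

def distortion_convention(path):
    """Report whether a FITS header carries PV, SIP, or both."""
    hdr = fits.getheader(path)
    has_pv = any(k.startswith("PV1_") or k.startswith("PV2_") for k in hdr)
    has_sip = "A_ORDER" in hdr or hdr.get("CTYPE1", "").endswith("-SIP")
    if has_pv and has_sip:
        return "both PV and SIP (as in PTF pipeline products)"
    if has_pv:
        return "PV (SCAMP/SExtractor style)"
    if has_sip:
        return "SIP (Astrometry.net/Spitzer style)"
    return "no polynomial distortion keywords found"

print(distortion_convention("ptf_image.fits"))   # hypothetical file name
```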
NASA Astrophysics Data System (ADS)
Bay Hasager, Charlotte; Brøgger Sørensen, Peter; Baltazar Andersen, Ole; Badger, Merete; Højerslev, Niels Kristian; Høyer, Jacob L.; Løkkegaard, Bo; Lichtenegger, Jürg; Nyborg, Lotte; Saldo, Roberto
2010-05-01
Students and teachers may use online satellite images in the classroom. Images have been archived since August 2006 and the archive has been updated every day since, which means that nearly four years of daily global images are available online. The parameters include ocean surface temperature, sea level anomaly, ocean wave height, ocean winds, global ozone in the atmosphere and clouds, and sea ice in the Arctic and Antarctica. During the Galathea 3 expedition, which took place from August 2006 to April 2007, many other high-resolution (local to regional) satellite images were also acquired and stored in the archive. Since the end of the expedition, however, only global satellite data have been collected and stored. Use Google Earth at http://galathea.dtu.dk/GE_e.html to access the images. The expedition included 50 science projects, and educational material has been developed on this basis. There are around 20 educational projects in English at http://galathea3.emu.dk/satelliteeye/index_uk.html and 90 in Danish at http://vg3.dk/, freely available and based on the science. All the educational projects in English deal with satellite image analysis and information. In addition, a short (15 min) educational film in English for students and teachers at upper secondary level, on the use of satellite images during the expedition and in some of the onboard science projects, is available. The film is called ‘Galathea's Eye' and is available at http://virtuelgalathea3.dk/om/videoer. All projects in English were developed in the ‘Satellite Eye for Galathea 3' project supported by Egmontfonden and ESA Eduspace. The satellite images were mainly from ESA and Eduspace. The Danish projects are also supported by Tips og Lottopuljen of the Ministry of Education.
NASA Astrophysics Data System (ADS)
Kontoes, Charalampos; Papoutsis, Ioannis; Herekakis, Themistoklis; Michail, Dimitrios; Ieronymidi, Emmanuela
2013-04-01
Remote sensing tools for the accurate, robust and timely assessment of the damage inflicted by forest wildfires provide information that is of paramount importance to public environmental agencies and related stakeholders before, during and after the crisis. The Institute for Astronomy, Astrophysics, Space Applications and Remote Sensing of the National Observatory of Athens (IAASARS/NOA) has developed a fully automatic single- and/or multi-date processing chain that takes as input archived Landsat 4, 5 or 7 raw images and produces precise diachronic burnt-area polygons and damage assessments over the Greek territory. The methodology consists of three fully automatic stages: 1) the pre-processing stage, where the metadata of the raw images are extracted, followed by the application of the LEDAPS software platform for calibration and mask production and the Automated Precise Orthorectification Package, developed by NASA, for image geo-registration and orthorectification; 2) the core BSM (Burn Scar Mapping) processing stage, which incorporates a published classification algorithm based on a series of physical indexes, the application of two filters for noise removal using graph-based techniques, and the grouping of pixels classified as burnt into appropriate pixel clusters before conversion from raster to vector; and 3) the post-processing stage, where the products are thematically refined and enriched using auxiliary GIS layers (underlying land cover/use, administrative boundaries, etc.) and human logic/evidence to suppress false alarms and omission errors. The established processing chain has been successfully applied to the entire archive of Landsat imagery over Greece spanning from 1984 to 2012, which has been collected and is managed at IAASARS/NOA. A total of 415 full Landsat frames were processed in the framework of the study. These burn scar mapping products are generated for the first time at such a temporal and spatial extent and are ideal for use in further environmental time-series analyses, the production of statistical indexes (frequency, geographical distribution and number of fires per prefecture) and applications including change detection and climate change models, urban planning, correlation with man-made activities, etc.
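The abstract refers to a series of physical indexes without listing them; the Normalized Burn Ratio (NBR) computed below is a standard example of such an index for Landsat TM/ETM+ bands 4 (NIR) and 7 (SWIR), shown here only as an illustration and not necessarily the index used in the IAASARS/NOA chain. The dNBR threshold is likewise illustrative.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR), per pixel."""
    nir = nir.astype(float)
    swir = swir.astype(float)
    return (nir - swir) / np.clip(nir + swir, 1e-6, None)

def burn_mask(nir_pre, swir_pre, nir_post, swir_post, threshold=0.27):
    """Flag pixels whose NBR drops between two dates by more than a threshold."""
    dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
    return dnbr > threshold   # candidate burnt pixels, before noise filtering
```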
Nagels, Jason; MacDonald, David; Parker, David
2015-04-01
A challenge for many clinical users is that a patient may receive a diagnostic imaging (DI) service at a number of hospitals or private imaging clinics. The DI services that patients receive at other locations could be clinically relevant to current treatments, but typically, there is no seamless method for a clinical user to access longitudinal DI results for their patient. Radiologists, and other specialists that are intensive users of image data, require seamless ingestion of foreign exams into the picture archiving and communication system (PACS) to achieve full clinical value. Most commonly, a clinical user will depend on the patient to bring in a CD that contains imaging from another location. However, a number of issues can arise when using this type of solution. Firstly, a CD will not provide the clinical user with the full longitudinal record of the patient. Secondly, a CD often will not contain the report associated with the images. Finally, a CD is not seamless, due to the need to manually import the contents of the CD into the local PACS. In order to overcome these limitations, and provide clinical users with a greater benefit related to a patient's longitudinal DI history, the implementation of foreign exam management (FEM) at the local site level is required. This paper presents the experiences of FEM in practice. By leveraging industry standards and edge devices to support FEM, multiple sites with disparate PACS and radiology information system (RIS) vendors are able to seamlessly ingest foreign exams within their local PACS as if they are local exams.
Landsat View: Ouagadougou, Burkina Faso
2017-12-08
The landlocked West African nation of Burkina Faso experienced a 200 percent increase in urban population between 1975 and 2000. As a result, the area of the capital city, Ouagadougou, grew 14-fold during this period. These Landsat images show the city expanding outward from its center in the two decades between 1986 and 2006. On Nov. 18, 1986, the Landsat 5 satellite acquired this image of the capital. This false-color image shows vegetation in shades of green and gray, water in various shades of blue, and urban areas in pink and purple. The runway of the city's airport can be seen as a long straight line extending from southwest to northeast, south of the large lake, Bois de Boulogne. Two decades later, on Oct. 16, 2006, Landsat 7 acquired this image of Ouagadougou. Growth radiated from the city center in all directions. The green strip of vegetation north of Bois de Boulogne has been paved over, and a massive new development, including a large thoroughfare and traffic circle, can be seen south of the airport. ---- NASA and the U.S. Department of the Interior through the U.S. Geological Survey (USGS) jointly manage Landsat, and the USGS preserves a 40-year archive of Landsat images that is freely available over the Internet. The next Landsat satellite, now known as the Landsat Data Continuity Mission (LDCM) and later to be called Landsat 8, is scheduled for launch in 2013. In honor of Landsat's 40th anniversary in July 2012, the USGS released the LandsatLook viewer, a quick, simple way to go forward and backward in time, pulling images of anywhere in the world out of the Landsat archive.
Landsat View: Western Suburbs of Chicago, Illinois
2017-12-08
Forty miles west of downtown Chicago, the Fox River meanders its way through what has become the westernmost reaches of metropolitan Chicago, where the sprawling metropolis meets the hinterlands. While Chicago itself has seen a seven percent population decline during the last decade, the population of its metropolitan region, "Chicagoland," has steadily increased. These two natural-color Landsat 5 images, acquired a quarter-century apart (on May 2, 1985, and May 23, 2010), stand witness to the soaring growth of this region. Aurora, Illinois' second-largest city, is the silvery-green region to the left hugging the Fox River, just south of the I-88 (north is to the right in this image); Carpentersville is found on the rightmost side, north of the I-90. From 1985 to 2010 a development explosion can be seen as the browns of pasture lands give way to silvery-green suburban areas and large white-colored business districts spring up along and east of the river. A major expansion of DuPage Airport appears in the middle of the 2010 image, and the circular-shaped region north of the I-88 and east of the Fox River, visible in both images, is the Department of Energy's Fermilab.
2017-12-08
Tokyo is the world's largest metropolitan region, home to nearly 37 million people. During the past two decades, Tokyo's population has grown by more than 7 million. The city's growth has continued despite Japan's overall stagnating population, mainly due to a continued trend of centralization, with citizens moving out of the countryside and into the city. Landsat 4 collected this first false-color image of Tokyo on Feb. 2, 1989. The upper half of Tokyo Bay is the large water body visible in dark blue. In the middle of the image, central Tokyo appears a deep purple just north of the bay. Twenty-two years later, Landsat 5 captured this second image of Tokyo on April 5, 2011. The urban reaches of metropolitan Tokyo have grown in both distance and density, as seen where the green color of vegetation has turned to the pink and purple shades of urbanization. A major expansion of Tokyo's Haneda Airport can be seen south of the city, on land built out into the bay. The constant circular spot of green in the dense city center, visible in both images, is the Tokyo Imperial Palace and its gardens. (Landsat 5 TM Bands 7,4,2)
NASA Astrophysics Data System (ADS)
Martinez-Gutierrez, Genaro
Baja California Sur (Mexico), as well as mainland Mexico, is affected by tropical cyclones that originate in the eastern North Pacific. Historical records show that Baja has been damaged by intense summer storms. An arid to semiarid climate characterizes the study area, where precipitation mainly occurs during the summer and winter seasons. Natural and anthropogenic changes have impacted the landscape of southern Baja. The present research documents the effects of tropical storms over the southern region of Baja California for a period of approximately twenty-six years. The goal of the research is to demonstrate how remote sensing can be used to detect the important effects of tropical storms, including: (a) evaluating change detection algorithms, and (b) delineating changes to the landscape, including coastal modification, fluvial erosion and deposition, vegetation change, and river avulsion, using change detection algorithms. Digital image processing methods with temporal Landsat remotely sensed data from the North America Landscape Characterization (NALC) archive, Thematic Mapper (TM), and Enhanced Thematic Mapper (ETM) images were used to document the landscape change. Two image processing methods were tested: image differencing (ID) and principal component analysis (PCA). Landscape changes identified with the NALC archive and TM images showed that the major changes included a rapid change of land use in the towns of San Jose del Cabo and Cabo San Lucas between 1973 and 1986. The features detected using the algorithms included flood deposits within the channels of active streams, erosion banks, and new channels caused by channel avulsion. Despite the 19-year period covered by the NALC data and the approximately 10-year intervals between acquisition dates, changed features could still be identified in the images. The TM images showed that flooding from Hurricane Isis (1998) produced large new deposits within the stream channels. This research has shown that remote sensing based change detection can delineate the effects of flooding on the landscape at scales down to the nominal resolution of the sensor. These findings indicate that many other applications for change detection are both viable and important, including disaster response, flood hazard planning, geomorphic studies, and water supply management in deserts.
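A minimal sketch of the image-differencing (ID) approach tested in the study is given below. It is not the authors' implementation, and the k-sigma threshold is an assumption, but it shows how change pixels can be flagged from two co-registered, radiometrically comparable bands.

```python
# Minimal sketch of image differencing (ID) for change detection, in the
# spirit of (but not identical to) the method tested above: difference two
# co-registered bands and flag pixels beyond k standard deviations of the
# mean difference. A PCA variant would instead threshold a minor component
# of the stacked two-date data.
import numpy as np

def change_mask(band_t1, band_t2, k=2.0):
    diff = band_t2.astype(np.float64) - band_t1.astype(np.float64)
    mu, sigma = diff.mean(), diff.std()
    return np.abs(diff - mu) > k * sigma  # True where change is flagged
```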
ClearedLeavesDB: an online database of cleared plant leaf images.
Das, Abhiram; Bucksch, Alexander; Price, Charles A; Weitz, Joshua S
2014-03-28
Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform, nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. The Cleared Leaf Image Database (ClearedLeavesDB) is an online, web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages the resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open-source content management platform. It allows plant biologists to store leaf images online with corresponding metadata, share image collections with a user community, and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org.
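As a purely hypothetical illustration of how a client might push a processed image and trait data through an open web API of this kind, the sketch below uses the requests library. The endpoint URL, form fields and token are placeholders, not the actual ClearedLeavesDB API, which is provided through its downloadable web-services client and documentation.

```python
# Hypothetical sketch only: the endpoint URL, form fields and token below are
# placeholders and do NOT describe the actual ClearedLeavesDB API. The point
# is simply that an open API lets analysis pipelines push processed images
# and trait data back to the shared archive.
import requests

with open("quercus_alba_cleared.png", "rb") as image_file:
    response = requests.post(
        "https://clearedleavesdb.org/api/upload",                 # placeholder endpoint
        files={"image": image_file},
        data={"species": "Quercus alba", "vein_density": "4.1"},  # placeholder trait field
        headers={"Authorization": "Bearer <token>"},              # placeholder credential
        timeout=60,
    )
response.raise_for_status()
print("Uploaded, server returned:", response.status_code)
```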
NASA Astrophysics Data System (ADS)
Seeram, Euclid
2006-03-01
The large volumes of digital images produced by digital imaging modalities in radiology have provided the motivation for the development of picture archiving and communication systems (PACS) in an effort to provide an organized mechanism for digital image management. The development of more sophisticated methods of digital image acquisition (multislice CT and digital mammography, for example), as well as the implementation and performance of PACS and teleradiology systems in a health care environment, have created challenges in the area of image compression with respect to storing and transmitting digital images. Image compression can be reversible (lossless) or irreversible (lossy). While in the former there is no loss of information, the latter presents concerns since there is a loss of information. This loss of information from diagnostic medical images is of primary concern not only to radiologists, but also to patients and their physicians. In 1997, Goldberg pointed out that "there is growing evidence that lossy compression can be applied without significantly affecting the diagnostic content of images... there is growing consensus in the radiologic community that some forms of lossy compression are acceptable". The purpose of this study was to explore the opinions of expert radiologists and related professional organizations on the use of irreversible compression in routine practice. The opinions of notable radiologists in the US and Canada are varied, indicating no consensus on the use of irreversible compression in primary diagnosis; however, they are generally positive about the image storage and transmission advantages. Almost all radiologists are concerned with the litigation potential of an incorrect diagnosis based on irreversibly compressed images. A survey of several radiology professional and related organizations reveals that no professional practice standards exist for the use of irreversible compression. Currently, the only standard for image compression is stated in the ACR's Technical Standards for Teleradiology and Digital Image Management.
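The reversible/irreversible distinction above can be made concrete with a small experiment: a lossless format round-trips the pixel values exactly, while a lossy one does not, regardless of how acceptable the result looks. The sketch below uses a synthetic 8-bit image with Pillow and NumPy; clinical images are typically 12-16-bit DICOM objects handled through DICOM transfer syntaxes rather than PNG/JPEG files, so this is a conceptual demo only.

```python
# Hedged illustration of reversible vs. irreversible compression on a
# synthetic 8-bit image: PNG restores the pixel values bit-exactly, JPEG
# does not. Not a radiology workflow; purely conceptual.
import io
import numpy as np
from PIL import Image

rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=(512, 512), dtype=np.uint8)
original = Image.fromarray(pixels)

for fmt, kwargs in (("PNG", {}), ("JPEG", {"quality": 80})):
    buf = io.BytesIO()
    original.save(buf, format=fmt, **kwargs)
    restored = np.asarray(Image.open(io.BytesIO(buf.getvalue())))
    print(f"{fmt}: {buf.tell()} bytes, bit-exact round trip: {np.array_equal(restored, pixels)}")
```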
ERIC Educational Resources Information Center
Barromi Perlman, Edna
2011-01-01
An archivist from a kibbutz in the north of Israel has been managing the kibbutz archive for close to a decade. I have chosen to present her enterprise and the role she is playing by means of her archival work, which is changing the historiography of her kibbutz. The archivist at the kibbutz in question reevaluates her kibbutz's history while…
NASA Technical Reports Server (NTRS)
Lapenta, C. C.
1992-01-01
The functionality of the Distributed Active Archive Centers (DAACs), which are significant elements of the Earth Observing System Data and Information System (EOSDIS), is discussed. Each DAAC encompasses the information management system, the data archival and distribution system, and the product generation system. The EOSDIS DAACs are expected to improve access to the Earth science data sets needed for global change research.
2008-09-05
This image captures the beauty of a major alluvial fan in Tsinghai, a province located in northwestern China. This archival image was taken from the NASA Space Shuttle in 1997 as part of the ISS EarthKAM mission.
Kokaram, Anil C
2004-03-01
Image sequence restoration has been steadily gaining in importance with the increasing prevalence of visual digital media. The demand for content increases the pressure on archives to automate their restoration activities for preservation of the cultural heritage that they hold. There are many defects that affect archived visual material and one central issue is that of Dirt and Sparkle, or "Blotches." Research in archive restoration has been conducted for more than a decade and this paper places that material in context to highlight the advances made during that time. The paper also presents a new and simpler Bayesian framework that achieves joint processing of noise, missing data, and occlusion.
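The sketch below is a deliberately simplified temporal spike heuristic in the spirit of blotch detection, not the joint Bayesian model of noise, missing data and occlusion presented in the paper: a pixel is flagged when it differs strongly from both neighbouring frames and is then filled with the temporal median. The threshold and the assumption of co-registered greyscale frames are illustrative.

```python
# Simplified spike-detection heuristic for blotches ("dirt and sparkle"),
# NOT the paper's Bayesian joint model: flag pixels that differ strongly
# from BOTH the previous and next frame, then fill them with the temporal
# median. Assumes three co-registered greyscale frames.
import numpy as np

def detect_and_fill(prev_frame, cur_frame, next_frame, threshold=30.0):
    cur = cur_frame.astype(np.float64)
    d_prev = np.abs(cur - prev_frame)
    d_next = np.abs(cur - next_frame)
    blotch = (d_prev > threshold) & (d_next > threshold)
    filled = cur.copy()
    median = np.median(np.stack([prev_frame, cur_frame, next_frame]), axis=0)
    filled[blotch] = median[blotch]
    return blotch, filled  # boolean mask and repaired frame (float64)
```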
Final Technical Report for DE-SC0002014- July 29, 2011
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramirez, NC
2011-07-29
The project was titled "National Biorepository for Children's and Women's Cancer." The funding received by the Biopathology Center (BPC) at the Research Institute at Nationwide Children's Hospital was used to procure equipment and add resources to establish a national digital archive of tissues of children's and women's cancers to advance treatment and research. As planned in the proposal, the project allowed the BPC to procure two high-speed imaging robots and hire imaging technicians to scan a large collection of children's and women's cancer tissues. The BPC team focused on completed clinical trials, some dating back nearly 30 years, conducted by the Children's Oncology Group (and its precursor groups) as well as the Gynecologic Oncology Group. A total of 139 clinical trials were imaged as part of the archive project, allowing the team to generate 29,488 images that are currently stored at the Ohio Supercomputer Center located in Columbus, Ohio. The images are now integrated with the Virtual Imaging for Pathology, Education and Research (VIPER) application. The VIPER application allows the BPC to make the digital archive available via the Internet to approved researchers remotely, eliminating the use of glass slides for this collection. The elimination of glass slides reduces shipping costs, reduces breakage of glass slides, and allows experts to review these cases quickly on a standard desktop computer.
Feasibility of telemammography as biomedical application for breast imaging
NASA Astrophysics Data System (ADS)
Beckerman, Barbara G.; Batsell, Stephen G.; MacIntyre, Lawrence P.; Sarraf, Hamed S.; Gleason, Shaun S.; Schnall, Mitchell D.
1999-07-01
Mammographic screening is an important tool in the early detection of breast cancer. The migration of mammography from the current mode of x-ray mammography, using a film-screen image detector and display, to a digital technology provides an opportunity to improve access and performance of breast cancer screening. The sheer size and volume of the typical screening exam, the need to have previous screening data readily available, and the need to view other breast imaging data together to reach a common consensus and to plan treatment make telemammography an ideal application for breast imaging. For telemammography to be a viable option, it must overcome the technical challenges related to transmission, archiving, management, processing and retrieval of large data sets. Researchers from the University of Pennsylvania, the University of Chicago and Lockheed Martin Energy Systems/Oak Ridge National Laboratory have developed a framework for transmission of large-scale medical images over high-speed networks; leveraged existing high-speed networks between research and medical facilities; tested the feasibility of point-to-point transmission of mammographic images in a near-real-time environment; evaluated network performance and transmission scenarios; and investigated the impact of image preprocessing on an experimental computer-aided diagnosis system. Results of the initial study are reported here.
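A rough back-of-envelope calculation shows why transmission is the binding constraint for near-real-time telemammography; the exam size and link rates below are illustrative assumptions, not figures from the study.

```python
# Illustrative timing for moving one uncompressed screening exam (sizes and
# link rates are assumptions, not figures from the study): four views at
# roughly 4096 x 5625 pixels, 16 bits per pixel.
exam_bytes = 4 * 4096 * 5625 * 2
for link, mbit_per_s in (("T1 (1.5 Mb/s)", 1.5), ("OC-3 (155 Mb/s)", 155.0), ("1 Gb/s LAN", 1000.0)):
    seconds = exam_bytes * 8 / (mbit_per_s * 1e6)
    print(f"{link}: {seconds:8.1f} s per exam")
```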
Third party EPID with IGRT capability retrofitted onto an existing medical linear accelerator
Odero, DO; Shimm, DS
2009-01-01
Radiation therapy requires precision to avoid unintended irradiation of normal organs. Electronic portal imaging devices (EPIDs) can help with precise patient positioning for accurate treatment. EPIDs are now bundled with new linear accelerators, or they can be purchased from the linac manufacturer for retrofit. Retrofitting a third-party EPID to a linear accelerator can pose challenges. The authors describe a relatively inexpensive third-party CCD camera-based EPID manufactured by TheraView (Cablon Medical B.V.), installed onto a Siemens Primus linear accelerator and integrated with a Lantis record-and-verify system, an Oldelft simulator with a Digital Therapy Imaging (DTI) unit, and a Philips ADAC Pinnacle treatment planning system (TPS). This system integrates well with existing equipment and its software can process DICOM images from other sources. The system provides a complete imaging solution that eliminates the need for separate software for portal image viewing, interpretation, analysis, archiving, image-guided radiation therapy and other image management applications. It can also be accessed remotely via secure VPN tunnels. The TheraView EPID retrofit therefore presents a less expensive alternative to linear accelerator manufacturers' proprietary EPIDs, suitable for radiation therapy departments in developing countries, which are often faced with limited financial resources. PMID:21611056
Flexible server-side processing of climate archives
NASA Astrophysics Data System (ADS)
Juckes, Martin; Stephens, Ag; Damasio da Costa, Eduardo
2014-05-01
The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.
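As a hedged illustration of the kind of CDO operation such a server-side job might execute, the snippet below subsets a region and period from a CMIP5-style file and computes the time mean, so only the small result would need to leave the archive. The file names and the lon/lat box are placeholders, and the WPS request handling, queueing and dashboard described above are not shown.

```python
# Hedged sketch of a CDO command a server-side job might run on an archived
# CMIP5-style file: select a 30-year period and a lon/lat box, then compute
# the time mean. File names and the box are placeholders.
import subprocess

subprocess.run(
    [
        "cdo", "-f", "nc",
        "timmean",                               # time mean of the selected slab
        "-sellonlatbox,-15,40,30,75",            # placeholder European box (lon1,lon2,lat1,lat2)
        "-selyear,1980/2009",                    # placeholder 30-year period
        "tas_Amon_MODEL_historical_r1i1p1.nc",   # placeholder input file
        "tas_box_1980-2009_timmean.nc",          # output written next to the job
    ],
    check=True,
)
```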